Mirror of https://git.datalinker.icu/vllm-project/vllm.git, synced 2025-12-10 03:15:20 +08:00
vllm / tests / async_engine
Latest commit: 7a3d2a5b95 by sasha0552, [Frontend] Support for chat completions input in the tokenize endpoint (#5923), 2024-07-16 20:18:09 +08:00
__init__.py                  [CI/Build] Move test_utils.py to tests/utils.py (#4425)                                              2024-05-13 23:50:09 +09:00
api_server_async_engine.py   [Frontend] Add FlexibleArgumentParser to support both underscore and dash in names (#5718)           2024-06-20 17:00:13 -06:00
test_api_server.py           [Core] Fix engine-use-ray broken (#4105)                                                             2024-04-16 05:24:53 +00:00
test_async_llm_engine.py     [Core] Pipeline Parallel Support (#4412)                                                             2024-07-02 10:58:08 -07:00
test_chat_template.py        [Frontend] Support for chat completions input in the tokenize endpoint (#5923)                       2024-07-16 20:18:09 +08:00
test_openapi_server_ray.py   [ci] try to add multi-node tests (#6280)                                                             2024-07-12 21:51:48 -07:00
test_request_tracker.py      Add health check, make async Engine more robust (#3015)                                              2024-03-04 22:01:40 +00:00