mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2025-12-23 21:45:34 +08:00)
vllm / tests / entrypoints / llm
Latest commit: e25fee57c2 by Maximilien de Bayser, [BugFix] Fix server crash on empty prompt (#7746)
Signed-off-by: Max de Bayser <mbayser@br.ibm.com>
2024-08-23 13:12:44 +00:00
File                              Last commit                                               Date
__init__.py                       [CI/Build] [3/3] Reorganize entrypoints tests (#5966)     2024-06-30 12:58:49 +08:00
test_encode.py                    [CI/Build] [3/3] Reorganize entrypoints tests (#5966)     2024-06-30 12:58:49 +08:00
test_generate_multiple_loras.py   [CI/Build] [3/3] Reorganize entrypoints tests (#5966)     2024-06-30 12:58:49 +08:00
test_generate.py                  Chat method for offline llm (#5049)                       2024-08-15 19:41:34 -07:00
test_guided_generate.py           Support for guided decoding for offline LLM (#6878)       2024-08-04 03:12:09 +00:00
test_prompt_validation.py         [BugFix] Fix server crash on empty prompt (#7746)         2024-08-23 13:12:44 +00:00