Mirror of https://git.datalinker.icu/vllm-project/vllm.git, synced 2026-04-13 21:27:07 +08:00.
vllm/vllm/engine
Latest commit: ff36139ffc by Antoni Baum, "Remove AsyncLLMEngine busy loop, shield background task (#1059)", 2023-09-17 00:29:08 -07:00
__init__.py          Change the name to vLLM (#150)                                                                   2023-06-17 03:07:40 -07:00
arg_utils.py         Implement AWQ quantization support for LLaMA (#1032)                                             2023-09-16 00:03:37 -07:00
async_llm_engine.py  Remove AsyncLLMEngine busy loop, shield background task (#1059)                                  2023-09-17 00:29:08 -07:00
llm_engine.py        Implement AWQ quantization support for LLaMA (#1032)                                             2023-09-16 00:03:37 -07:00
ray_utils.py         fix "tansformers_module" ModuleNotFoundError when load model with trust_remote_code=True (#871)  2023-09-08 17:21:30 -07:00