vllm / vllm / entrypoints / openai

Latest commit: 03d37f2441 [Fix] Add args for mTLS support (#3430)
Dan Clark, 2024-03-15 09:56:13 -07:00
Co-authored-by: declark1 <daniel.clark@ibm.com>
File                    Last commit                                          Date
__init__.py             Change the name to vLLM (#150)                       2023-06-17 03:07:40 -07:00
api_server.py           [Fix] Add args for mTLS support (#3430)              2024-03-15 09:56:13 -07:00
protocol.py             Add guided decoding for OpenAI API server (#2819)    2024-02-29 22:13:08 +00:00
serving_chat.py         Re-enable the 80 char line width limit (#3305)       2024-03-10 19:49:14 -07:00
serving_completion.py   Re-enable the 80 char line width limit (#3305)       2024-03-10 19:49:14 -07:00
serving_engine.py       Re-enable the 80 char line width limit (#3305)       2024-03-10 19:49:14 -07:00
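The most recent change to api_server.py adds arguments for mTLS support (#3430). As a rough, client-side illustration of what mutual TLS means for the OpenAI-compatible server this directory implements, the sketch below sends a chat request while presenting a client certificate, using the openai and httpx packages. The host, certificate paths, and model name are placeholders, not taken from this listing; the server-side flag names come from api_server.py itself and are not reproduced here.

    # A minimal sketch, assuming api_server.py is running behind TLS with
    # client-certificate verification enabled (the feature referenced by #3430).
    import httpx
    from openai import OpenAI

    # Present a client certificate and verify the server against a private CA.
    mtls_client = httpx.Client(
        cert=("client.crt", "client.key"),  # placeholder client cert + key
        verify="ca.crt",                    # placeholder CA bundle for the server cert
    )

    client = OpenAI(
        base_url="https://localhost:8000/v1",  # assumed server address
        api_key="EMPTY",                       # vLLM does not check the key by default
        http_client=mtls_client,               # route requests through the mTLS client
    )

    completion = client.chat.completions.create(
        model="meta-llama/Llama-2-7b-chat-hf",  # example model name
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(completion.choices[0].message.content)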