Mirror of https://git.datalinker.icu/vllm-project/vllm.git (last synced 2026-04-05 05:27:04 +08:00).
vllm / examples / online_serving

Latest commit: f5f7f00cd9 by Ce Gao, "[Bugfix][Structured Output] Support outlines engine with reasoning outputs for DeepSeek R1" (#14114), 2025-03-06 03:49:20 +00:00
chart-helm/  …
opentelemetry/  Deprecate best_of Sampling Parameter in anticipation for vLLM V1 (#13997)  2025-03-05 20:22:43 +00:00
prometheus_grafana/  fix typo of grafana dashboard, with correct datasource (#13668)  2025-02-21 18:21:05 +00:00
api_client.py  Update deprecated Python 3.8 typing (#13971)  2025-03-02 17:34:51 -08:00
cohere_rerank_client.py  …
disaggregated_prefill.sh  …
gradio_openai_chatbot_webserver.py  …
gradio_webserver.py  …
jinaai_rerank_client.py  …
multi-node-serving.sh  [Misc] Adding script to setup ray for multi-node vllm deployments (#12913)  2025-02-20 21:16:40 -08:00
openai_chat_completion_client_for_multimodal.py  …
openai_chat_completion_client_with_tools.py  …
openai_chat_completion_client.py  …
openai_chat_completion_structured_outputs_with_reasoning.py  [Bugfix][Structured Output] Support outlines engine with reasoning outputs for DeepSeek R1 (#14114)  2025-03-06 03:49:20 +00:00
openai_chat_completion_structured_outputs.py  …
openai_chat_completion_with_reasoning_streaming.py  …
openai_chat_completion_with_reasoning.py  …
openai_chat_embedding_client_for_multimodal.py  [Misc] fixed 'required' is an invalid argument for positionals (#13948)  2025-02-27 09:06:49 +00:00
openai_completion_client.py  …
openai_cross_encoder_score.py  …
openai_embedding_client.py  Update deprecated Python 3.8 typing (#13971)  2025-03-02 17:34:51 -08:00
openai_pooling_client.py  …
openai_transcription_client.py  …
run_cluster.sh  …
sagemaker-entrypoint.sh  …