vllm/tests/tool_use
Latest commit: 3dbb215b38 by 代君, [Frontend][Feature] support tool calling for internlm/internlm2_5-7b-chat model (#8405), 2024-10-04 10:36:39 +08:00
File | Last commit | Date
__init__.py | [Feature] OpenAI-Compatible Tools API + Streaming for Hermes & Mistral models (#5649) | 2024-09-04 13:18:13 -07:00
conftest.py | [Feature] OpenAI-Compatible Tools API + Streaming for Hermes & Mistral models (#5649) | 2024-09-04 13:18:13 -07:00
test_chat_completion_request_validations.py | fix validation: Only set tool_choice auto if at least one tool is provided (#8568) | 2024-09-26 16:23:17 -07:00
test_chat_completions.py | [Feature] Add support for Llama 3.1 and 3.2 tool use (#8343) | 2024-09-26 17:01:42 -07:00
test_parallel_tool_calls.py | [BugFix] Enforce Mistral ToolCall id constraint when using the Mistral tool call parser (#9020) | 2024-10-03 16:44:52 +08:00
test_tool_calls.py | [BugFix] Enforce Mistral ToolCall id constraint when using the Mistral tool call parser (#9020) | 2024-10-03 16:44:52 +08:00
utils.py | [Frontend][Feature] support tool calling for internlm/internlm2_5-7b-chat model (#8405) | 2024-10-04 10:36:39 +08:00