Mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2026-03-25 18:14:41 +08:00)
vllm / tests / tool_use
Latest commit: 422e793fa6 by Code Jesus — [Bugfix] Add support for <tool_call> format in streaming mode for XLAM Tool Parser (#22769)
Signed-off-by: Devon Peroutky <devon@kindo.ai>
2025-09-01 14:07:54 +08:00
__init__.py                                    …
conftest.py                                    …
test_chat_completion_request_validations.py    …
test_chat_completions.py                       …
test_glm4_moe_tool_parser.py                   …
test_jamba_tool_parser.py                      …
test_kimi_k2_tool_parser.py                    …
test_minimax_tool_parser.py                    …
test_parallel_tool_calls.py                    …
test_qwen3coder_tool_parser.py                 [Bugfix]: Qwen3 Coder Tool Parser (#23099)    2025-08-26 19:58:32 -07:00
test_seed_oss_tool_parser.py                   [fix] fix seed-oss-parser (#23560)    2025-08-25 23:16:36 -07:00
test_tool_calls.py                             …
test_tool_choice_required.py                   …
test_xlam_tool_parser.py                       [Bugfix] Add support for <tool_call> format in streaming mode for XLAM Tool Parser (#22769)    2025-09-01 14:07:54 +08:00
utils.py                                       …