xinyun/vllm
mirror of https://git.datalinker.icu/vllm-project/vllm.git synced 2025-12-21 23:15:01 +08:00
vllm/vllm/entrypoints/openai
Latest commit: d9e98f42e4 by xwjiang2010 | [vlm] Remove vision language config. (#6089) | 2024-07-03 22:14:16 +00:00
Signed-off-by: Xiaowei Jiang <xwjiang2010@gmail.com>
Co-authored-by: Roger Wang <ywang@roblox.com>
__init__.py           | Change the name to vLLM (#150)                                                           | 2023-06-17 03:07:40 -07:00
api_server.py         | [VLM] Remove image_input_type from VLM config (#5852)                                    | 2024-07-02 07:57:09 +00:00
cli_args.py           | [Frontend] Add FlexibleArgumentParser to support both underscore and dash in names (#5718) | 2024-06-20 17:00:13 -06:00
protocol.py           | [Frontend] Add template related params to request (#5709)                                | 2024-07-01 23:01:57 -07:00
run_batch.py          | [Misc] Remove #4789 workaround left in vllm/entrypoints/openai/run_batch.py (#5756)      | 2024-06-22 03:33:12 +00:00
serving_chat.py       | [vlm] Remove vision language config. (#6089)                                             | 2024-07-03 22:14:16 +00:00
serving_completion.py | [Core] Optimize block_manager_v2 vs block_manager_v1 (to make V2 default) (#5602)        | 2024-07-01 20:10:37 -07:00
serving_embedding.py  | [Frontend]: Support base64 embedding (#5935)                                             | 2024-06-30 23:53:00 +08:00
serving_engine.py     | [Core] Dynamic image size support for VLMs (#5276)                                       | 2024-07-02 20:34:00 -07:00