xinyun / vllm
Mirror of https://git.datalinker.icu/vllm-project/vllm.git, synced 2026-03-18 05:07:07 +08:00.
vllm / examples / offline_inference
Latest commit: 03fe18ae0f, "[VLM] Add TP support for Phi-4-MM (#14453)" by Isotr0py (Signed-off-by: Isotr0py <2037008807@qq.com>), 2025-03-08 05:57:14 -08:00.
Directories:
basic/
openai/
profiling_tpu/
Files:
audio_language.py (last changed by "[VLM] Add TP support for Phi-4-MM (#14453)", 2025-03-08 05:57:14 -08:00)
chat_with_tools.py
cpu_offload_lmcache.py
data_parallel.py
disaggregated_prefill_lmcache.py
disaggregated_prefill.py
distributed.py
encoder_decoder_multimodal.py
encoder_decoder.py
llm_engine_example.py
lora_with_quantization_inference.py
mlpspeculator.py
multilora_inference.py
neuron_int8_quantization.py
neuron.py
pixtral.py
prefix_caching.py
prithvi_geospatial_mae.py
profiling.py
rlhf_colocate.py
rlhf_utils.py
rlhf.py
save_sharded_state.py
simple_profiling.py
structured_outputs.py
torchrun_example.py
tpu.py
vision_language_embedding.py
vision_language_multi_image.py
vision_language.py