xinyun / vllm
Mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2025-12-25 10:26:30 +08:00)
vllm / vllm / platforms
Latest commit: e2f56c309d [CPU] Update torch 2.9.1 for CPU backend (#29664)
Li, Jiang (Signed-off-by: jiang1.li <jiang1.li@intel.com>), 2025-11-28 13:37:54 +00:00
File            Last commit                                                                        Date
__init__.py     [TPU] Rename path to tpu platform (#28452)                                         2025-11-11 19:16:47 +00:00
cpu.py          [CPU] Update torch 2.9.1 for CPU backend (#29664)                                  2025-11-28 13:37:54 +00:00
cuda.py         [Bugfix][MM encoder] Fix ViT attention backend resolving for Turing GPU (#29614)   2025-11-27 19:17:37 +00:00
interface.py    [Attention] Update attention imports (#29540)                                      2025-11-27 11:19:09 -05:00
rocm.py         [Attention] Update attention imports (#29540)                                      2025-11-27 11:19:09 -05:00
tpu.py          [Attention] Update attention imports (#29540)                                      2025-11-27 11:19:09 -05:00
xpu.py          [Attention] Update attention imports (#29540)                                      2025-11-27 11:19:09 -05:00
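
The layout above, a shared interface.py alongside one module per hardware backend (cpu, cuda, rocm, tpu, xpu), suggests the common pattern of an abstract platform interface with per-backend implementations. The sketch below is a minimal illustration of that pattern only; all class names, method names, and return values (Platform, CpuPlatform, CudaPlatform, resolve_platform, get_attn_backend) are hypothetical and are not vLLM's actual platform API.

# Minimal illustrative sketch of the interface-plus-backends pattern suggested
# by the directory layout. All names here are hypothetical, not vLLM's real API.
from abc import ABC, abstractmethod


class Platform(ABC):
    """Abstract interface that each hardware backend implements."""

    @abstractmethod
    def device_name(self) -> str:
        ...

    @abstractmethod
    def get_attn_backend(self) -> str:
        """Return the name of the attention backend to use on this platform."""
        ...


class CpuPlatform(Platform):
    def device_name(self) -> str:
        return "cpu"

    def get_attn_backend(self) -> str:
        return "torch_sdpa"  # placeholder value for illustration


class CudaPlatform(Platform):
    def device_name(self) -> str:
        return "cuda"

    def get_attn_backend(self) -> str:
        return "flash_attn"  # placeholder value for illustration


def resolve_platform() -> Platform:
    """Pick a concrete platform based on what the runtime environment offers."""
    try:
        import torch
        if torch.cuda.is_available():
            return CudaPlatform()
    except ImportError:
        pass
    return CpuPlatform()


if __name__ == "__main__":
    p = resolve_platform()
    print(p.device_name(), p.get_attn_backend())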