xinyun / vllm, a mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2025-12-09 10:54:55 +08:00)
vllm / .buildkite

Latest commit: b5bae42f91 by Kunshang Ji (Signed-off-by: Kunshang Ji <kunshang.ji@intel.com>), 2025-10-30 11:17:13 +08:00
    [XPU] Update latest IPEX 2.8 release (#27735)
lm-eval-harness        [CI/Build] Update Llama4 eval yaml (#27070)                                          2025-10-17 04:59:47 +00:00
nightly-benchmarks     add SLA information into comparison graph for vLLM Benchmark Suite (#25525)          2025-10-23 08:04:59 +00:00
scripts                [XPU] Update latest IPEX 2.8 release (#27735)                                         2025-10-30 11:17:13 +08:00
check-wheel-size.py    [CI] Raise VLLM_MAX_SIZE_MB to 500 due to failing Build wheel - CUDA 12.9 (#26722)   2025-10-14 10:52:05 -07:00
generate_index.py      [ci/build] Fix abi tag for aarch64 (#23329)                                          2025-08-21 23:32:55 +08:00
release-pipeline.yaml  Fix AArch64 CPU Docker pipeline (#27331)                                             2025-10-24 05:11:01 -07:00
test-amd.yaml          [CI/Build] Move pre-commit only scripts to tools/pre_commit (#27657)                 2025-10-29 08:04:33 +00:00
test-pipeline.yaml     [CI/Build] Test torchrun with 8 cards (#27548)                                        2025-10-29 10:26:06 -07:00