xinyun / vllm
mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2025-12-09 20:04:27 +08:00)
vllm / tests / compile
Latest commit: a5a790eea6 [Bugfix] Ensure calculated KV scales are applied in attention. (#27232)
Adrian Abeyta, Signed-off-by: adabeyta <aabeyta@redhat.com>
2025-11-10 23:42:37 +00:00
piecewise                         …
__init__.py                       …
backend.py                        …
silly_attention.py                …
test_aot_compile.py               …
test_async_tp.py                  …
test_basic_correctness.py         …
test_config.py                    …
test_decorator.py                 …
test_full_graph.py                [Bugfix] Ensure calculated KV scales are applied in attention. (#27232)   2025-11-10 23:42:37 +00:00
test_functionalization.py         …
test_fusion_all_reduce.py         …
test_fusion_attn.py               …
test_fusion.py                    …
test_fusions_e2e.py               …
test_multimodal_compile.py        …
test_noop_elimination.py          …
test_pass_manager.py              …
test_sequence_parallelism.py      …
test_silu_mul_quant_fusion.py     …
test_wrapper.py                   …