xinyun/vllm
Mirror of https://git.datalinker.icu/vllm-project/vllm.git, last synced 2026-03-27 07:20:19 +08:00.
vllm/tests/kernels/attention
Latest commit: 4c23690f43 by Matthew Bonanni, 2025-11-18 20:06:21 -08:00
[Attention] FlashAttention ViT support, make default backend (#28763)
Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>
conftest.py
test_aiter_flash_attn.py
test_attention_selector.py
test_attention.py
test_cache.py
test_cascade_flash_attn.py
test_cpu_attn.py
test_cutlass_mla_decode.py
test_deepgemm_attention.py
test_flash_attn.py                    [Attention] FlashAttention ViT support, make default backend (#28763)  2025-11-18 20:06:21 -08:00
test_flashinfer_mla_decode.py
test_flashinfer_trtllm_attention.py
test_flashinfer.py
test_flashmla_sparse.py
test_flashmla.py
test_lightning_attn.py
test_merge_attn_states.py
test_mha_attn.py                      [Attention] FlashAttention ViT support, make default backend (#28763)  2025-11-18 20:06:21 -08:00
test_mla_decode_cpu.py
test_pack_unpack_triton.py
test_prefix_prefill.py
test_rocm_attention_selector.py
test_triton_decode_attention.py
test_triton_unified_attention.py
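
These files follow the standard pytest layout (a conftest.py plus test_*.py modules), so an individual file can normally be run with pytest from the repository root. A minimal sketch, assuming pytest is installed and that the hardware backend a given test targets (e.g. a CUDA GPU for the FlashAttention tests) is available; the flags shown are illustrative, not required:

    # Minimal sketch: run one attention-kernel test file programmatically.
    # The path is taken from the listing above; which tests can actually
    # run depends on the installed backends and hardware (an assumption here).
    import pytest

    pytest.main(["tests/kernels/attention/test_flash_attn.py", "-v"])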