xinyun/vllm
Mirror of https://git.datalinker.icu/vllm-project/vllm.git, synced 2025-12-10 02:05:01 +08:00
vllm/tests/v1/attention
Latest commit: cf56cf78b4 by Isotr0py, 2025-09-21 05:08:07 +00:00
[V1] Add sliding window support to Flex Attention backend (#24089)
Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
File                                  Last commit                                                           Date
test_attention_backends_selection.py  [Attention] Unify mamba and attention backend selection (#23171)     2025-08-25 09:09:36 +00:00
test_attention_backends.py            [V1] Add sliding window support to Flex Attention backend (#24089)   2025-09-21 05:08:07 +00:00
test_attention_splitting.py           [Core/DBO][1/N] Add Dual-Batch Overlap mechanism to VLLM (#23693)    2025-09-16 12:21:48 -04:00
test_chunked_local_attention.py       fix some typos (#24071)                                               2025-09-02 20:44:50 -07:00
test_mla_backends.py                  [Attention] FlashAttention MLA cudagraph support (#23958)            2025-09-08 22:05:26 +00:00
utils.py                              Add FLASHINFER_MLA to backend selector test (#24753)                  2025-09-12 22:30:07 +00:00
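The files above are ordinary pytest modules, so the directory can be run as a single suite. A minimal sketch using pytest's Python API (assuming pytest and vLLM's test dependencies are installed, and that it is executed from the repository root):

```python
# Minimal sketch: run the tests/v1/attention suite via pytest's Python API.
# Assumes pytest and vLLM's test dependencies are installed and that this
# script is executed from the repository root.
import sys

import pytest

if __name__ == "__main__":
    # Equivalent to `pytest -v tests/v1/attention` on the command line;
    # pytest.main() returns the suite's exit code.
    sys.exit(pytest.main(["-v", "tests/v1/attention"]))
```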