xinyun/vllm
Mirror of https://git.datalinker.icu/vllm-project/vllm.git, synced 2025-12-24 08:25:01 +08:00
Directory: vllm/attention
Latest commit: 4e2d95e372 by wangshuai09, [Hardware][ROCM] using current_platform.is_rocm (#9642), Signed-off-by: wangshuai09 <391746016@qq.com>, 2024-10-28 04:07:00 +00:00
Name          Last commit                                                           Date
backends      Fix: MI100 Support By Bypassing Custom Paged Attention (#9560)       2024-10-26 12:12:57 +00:00
ops           [Hardware][ROCM] using current_platform.is_rocm (#9642)              2024-10-28 04:07:00 +00:00
__init__.py   [Core] Add AttentionState abstraction (#7663)                        2024-08-20 18:50:45 +00:00
layer.py      [Kernel] Support sliding window in flash attention backend (#9403)   2024-10-20 10:57:52 -07:00
selector.py   [Hardware][ROCM] using current_platform.is_rocm (#9642)              2024-10-28 04:07:00 +00:00
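
Three of the entries above share commit #9642, which routes hardware checks through current_platform.is_rocm. As a rough illustration of how such a probe can feed the kind of backend choice selector.py performs, here is a minimal sketch; it assumes only that vllm.platforms exposes a current_platform object with an is_rocm() method, as the commit title states. The choose_backend() helper and the returned backend names are hypothetical, not vLLM's real API.

```python
# A minimal sketch, not vLLM's actual selector. Only the probe
# current_platform.is_rocm() is taken from the commit titles (#9642);
# choose_backend() and the returned backend names are hypothetical.
from vllm.platforms import current_platform


def choose_backend() -> str:
    """Pick an attention backend name from the detected platform."""
    if current_platform.is_rocm():
        # ROCm (AMD) path; per #9560, some devices such as the MI100
        # bypass the custom paged-attention kernel on this path.
        return "ROCM_FLASH"  # hypothetical backend name
    return "FLASH_ATTN"  # hypothetical backend name


if __name__ == "__main__":
    print(f"Selected attention backend: {choose_backend()}")
```

The appeal of the pattern is that a single platform object owns all device detection, so call sites spread across ops, backends, and selector.py stay consistent instead of each rolling its own check.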