Update gpu.rocm.inc.md to add support for AMD Ryzen AI MAX / AI 300 Series (gfx1151, gfx1150) (#28308)
Signed-off-by: Hamid Mukhtar <15519013+hammmmy@users.noreply.github.com>
This commit is contained in: parent d0c7792004, commit 61d25dc44b
@@ -11,9 +11,10 @@ vLLM supports AMD GPUs with ROCm 6.3 or above, and torch 2.8.0 and above.
 # --8<-- [end:installation]
 # --8<-- [start:requirements]
 
-- GPU: MI200s (gfx90a), MI300 (gfx942), MI350 (gfx950), Radeon RX 7900 series (gfx1100/1101), Radeon RX 9000 series (gfx1200/1201)
+- GPU: MI200s (gfx90a), MI300 (gfx942), MI350 (gfx950), Radeon RX 7900 series (gfx1100/1101), Radeon RX 9000 series (gfx1200/1201), Ryzen AI MAX / AI 300 Series (gfx1151/1150)
 - ROCm 6.3 or above
 - MI350 requires ROCm 7.0 or above
+- Ryzen AI MAX / AI 300 Series requires ROCm 7.0.2 or above
 
 # --8<-- [end:requirements]
 # --8<-- [start:set-up-using-python]
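Not part of the diff itself, but a quick way to sanity-check the requirements above on a local machine is to read the gfx target and HIP runtime version from a ROCm build of PyTorch. This is a minimal sketch, assuming `torch.version.hip` and the `gcnArchName` device property are available (they exist only on ROCm builds of recent PyTorch); the `SUPPORTED_GFX` set is simply the list from the requirements transcribed by hand and is not an official API.

```python
# Sketch: check whether the local GPU and HIP runtime look compatible with
# the requirements listed above. Assumes a ROCm build of PyTorch.
import torch

SUPPORTED_GFX = {
    "gfx90a",               # MI200s
    "gfx942",               # MI300
    "gfx950",               # MI350 (ROCm 7.0 or above)
    "gfx1100", "gfx1101",   # Radeon RX 7900 series
    "gfx1200", "gfx1201",   # Radeon RX 9000 series
    "gfx1150", "gfx1151",   # Ryzen AI MAX / AI 300 Series (ROCm 7.0.2 or above)
}

if torch.version.hip is None:
    raise SystemExit("This PyTorch build was not compiled against ROCm/HIP.")

props = torch.cuda.get_device_properties(0)
# gcnArchName can look like "gfx942:sramecc+:xnack-"; keep only the gfx target.
arch = props.gcnArchName.split(":")[0]

print(f"HIP runtime: {torch.version.hip}")
print(f"GPU 0: {props.name} ({arch})")
if arch not in SUPPORTED_GFX:
    print("Warning: this gfx target is not in the supported list above.")
```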