From b94dce2f4d875752129bb3f29b9e5f1ec0d199b5 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Jukka=20Sepp=C3=A4nen?= <40791699+kijai@users.noreply.github.com>
Date: Thu, 24 Oct 2024 00:53:55 +0300
Subject: [PATCH] Update readme.md

---
 readme.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/readme.md b/readme.md
index c11a490..7f8b85f 100644
--- a/readme.md
+++ b/readme.md
@@ -9,7 +9,7 @@ https://github.com/user-attachments/assets/a714b70f-dcdb-4f91-8a3d-8da679a28d6e
 ## Requires flash_attn !
 Not sure if this can be worked around, I compiled a wheel for my Windows setup (Python 3.12, torch 2.5.0+cu124) that worked for me:
-https://huggingface.co/Kijai/Mochi_preview_comfy/blob/main/flash_attn-2.6.3-cp312-torch250cu125-win_amd64.whl
+https://huggingface.co/Kijai/Mochi_preview_comfy/blob/main/flash_attn-2.6.3-cp312-cp312-win_amd64.whl
 Python 3.10 / CUDA 12.4 / Torch 2.4.1: