[Bugfix] Safeguard against missing backend in AttentionBackendEnum (#28846)

Signed-off-by: jesse <szxfml@gmail.com>
Signed-off-by: Song Zhixin <szxfml@gmail.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Song Zhixin 2025-11-18 18:53:44 +08:00 committed by GitHub
parent 439368496d
commit 285eaa4285

@@ -310,7 +310,8 @@ class Attention(nn.Module, AttentionLayerBase):
             kv_sharing_target_layer_name,
             **extra_impl_args,
         )
-        self.backend = AttentionBackendEnum[self.attn_backend.get_name()]
+        backend_name = self.attn_backend.get_name()
+        self.backend = AttentionBackendEnum.__members__.get(backend_name)
         self.dtype = dtype
         # For cuda-alike (CUDA and ROCM) and cpu platforms, we control how
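
For context, a minimal standalone sketch of the failure mode this change guards against (the enum members below are stand-ins, not the real AttentionBackendEnum definition): subscripting an Enum by a name that is not a member raises KeyError, while the __members__ mapping's .get() returns None for unknown names, so a backend that is not registered in the enum no longer crashes the lookup.

from enum import Enum

# Stand-in enum; the real AttentionBackendEnum lives in vLLM and has
# different members.
class AttentionBackendEnum(Enum):
    FLASH_ATTN = "FLASH_ATTN"
    XFORMERS = "XFORMERS"

# Old lookup: raises KeyError when the backend name is not a member,
# e.g. for an out-of-tree backend.
try:
    AttentionBackendEnum["CUSTOM_OOT_BACKEND"]
except KeyError:
    pass  # this is the crash the fix avoids

# New lookup: __members__ maps member names to members, so .get()
# returns None instead of raising for unknown names.
assert AttentionBackendEnum.__members__.get("CUSTOM_OOT_BACKEND") is None
assert (
    AttentionBackendEnum.__members__.get("FLASH_ATTN")
    is AttentionBackendEnum.FLASH_ATTN
)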