[Bug] Fix shape issue for eplb expert weights (#27589)

Signed-off-by: yewentao256 <zhyanwentao@126.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Author: Wentao Ye (committed via GitHub, 2025-10-28 08:44:05 -04:00)
Parent: f58d9b6404
Commit: 0484b64248


@@ -1959,6 +1959,8 @@ class FusedMoE(CustomOp):
             if name not in NON_EXPERT_WEIGHTS
             and weight.shape != torch.Size([])
             and not name.startswith("_shared_experts.")
+            # exclude parameters from non-expert submodules (e.g. gate/shared)
+            and not name.startswith("_gate.")
         ]

     def set_eplb_state(
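The filtering logic this patch touches can be sketched in plain Python. Everything below is illustrative: the parameter names, the contents of `NON_EXPERT_WEIGHTS`, and the helper `select_expert_weights` are hypothetical stand-ins, not vLLM's actual code. Note that `torch.Size` is a tuple subclass, so `torch.Size([])` compares equal to the empty tuple `()`, which is why scalar (shapeless) parameters can be excluded with a simple shape check.

```python
# Hypothetical sketch of the expert-weight filter; names are illustrative.
NON_EXPERT_WEIGHTS = {"router.weight"}  # hypothetical contents

def select_expert_weights(named_params):
    """Keep only per-expert weight tensors eligible for EPLB rebalancing.

    named_params: iterable of (name, shape) pairs, where shape is a tuple
    standing in for torch.Size (torch.Size([]) == ()).
    """
    return [
        name
        for name, shape in named_params
        if name not in NON_EXPERT_WEIGHTS
        and shape != ()                        # skip scalar params (the shape fix)
        and not name.startswith("_shared_experts.")
        and not name.startswith("_gate.")      # skip gate submodule params
    ]

params = [
    ("w13_weight", (8, 14336, 4096)),  # per-expert weight: kept
    ("router.weight", (8, 4096)),      # listed in NON_EXPERT_WEIGHTS: dropped
    ("_gate.weight", (4096, 8)),       # gate submodule: dropped
    ("scale", ()),                     # scalar param: dropped by the shape check
]
print(select_expert_weights(params))   # → ['w13_weight']
```

Without the shape check, a scalar parameter would reach the later `weight.view(num_experts, -1)`-style reshaping and fail, which is the class of bug the commit title refers to.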