[Model] Fix lmhead init bug of bailing_moe (#28777)

Signed-off-by: hwhaokun <haokun0405@163.com>
Co-authored-by: zhaozx-cn <zhaozx2116@163.com>
Co-authored-by: Jee Jee Li <pandaleefree@gmail.com>
Authored by hwhaokun on 2025-11-15 21:44:12 +08:00, committed by GitHub
parent 89d3679221
commit 085a525332


@@ -599,7 +599,7 @@ class BailingMoeForCausalLM(nn.Module, SupportsPP, SupportsLoRA):
                 config.vocab_size,
                 config.hidden_size,
                 quant_config=quant_config,
-                prefix=f"{prefix}.lm_head",
+                prefix=maybe_prefix(prefix, "lm_head"),
             )
             self.logits_processor = LogitsProcessor(config.vocab_size)
         else:
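
Context for the fix: when the model is instantiated at the top level, `prefix` is the empty string, so the old f-string `f"{prefix}.lm_head"` produced the malformed module name ".lm_head" with a leading dot, which can break prefix-based lookups (e.g., weight-name matching and quantization overrides). `maybe_prefix`, a helper in vllm.model_executor.models.utils, only joins the two parts with a dot when the prefix is non-empty. Below is a minimal sketch of the helper's behavior, not the verbatim vLLM source:

def maybe_prefix(prefix: str, name: str) -> str:
    # Join prefix and name with a dot, but only when prefix is non-empty.
    # Sketch of the helper from vllm.model_executor.models.utils.
    return name if not prefix else f"{prefix}.{name}"

# With an empty top-level prefix, the old f-string left a leading dot:
assert f"{''}.lm_head" == ".lm_head"             # buggy: malformed module name
assert maybe_prefix("", "lm_head") == "lm_head"  # fixed: clean name

# With a non-empty parent prefix, both forms agree:
assert maybe_prefix("model", "lm_head") == "model.lm_head"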