Mirror of https://git.datalinker.icu/deepseek-ai/DeepSeek-V3.git (synced 2025-12-08 20:34:32 +08:00)
Update README.md
parent f6e34dd267
commit 2af264d674
@@ -42,7 +42,7 @@
9. [Contact](#9-contact)
-## 1. Introduction
+1. Introduction
We present DeepSeek-V3, a strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
To achieve efficient inference and cost-effective training, DeepSeek-V3 adopts Multi-head Latent Attention (MLA) and DeepSeekMoE architectures, which were thoroughly validated in DeepSeek-V2.