zzqsmall committed (verified)
Commit 1b86b66 · 1 Parent(s): bc6505f

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -63,7 +63,7 @@ These capabilities form the foundation for **general, collaborative human–AI i
 ### Pre-Training at Trillion Scale
 
 The Ling 2.0 architecture was designed from the ground up for trillion-scale efficiency, guided by the **Ling Scaling Law** ([arXiv:2507.17702](https://arxiv.org/abs/2507.17702)).
-This ensures architectural and hyperparameter scalability even under **10²⁵–10²⁶ FLOPs** of compute.
+This ensures architectural and hyperparameter scalability even under **1e25–1e26 FLOPs** of compute.
 
 Key architectural innovations include:
 