Update README.md
README.md CHANGED

@@ -63,7 +63,7 @@ These capabilities form the foundation for **general, collaborative human–AI i
 ### Pre-Training at Trillion Scale

 The Ling 2.0 architecture was designed from the ground up for trillion-scale efficiency, guided by the **Ling Scaling Law** ([arXiv:2507.17702](https://arxiv.org/abs/2507.17702)).
-This ensures architectural and hyperparameter scalability even under **
+This ensures architectural and hyperparameter scalability even under **1e25–1e26 FLOPs** of compute.

 Key architectural innovations include:
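As a rough sanity check on the compute band quoted in the new line, the sketch below uses the generic dense-transformer rule of thumb C ≈ 6 · N_active · D_tokens (FLOPs ≈ 6 × active parameters × training tokens). This is a common approximation, not the Ling Scaling Law from arXiv:2507.17702, and the 50B active-parameter figure is a hypothetical placeholder rather than an official Ling 2.0 number.

```python
# Rough compute arithmetic for a 1e25-1e26 FLOPs training budget, using the
# generic approximation C ~= 6 * N_active * D_tokens (not the Ling Scaling Law).
# The active-parameter count below is a hypothetical placeholder.

def trainable_tokens(flops_budget: float, active_params: float) -> float:
    """Training tokens affordable at a given FLOPs budget, assuming C = 6 * N * D."""
    return flops_budget / (6.0 * active_params)

if __name__ == "__main__":
    active_params = 50e9  # assumed active parameters per token (e.g. one MoE forward pass)
    for budget in (1e25, 1e26):
        tokens = trainable_tokens(budget, active_params)
        print(f"{budget:.0e} FLOPs / {active_params / 1e9:.0f}B active params "
              f"-> ~{tokens / 1e12:.0f}T training tokens")
```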