ParScale
AI & ML interests: None defined yet.
Collections (4)
ParScale-1.8B base models: trained on 1T high-quality tokens, demonstrating strong competitiveness among existing SOTA small models (<2B).
ParScale-1.8B instruct models: derived from the ParScale-1.8B base models and trained on SmolTalk-1M to enable conversational capabilities.
Models (67)

ParScale/ParScale-1.8B-P1-Inst (Text Generation)
ParScale/ParScale-1.8B-P2-Inst (Text Generation)
ParScale/ParScale-1.8B-P4-Inst (Text Generation)
ParScale/ParScale-1.8B-P8-Inst (Text Generation)
ParScale/ParScale-1.8B-P1 (Text Generation)
ParScale/ParScale-1.8B-P2 (Text Generation)
ParScale/ParScale-1.8B-P4 (Text Generation)
ParScale/ParScale-Qwen-3B-P2-Python (Text Generation)
ParScale/ParScale-Qwen-3B-P4-Python (Text Generation)
ParScale/ParScale-Qwen-3B-P8-Python (Text Generation)
Datasets (0)
None public yet.