BigDong committed · Commit 13abf7b · 1 Parent(s): 6bdb425

update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -43,13 +43,15 @@ English | [简体中文]()
 ## 📚 Introduction
 
 Ultra-FineWeb is a **large-scale, high-quality, and efficiently-filtered dataset**. We apply the proposed efficient verification-based high-quality filtering pipeline to the FineWeb and Chinese FineWeb datasets (source data from Chinese FineWeb-edu-v2, which includes IndustryCorpus2, MiChao, WuDao, SkyPile, WanJuan, ChineseWebText, TeleChat, and CCI3), resulting in the higher-quality Ultra-FineWeb-en dataset with approximately 1T tokens and the Ultra-FineWeb-zh dataset with approximately 120B tokens, collectively referred to as Ultra-FineWeb. ***Ultra-FineWeb*** serves as a core pre-training web dataset for the [MiniCPM4 Series](https://huggingface.co/collections/openbmb/minicpm-4-6841ab29d180257e940baa9b) models.
+- [Ultra-FineWeb](https://huggingface.co/datasets/openbmb/Ultra-FineWeb): Ultra-FineWeb, a **large-scale, high-quality, and efficiently-filtered dataset** with approximately 1T English tokens and 120B Chinese tokens. (**<-- you are here**)
+- [Ultra-FineWeb-classifier](https://huggingface.co/openbmb/Ultra-FineWeb-classifier): the Ultra-FineWeb classifier, used for filtering high-quality data from web corpora.
 
 ## 📢 What's New
 
 - **[2025.05.09]** The **Ultra-FineWeb** technical report is available on [arXiv](https://arxiv.org/abs/2505.05427). 🔥🔥🔥
 - **[2025.05.15]** **Ultra-FineWeb** tops the Hugging Face Datasets Trending list, reaching the #1 spot! ⭐️⭐️⭐️
 - **[2025.06.06]** The **Ultra-FineWeb-en** and **Ultra-FineWeb-zh** datasets are now available on Hugging Face, released alongside the [MiniCPM4 Series](https://huggingface.co/collections/openbmb/minicpm-4-6841ab29d180257e940baa9b) models.
-- HQ-Data Classifier, processing code, and evaluation code are coming soon... 🔜🚀
+- **[2025.06.16]** The **Ultra-FineWeb-classifier** is now available on Hugging Face: [openbmb/Ultra-FineWeb-classifier](https://huggingface.co/openbmb/Ultra-FineWeb-classifier). 🚀🚀🚀
 
 ## 💡 Highlights
 
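For readers arriving at the dataset from this update, a minimal loading sketch with the Hugging Face `datasets` library is given below. The repo id comes from the links in the diff; the `"en"` configuration name and `"train"` split are assumptions and may differ from what the dataset card actually defines.

```python
# Minimal sketch (not part of the commit): streaming a few records from
# Ultra-FineWeb with the Hugging Face `datasets` library.
# Assumption: the "en" configuration and "train" split names; check the
# dataset card for the actual configuration/split names.
from datasets import load_dataset

# streaming=True avoids downloading the full ~1T-token English subset up front.
ds = load_dataset("openbmb/Ultra-FineWeb", "en", split="train", streaming=True)

for i, example in enumerate(ds):
    print(example.keys())  # inspect the available fields of each record
    if i >= 2:             # look at the first three records only
        break
```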