---
license: mit
language:
- en
tags:
- code
pretty_name: ArcherCodeR
size_categories:
- 1K<n<10K
task_categories:
- text-generation
viewer: false
---
<div align="center">
# ✨ ArcherCodeR
<div>
🏹️ Reinforcement Learning for Enhanced Code Reasoning in LLMs 🎯
</div>
</div>
<div>
<br>
<div align="center">
[GitHub](https://github.com/wizard-III/ArcherCodeR)
[Model](https://huggingface.co/wizardII/ArcherCodeR-1.5B)
[Dataset](https://huggingface.co/datasets/wizardII/ArcherCodeR-Dataset)
[W&B](https://wandb.ai/wangjkpkucs-peking-university/ArcherCodeR?nw=nwuserwangjkpkucs)
[Zhihu](https://zhuanlan.zhihu.com/p/1918765619614057424)
</div>
## Overview
[`ArcherCodeR-Dataset`](https://huggingface.co/datasets/wizardII/ArcherCodeR-Dataset) is **a dataset of 6.7K verifiable, challenging, and diverse coding questions**. It is used to train the **`ArcherCodeR`** series of code reasoning models, which are trained with large-scale rule-based reinforcement learning on carefully designed datasets and training recipes.
We select, clean, and curate coding problems from open-source datasets, including:
- [agentica-org/DeepScaleR-Preview-Dataset](https://huggingface.co/datasets/agentica-org/DeepScaleR-Preview-Dataset)
- [deepmind/code_contests](https://huggingface.co/datasets/deepmind/code_contests)
- [open-r1/codeforces](https://huggingface.co/datasets/open-r1/codeforces)
### 🔍 Key Notes:
- Both code_contests (DeepMind) and codeforces (Open-r1) datasets include regenerated test cases to mitigate false positives.
- Significant prompt duplication exists across sources. When duplicates occur, code_contests or codeforces data takes priority.
For more details on data processing, please refer to our [Zhihu article](https://zhuanlan.zhihu.com/p/1918765619614057424).
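The priority rule above can be sketched as a simple source-ranked deduplication pass. This is a hypothetical illustration, not the authors' actual pipeline: the field names (`prompt`, `source`), the source labels, and the exact matching key are assumptions.

```python
# Hypothetical sketch of priority-based deduplication: when the same prompt
# appears in multiple sources, keep the copy from the preferred source.
# Field names and priority values are illustrative assumptions.

PRIORITY = {"code_contests": 0, "codeforces": 0, "deepscaler": 1}  # lower = preferred

def dedup(examples):
    """Keep one example per prompt, preferring higher-priority sources."""
    best = {}
    for ex in examples:
        key = ex["prompt"].strip()  # naive exact-match key; real pipelines may normalize further
        prev = best.get(key)
        if prev is None or PRIORITY[ex["source"]] < PRIORITY[prev["source"]]:
            best[key] = ex
    return list(best.values())

examples = [
    {"prompt": "Sum two numbers.", "source": "deepscaler"},
    {"prompt": "Sum two numbers.", "source": "code_contests"},
    {"prompt": "Reverse a string.", "source": "deepscaler"},
]
kept = dedup(examples)
```

Here the duplicated prompt survives only in its `code_contests` copy, while prompts unique to a single source are kept unchanged.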
## Technical Report
The technical report will be released soon.