Add comprehensive model card for gORM-14B-merged model with metadata, paper link, and usage

#1
opened by nielsr (HF Staff)

This PR adds a comprehensive model card for the gORM-14B-merged model, which is part of the research presented in "Rethinking Reward Models for Multi-Domain Test-Time Scaling".

The update includes:

  • Relevant metadata: pipeline_tag: text-generation, library_name: transformers, and license: apache-2.0.
  • The paper title, a link to its Hugging Face page, and the abstract.
  • The GitHub repository link.
  • A sample usage code snippet directly from the GitHub README to demonstrate installation and inference.
  • The BibTeX citation.
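The metadata fields listed above would typically live in the YAML front matter at the top of README.md. A minimal sketch of what that block could look like, using exactly the three values named in this PR (field names follow the Hugging Face model card metadata convention; any other fields are omitted here):

```yaml
---
pipeline_tag: text-generation
library_name: transformers
license: apache-2.0
---
```

The front matter must be the very first thing in the file, delimited by `---` lines, for the Hub to pick up the tags.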

These additions significantly improve the model's discoverability and usability. Please review and merge if everything looks good!

Cannot merge
This branch has merge conflicts in the following files:
  • README.md
