PETRA: Pretrained Evolutionary Transformer for SARS-CoV-2 Mutation Prediction
This repository contains the model weights for PETRA (Pretrained Evolutionary TRAnsformer), a novel transformer approach for predicting SARS-CoV-2 mutations, as presented in the paper PETRA: Pretrained Evolutionary Transformer for SARS-CoV-2 Mutation Prediction. Rather than modeling raw RNA sequences, PETRA is trained on evolutionary trajectories derived from phylogenetic trees, which mitigates sequencing noise and captures the hierarchical structure of viral evolution.
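To make the trajectory idea concrete, here is a minimal, hypothetical sketch (not the official pipeline) of how a root-to-leaf path through a phylogenetic tree can be flattened into a mutation-token sequence for a causal transformer. The `<SEP>` branch markers, the `Node` class, and the mutation names are illustrative assumptions:

```python
# Sketch only: turning a root-to-leaf path in a phylogenetic tree into a
# token sequence of mutation events. Token and helper names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    mutations: list                      # mutations acquired on the branch into this node
    children: list = field(default_factory=list)

def trajectory(leaf_path):
    """Concatenate branch mutations along a root-to-leaf path into one sequence."""
    tokens = ["<BOS>"]
    for node in leaf_path:
        tokens.extend(node.mutations)    # each mutation event is one token
        tokens.append("<SEP>")           # branch boundary marker (assumption)
    tokens.append("<EOS>")
    return tokens

# Example path: root -> intermediate node -> leaf
root = Node(mutations=[])
mid = Node(mutations=["C241T", "A23403G"])
leaf = Node(mutations=["G28881A"])
print(trajectory([root, mid, leaf]))
# ['<BOS>', '<SEP>', 'C241T', 'A23403G', '<SEP>', 'G28881A', '<SEP>', '<EOS>']
```

Training a causal language model on such sequences means the model learns to predict which mutations are acquired next, given the lineage's evolutionary history, rather than reconstructing noisy raw genomes.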
Abstract:
Since its emergence, SARS-CoV-2 has demonstrated a rapid and unpredictable evolutionary trajectory, characterized by the continual emergence of immune-evasive variants. This poses persistent challenges to public health and vaccine development. While large-scale generative pre-trained transformers (GPTs) have revolutionized the modeling of sequential data, their direct applications to noisy viral genomic sequences are limited. In this paper, we introduce PETRA (Pretrained Evolutionary TRAnsformer), a novel transformer approach based on evolutionary trajectories derived from phylogenetic trees rather than raw RNA sequences. This method effectively mitigates sequencing noise and captures the hierarchical structure of viral evolution. With a weighted training framework to address substantial geographical and temporal imbalances in global sequence data, PETRA excels in predicting future SARS-CoV-2 mutations, achieving a weighted recall@1 of 9.45% for nucleotide mutations and 17.10% for spike amino-acid mutations, compared to 0.49% and 6.64% respectively for the best baseline. PETRA also demonstrates its ability to aid in the real-time mutation prediction of major clades like 24F (XEC) and 25A (LP.8.1).
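As one illustration of the weighted-training idea mentioned in the abstract, the sketch below downweights over-represented (region, month) strata by inverse frequency, so that heavily sequenced regions and periods do not dominate training. The binning scheme and smoothing constant are assumptions, not the paper's exact formula:

```python
# Illustrative sketch of geo-temporal reweighting (assumption, not the paper's
# exact scheme): each training sequence is weighted inversely to how many
# sequences share its (region, month) stratum.
from collections import Counter

def sample_weights(metadata, alpha=1.0):
    """metadata: list of (region, month) tuples, one per training sequence."""
    counts = Counter(metadata)
    # Inverse-frequency weight per stratum, normalized to mean 1.
    raw = [1.0 / (counts[m] + alpha) for m in metadata]
    mean = sum(raw) / len(raw)
    return [w / mean for w in raw]

meta = [("USA", "2024-11")] * 8 + [("Kenya", "2024-11")] * 2
print(sample_weights(meta))  # Kenya samples receive ~3x the weight of USA samples
```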
Codebase: The official PyTorch implementation, built on Megatron-LM, is available at: https://github.com/xz-keg/PETra
Three model checkpoints are provided, each trained on a different snapshot of the data (see the download sketch after this list):
- A 116M-parameter model trained on UShER tree data collected before 2025-02-12.
- A medium-size model trained on data collected before 2025-07-16.
- A large 458M-parameter model trained on data collected before 2025-09-23.
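Since the codebase builds on Megatron-LM rather than the `transformers` API, a reasonable way to fetch a checkpoint is with `huggingface_hub`. The repo id below is a placeholder; substitute the actual repository for this model card:

```python
# Download a PETRA checkpoint for use with the Megatron-LM-based codebase.
# The repo id is a placeholder (assumption), not the confirmed repository name.
from huggingface_hub import snapshot_download

ckpt_dir = snapshot_download(
    repo_id="ORG/PETRA-20250212",              # placeholder repo id
    local_dir="checkpoints/petra-20250212",
)
print(f"Checkpoint downloaded to {ckpt_dir}")
# Load the weights with the official PETra codebase
# (https://github.com/xz-keg/PETra); these are not plain `transformers` models.
```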
Performance
Performance of the 20250212 model, evaluated on sequences collected after 2025-02-12 and available before 2025-07-16.
Nucleotide mutation prediction results for PETRA
We report average and weighted recall@1, @10, and @100. In the weighted measure, sequences are weighted by their representativeness; a sketch of the metric follows the table.
| Method | Average Recall @1 | @10 | @100 | Weighted Recall @1 | @10 | @100 |
|---|---|---|---|---|---|---|
| Random Guess | 0.00% | 0.01% | 0.08% | 0.00% | 0.01% | 0.08% |
| Bloom | 0.45% | 1.50% | 9.15% | 0.49% | 1.48% | 9.41% |
| PETRA | 11.34% | 16.92% | 22.64% | 9.45% | 14.20% | 19.72% |
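For concreteness, here is a minimal sketch of how a weighted recall@k of this kind can be computed: for each evaluation sequence, the model ranks candidate mutations, and credit is given for true next mutations appearing in the top k, averaged with per-sequence weights. The function name and the exact treatment of multiple true mutations are assumptions; the paper's precise definition of representativeness weights may differ:

```python
# Sketch of weighted recall@k (assumption: per-sequence fraction of true next
# mutations recovered in the top-k ranked candidates, then a weighted average).
def recall_at_k(ranked_predictions, true_mutations, weights=None, k=10):
    """ranked_predictions: one ranked candidate list per sequence.
    true_mutations: one set of actually observed next mutations per sequence."""
    if weights is None:
        weights = [1.0] * len(true_mutations)   # unweighted = average recall
    hit_mass = total = 0.0
    for preds, truth, w in zip(ranked_predictions, true_mutations, weights):
        hits = sum(1 for m in preds[:k] if m in truth)
        hit_mass += w * hits / max(len(truth), 1)
        total += w
    return hit_mass / total

preds = [["S:F456L", "S:A475V", "S:K478T"]]
truth = [{"S:F456L", "S:T22N"}]
print(recall_at_k(preds, truth, k=1))  # 0.5: one of two true mutations in top-1
```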
Spike amino-acid mutation prediction results for PETRA
We report average and weighted recall@1 and @10. In the weighted measure, sequences are weighted by their representativeness, as above.
| Method | Average Recall @1 | @10 | Weighted Recall @1 | @10 |
|---|---|---|---|---|
| Random Guess | 0.01% | 0.13% | 0.01% | 0.13% |
| Bloom | 6.26% | 12.63% | 6.64% | 13.08% |
| PETRA | 17.84% | 25.69% | 17.10% | 25.58% |
Citation
If you find PETRA useful for your research, please consider citing the paper:
@article{luo2025petra,
  title={PETRA: Pretrained Evolutionary Transformer for SARS-CoV-2 Mutation Prediction},
  author={Luo, J. and others},
  journal={Hugging Face Papers},
  year={2025},
  url={https://huggingface.co/papers/2511.03976}
}