MACE-MH-1: Multi-Head Foundation Model for Atomistic Materials Chemistry


Highlights

MACE-MH-1 is a foundation machine-learning interatomic potential (MLIP) that bridges molecular, surface, and materials chemistry through cross-domain learning:

  • Unified cross-domain capability spanning inorganic crystals, molecular systems, surface chemistry, and reactive organic chemistry with a single model
  • State-of-the-art performance across materials, molecular crystals, surfaces, and molecular benchmarks with a global performance score of 0.862
  • Enhanced MACE architecture with improved weight sharing across chemical elements and non-linear tensor decomposition in the product basis


Model Overview

MACE-MH-1 has the following features:

  • Type: E(3)-equivariant graph neural network for interatomic potentials
  • Architecture: MACE
  • Interaction Blocks: Non-linear
  • Training Stages: Pre-training on OMAT-24 (100M inorganic crystals) + Multi-head fine-tuning on diverse datasets
  • Hyper-Parameters: 512 node channels, 128 edge channels, L=2, max_ell=3, 2 layers
  • Chemical Coverage: 89 elements
  • Cutoff Radius: 6 Å
  • Multiple Heads: OMAT PBE (main), OMOL (ωB97M-VV10), OC20 (surfaces), SPICE, RGD1, MPTraj, MATPES (r²SCAN)

For more details, please refer to the paper, GitHub repository, and MACE foundations.

Performance

Materials Benchmarks (PBE+U Reference)

| Benchmark | Metric | MACE-MH-1 | ORB-v3 | UMA-S-1.1 |
|---|---|---|---|---|
| Phonon BZ | MAE (K) | 5 | 15 | 9 |
| Phonon ω_avg | MAE (K) | 3 | 5 | 4 |
| Phonon ω_min | MAE (K) | 11 | 29 | 21 |
| Phonon ω_max | MAE (K) | 12 | 12 | 11 |
| Entropy (300 K) | MAE (J/mol·K) | 8 | 13 | 7 |
| Helmholtz Free Energy (300 K) | MAE (kJ/mol) | 2 | 3 | 2 |
| Heat Capacity | MAE (J/mol·K) | 3 | 4 | 3 |
| Bulk Modulus | MAE (GPa) | 12.49 | 7.18 | 14.33 |
| Shear Modulus | MAE (GPa) | 7.95 | 8.03 | 8.18 |
| Thermal Conductivity | RMSE (W/m·K) | 0.24 | 0.21 | 0.20 |
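
For illustration, bulk moduli like those in the table can be estimated from an equation-of-state fit with the calculator. The following is a minimal sketch, not necessarily the benchmark's exact protocol; "MACE-MH-1.model" is a placeholder for the path to the downloaded model file:

import numpy as np
from ase.build import bulk
from ase.eos import EquationOfState
from ase.units import GPa
from mace.calculators import mace_mp

calc = mace_mp(model="MACE-MH-1.model", default_dtype="float64", device="cuda")

# Energy-volume curve for silicon around its equilibrium lattice constant
atoms = bulk('Si', 'diamond', a=5.43)
volumes, energies = [], []
for scale in np.linspace(0.97, 1.03, 7):
    strained = atoms.copy()
    strained.set_cell(atoms.cell[:] * scale, scale_atoms=True)
    strained.calc = calc
    volumes.append(strained.get_volume())
    energies.append(strained.get_potential_energy())

# Fit an equation of state and convert the bulk modulus from eV/Å^3 to GPa
eos = EquationOfState(volumes, energies)
v0, e0, B = eos.fit()
print(f"Bulk modulus: {B / GPa:.1f} GPa")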

Molecular Crystal Benchmarks

| Benchmark | Metric | MACE-MH-1-OMAT-D3 | ORB-v3 | UMA-S-1.1-OMAT-D3 |
|---|---|---|---|---|
| X23 Formation Energy | MAE (kJ/mol) | 15.82 | 28.76 | 27.99 |
| Ice Polymorphs (DMC) | MAE (meV) | 11.23 | 138.44 | 310.82 |

Surface Benchmarks

| Benchmark | Metric | MACE-MH-1-OMAT-D3 | ORB-v3-D3 | UMA-S-1.1-OMAT-D3 |
|---|---|---|---|---|
| S24 Adsorption | MAE (eV) | 0.095 | 0.174 | 0.329 |
| OC20 Adsorption | MAE (eV) | 0.138 | 0.159 | 0.172 |
| OC20 Correlation | Pearson's r | 0.98 | 0.974 | 0.97 |

Molecular Benchmarks

| Benchmark | Metric | MACE-MH-1-OMAT-D3 | ORB-v3-D3 | UMA-S-1.1-OMAT-D3 |
|---|---|---|---|---|
| Wiggle150 | MAE (kcal/mol) | 4.80 | 7.65 | 6.60 |
| GMTKN55 Overall | WTMAD (kcal/mol) | 11.23 | 22.30 | 30.83 |
| PLF547 (proteins) | MAE (kcal/mol) | 0.626 | 1.829 | 2.935 |
| S30L (host-guest) | MAE (kcal/mol) | 10.13 | 13.64 | 15.14 |

Physicality Tests

| Test | Metric | MACE-MH-1 | ORB-v3 | UMA-S-1.1 |
|---|---|---|---|---|
| Slab Extensivity | Δ (meV) | 0.0 | -709.7 | -453.8 |
| H-Atom Additivity | max \|ΔF\| (meV/Å) | 0.0 | 61.65 | 969.2 |
| Diatomic Force Flips | Mean count | 2.09 | 2.91 | 10.73 |
| Diatomic Minima | Mean count | 1.42 | 1.62 | 4.82 |
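
The slab extensivity entry can be understood as follows: for an extensive model, repeating a slab laterally should scale the energy exactly. Below is a minimal sketch of such a check (our reading of the test; the benchmark's exact protocol may differ, and "MACE-MH-1.model" is a placeholder path):

from ase.build import fcc111
from mace.calculators import mace_mp

calc = mace_mp(model="MACE-MH-1.model", default_dtype="float64", device="cuda")

# Energy of a Cu(111) slab
slab = fcc111('Cu', size=(2, 2, 4), vacuum=10.0)
slab.calc = calc
e1 = slab.get_potential_energy()

# The same slab doubled laterally should have exactly twice the energy
double = slab.repeat((2, 1, 1))
double.calc = calc
e2 = double.get_potential_energy()

print(f"Extensivity error: {(e2 - 2 * e1) * 1000:.2f} meV")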

Training Data

Pre-training

  • OMAT-24: 100M configurations of inorganic crystals (PBE/PBE+U) spanning 89 elements

Multi-Head Fine-tuning

  • OMAT Replay: 10M configurations (10% subset) to prevent catastrophic forgetting
  • MPTraj: 1.5M configurations from Materials Project with PBE+U
  • SPICE-1: ~1M organic molecules (ωB97M-D3(BJ)/def2-TZVP)
  • OC20: 2M metal surface slabs and adsorbate complexes (PBE)
  • OMOL-1%: 1.2M diverse organic and organometallic configurations (ωB97M-VV10)
  • RGD1: ~300k organic reaction intermediates and transition states (B3LYP/6-31G*)
  • MATPES r²SCAN: ~400k inorganic crystals (r²SCAN)

Installation and Usage

Installation

pip install mace-torch

Basic Usage (Python)

from mace.calculators import mace_mp
from ase.build import molecule

# Load the MACE-MH-1 model (using the OMAT/PBE head);
# replace the path below with the location of the downloaded model file
calc = mace_mp(model="MACE-MH-1.model", default_dtype="float64", device="cuda", head="omat_pbe")

# Create an example structure (a water molecule)
atoms = molecule('H2O')
atoms.calc = calc

# Calculate energy and forces
energy = atoms.get_potential_energy()
forces = atoms.get_forces()

print(f"Energy: {energy} eV")
print(f"Forces:\n{forces}")

Available Model Heads

MACE-MH-1 contains multiple task-specific heads trained on different levels of theory:

| Head Name | Level of Theory | Best For |
|---|---|---|
| omat_pbe (default) | PBE/PBE+U | General materials; balanced performance across tasks |
| omol | ωB97M-VV10 | Molecular systems, organic and organometallic chemistry (trained on 1% of the OMOL data) |
| spice_wB97M | ωB97M-D3(BJ) | Molecular systems and organic chemistry |
| rgd1_b3lyp | B3LYP | Reaction chemistry |
| oc20_usemppbe | PBE | Surface catalysis and adsorbates |
| matpes_r2scan | r²SCAN meta-GGA | High-accuracy materials |

Each head is selected by passing its name through the head argument when loading the model.

By default, the OMAT head (PBE) is used, which provides the best cross-domain performance.
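
For example, switching to the OMOL head only requires a different head argument (a minimal sketch; "MACE-MH-1.model" is a placeholder for the path to the downloaded model file):

from mace.calculators import mace_mp
from ase.build import molecule

# Select the OMOL (ωB97M-VV10) head for a molecular system
calc_omol = mace_mp(model="MACE-MH-1.model", default_dtype="float64",
                    device="cuda", head="omol")

# Evaluate a methane molecule with the molecular head
atoms = molecule('CH4')
atoms.calc = calc_omol
print(f"Energy (omol head): {atoms.get_potential_energy()} eV")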

Best Practices

  1. For fine-tuning: start from the OMAT head, and test other heads if your target level of theory matches one of them.
  2. For materials: use the OMAT head. Add D3 dispersion corrections for systems where dispersion matters (see the sketch below), and test the matpes_r2scan head if r²SCAN is the more appropriate reference.
  3. For molecules: consider the OMOL head (ωB97M-VV10) for improved intramolecular interactions; the OMAT head also performs well for condensed-phase molecular systems and is worth testing.
  4. For surfaces: the OMAT head provides excellent performance; the OC20 head is available for specialized applications.
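
A minimal sketch of adding a D3 dispersion correction, assuming the built-in dispersion option of mace_mp (a D3(BJ) correction computed with the torch-dftd package) also applies to this model file:

from ase import Atoms
from mace.calculators import mace_mp

# MACE-MH-1 with a D3(BJ) dispersion correction added via torch-dftd
# (assumption: the mace_mp dispersion option works with this model file)
calc_d3 = mace_mp(model="MACE-MH-1.model", default_dtype="float64",
                  device="cuda", head="omat_pbe", dispersion=True)

# A dispersion-bound argon dimer near its equilibrium separation
dimer = Atoms('Ar2', positions=[[0.0, 0.0, 0.0], [0.0, 0.0, 3.8]])
dimer.calc = calc_d3
print(f"Energy with D3: {dimer.get_potential_energy()} eV")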

Citation

If you use MACE-MH-1 in your research, please cite:

@article{batatia2025crosslearning,
  title={Cross Learning between Electronic Structure Theories for Unifying Molecular, Surface, and Inorganic Crystal Foundation Force Fields},
  author={Batatia, Ilyes and Lin, Chen and Hart, Joseph and Kasoar, Elliott and Elena, Alin M. and Norwood, Sam Walton and Wolf, Thomas and Cs{\'a}nyi, G{\'a}bor},
  journal={arXiv preprint arXiv:2510.25380},
  year={2025}
}

@article{batatia2022mace,
  title={MACE: Higher order equivariant message passing neural networks for fast and accurate force fields},
  author={Batatia, Ilyes and Kovacs, David Peter and Simm, Gregor and Ortner, Christoph and Cs{\'a}nyi, G{\'a}bor},
  journal={Advances in Neural Information Processing Systems},
  volume={35},
  pages={11423--11436},
  year={2022}
}

License

This model is released under the ASL License.

Acknowledgments

This work was supported by computational resources from:

  • Jean Zay HPC (Grand Challenge GC010815458)
  • Isambard-AI and Sovereign AI
