Côme Cattin
@comecattin.bsky.social
Theoretical Chemistry PhD. Student at Sorbonne Université
🚀First paper published!
We introduce DMTS, a multi-time-step method for ML force fields
✔️×4 speed-up
✔️Accuracy preserved
✔️Generalizable to any ML potential
📄Link: pubs.acs.org/doi/full/10....
The preprint: arxiv.org/abs/2510.06562
@jppiquem.bsky.social
#MolecularDynamics #MachineLearning
January 28, 2026 at 5:59 PM
Reposted by Côme Cattin
#compchem #machinelearning
1st of the year in J. Phys. Chem. Lett.: "Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models using Multiple Time-Step and Distillation". pubs.acs.org/doi/full/10....
(see also the updated preprint: arxiv.org/abs/2510.06562)
Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models Using Multiple Time Steps and Distillation
We present a distilled multi-time-step (DMTS) strategy to accelerate molecular dynamics simulations using foundation neural network models. DMTS uses a dual-level neural network, where the target accurate potential is coupled to a simpler but faster model obtained via a distillation process. The 3.5 Å cutoff distilled model is sufficient to capture the fast-varying forces, i.e., mainly bonded interactions, from the accurate potential, allowing its use in a reversible reference system propagator algorithm (RESPA)-like formalism. The approach conserves accuracy, preserving both static and dynamic properties, while requiring the costly model to be evaluated only every 3 to 6 fs depending on the system. Consequently, large simulation speedups over standard 1 fs integration are observed: nearly 4-fold in homogeneous systems and 3-fold in large solvated proteins, where active learning is leveraged for enhanced stability. The strategy is applicable to any neural network potential and reduces the performance gap with classical force fields.
January 21, 2026 at 12:06 PM
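The RESPA-like splitting described in the abstract can be illustrated with a minimal sketch: a cheap "fast" force (standing in for the distilled model) is integrated with a small inner time step, while an expensive "slow" correction (accurate model minus distilled model) kicks only at the larger outer step. This is not the authors' DMTS implementation; the force functions, step sizes, and the `respa_step` helper are illustrative placeholders.

```python
def respa_step(x, v, m, f_fast, f_slow_corr, dt_outer, n_inner):
    """One outer step of a RESPA-style multi-time-step integrator.

    f_fast      : cheap force evaluated every inner step
                  (plays the role of the distilled model)
    f_slow_corr : expensive correction force evaluated once per
                  outer step (accurate model minus distilled model)
    """
    dt_inner = dt_outer / n_inner
    # Outer half-kick with the slow correction force.
    v = v + 0.5 * dt_outer * f_slow_corr(x) / m
    # Inner velocity-Verlet loop driven by the fast force only.
    for _ in range(n_inner):
        v = v + 0.5 * dt_inner * f_fast(x) / m
        x = x + dt_inner * v
        v = v + 0.5 * dt_inner * f_fast(x) / m
    # Closing outer half-kick.
    v = v + 0.5 * dt_outer * f_slow_corr(x) / m
    return x, v


if __name__ == "__main__":
    # Toy 1D example: a stiff harmonic "bonded" force plus a soft
    # harmonic correction, mimicking the fast/slow force split.
    m, k_fast, k_slow = 1.0, 100.0, 1.0
    f_fast = lambda x: -k_fast * x
    f_slow = lambda x: -k_slow * x
    x, v = 1.0, 0.0
    for _ in range(100):
        x, v = respa_step(x, v, m, f_fast, f_slow,
                          dt_outer=0.06, n_inner=6)
    energy = 0.5 * m * v * v + 0.5 * (k_fast + k_slow) * x * x
    print(f"total energy after 100 outer steps: {energy:.3f}")
```

Because the scheme is time-reversible and symplectic, the total energy of the toy system stays close to its initial value even though the slow force is evaluated six times less often than the fast one, which is the source of the speedup the paper reports.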