COSMO Lab
@labcosmo.bsky.social
Computational Science and Modelling of materials and molecules at the atomic-scale, with machine learning.
If you're scared by the 700M parameters (you shouldn't be), there's a whole set of models from 🐁 to 🦣. You can find them all at github.com/lab-cosmo/upet!
January 23, 2026 at 7:02 AM
If the PET-OAM results from a week ago made you curious, you can learn more by reading arxiv.org/abs/2601.16195, including some general considerations on how to safely train and use an unconstrained ML potential.
January 23, 2026 at 7:02 AM
Not going to make a big deal out of a benchmark table, but PET just got the top spot on matbench-discovery.materialsproject.org. And don't be fooled by the huge parameter count: it's faster and can handle larger structures than eSEN-30M 🚀. Kudos to 🧑‍🚀 Filippo, Arslan and Paolo!
January 14, 2026 at 6:32 AM
📢 chemiscope.org 1.0.0rc1 just dropped on pypi! We are making (a few) breaking changes to the interfaces, fixing a ton of bugs and introducing some exciting features (you can finally load datasets with > 100k points!). We'd be grateful if you test, break and report 🐛 github.com/lab-cosmo/ch...
January 5, 2026 at 2:42 PM
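A minimal way to kick the tires on the release candidate (a sketch, not official install instructions; see chemiscope.org for the authoritative docs): install the pinned pre-release and open a structure file in the viewer.

```python
# sketch: install the pre-release announced above, then load a dataset
#   pip install chemiscope==1.0.0rc1
import ase.io
import chemiscope

frames = ase.io.read("my_dataset.xyz", ":")  # any ASE-readable file

# in a Jupyter notebook this opens the interactive viewer;
# chemiscope.write_input() can instead produce a file to load on chemiscope.org
chemiscope.show(frames)
```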
Hope y'all are getting a great start to 2026. Here we're taking some time to add the 2025 winter card to the archives www.epfl.ch/labs/cosmo/i... 🎅=🧑‍🚀
January 3, 2026 at 9:17 AM
📢 New chemiscope.org release just landed! To make it even easier to integrate ⚗️🔭 into your workflow, we added a @streamlit.bsky.social component, so you can run analyses and show your atomistic data in a web app by writing just a few lines of python! Try it, break it, report it!
December 17, 2025 at 9:21 PM
Congrats to 🧑‍🚀 Sergey Pozdnyakov who received a distinction (best 8% of theses at @materials-epfl.bsky.social) for his PhD thesis "Advancing understanding and practical performance of machine learning interatomic potentials". Let's go 🚀! infoscience.epfl.ch/entities/pub...
December 10, 2025 at 12:50 PM
No day goes by without a new universal #ML potential. But how different are they, really? Sanggyu and Sofiia tried to give a quantitative answer by comparing the reconstruction errors between their latent-space features. If you are curious, check out the #preprint arxiv.org/html/2512.05...
December 9, 2025 at 7:16 AM
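A minimal sketch of the kind of comparison the post describes (not the authors' code): fit a linear map from one model's latent features to another's and report the relative reconstruction error.

```python
import numpy as np

def reconstruction_error(X, Y, reg=1e-8):
    """Relative error when linearly reconstructing features Y from features X."""
    X = X - X.mean(axis=0)  # center both feature sets
    Y = Y - Y.mean(axis=0)
    # ridge-regression map W = (X^T X + reg*I)^-1 X^T Y
    W = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ Y)
    return np.linalg.norm(Y - X @ W) / np.linalg.norm(Y)

# stand-ins for per-structure latent features from two different potentials
features_A = np.random.randn(1000, 128)
features_B = np.random.randn(1000, 64)
print(reconstruction_error(features_A, features_B))  # 0 = perfectly reconstructable
```

Note that the map is direction-dependent, so the A→B and B→A errors generally differ.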
However, this seems to damage the transferability of highly-preconditioned models such as MACE - less so for more expressive unconstrained models such as PET. Does this match your experience?
September 23, 2025 at 7:26 AM
This doesn't matter much, as most of the fragments that make up the body-order decomposition are deranged soups of highly correlated electrons. Models with sufficient expressive power *can* learn it if presented with the fragments ...
September 23, 2025 at 7:26 AM
TL;DR: not really. ML potentials learn whatever they want, as long as it gives them good accuracy on the train set. We note in particular that MACE is strongly preconditioned to learn a fast-decaying body-order expansion, whether or not the true expansion actually decays fast.
September 23, 2025 at 7:26 AM
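For reference, the body-order expansion this thread refers to is, schematically, the decomposition of the total energy into one-, two-, three-body (and higher) terms:

```latex
E(A) \;=\; \sum_{i} E^{(1)}(A_i)
      \;+\; \sum_{i<j} E^{(2)}(A_i, A_j)
      \;+\; \sum_{i<j<k} E^{(3)}(A_i, A_j, A_k) \;+\; \dots
```

The question in the thread is whether a potential trained on total energies recovers these terms individually, or just something that sums to the right total.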
Anticipating 🧑‍🚀 Wei Bin's talk at #psik2025 (noon@roomA), 📢 a new #preprint using PET and the MAD dataset to train a universal #ML model for the density of states, giving band gaps for solids, clusters, surfaces and molecules with MAE ~200 meV. Go to the talk, or check out arxiv.org/html/2508.17...!
August 28, 2025 at 7:19 AM
The reconstructed surface contains different sites with different reactivity. Despite the higher stability, for some sites the disordered surface is *more* reactive with water, one of the main contaminants affecting the stability of LPS batteries. Useful to design better stabilization strategies!
August 27, 2025 at 6:54 AM
Reconstructed surfaces become lower in energy, and the surface energy becomes less orientation-dependent - so the Wulff shape of particles becomes more spherical.
August 27, 2025 at 6:54 AM
📢 Now out on @physrevx.bsky.social Energy, journals.aps.org/prxenergy/ab... from 🧑‍🚀 @dtisi.bsky.social and Hanna Türk: our #PET-powered study of the dynamic reconstruction of LPS surfaces, and how it affects their structure, stability and reactivity.
August 27, 2025 at 6:54 AM
TL;DR - this is a cross-platform, model-agnostic library to handle atomistic data (including geometry and property derivatives such as forces and stresses) that lets you package your model into a portable TorchScript file.
August 22, 2025 at 7:40 AM
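A minimal sketch of the core data structure, following the Python API documented on metatensor.org (check the docs for the authoritative signatures): sparse blocks of values, each carrying its own metadata for samples and properties.

```python
import numpy as np
from metatensor import Labels, TensorBlock, TensorMap

# one block: e.g. a per-atom quantity for two atoms of the same species,
# with a single property column
block = TensorBlock(
    values=np.array([[1.0], [2.0]]),
    samples=Labels(["system", "atom"], np.array([[0, 0], [0, 1]], dtype=np.int32)),
    components=[],
    properties=Labels(["energy"], np.array([[0]], dtype=np.int32)),
)

# a TensorMap collects blocks under sparse keys (here, the chemical species)
tensor = TensorMap(
    keys=Labels(["center_type"], np.array([[1]], dtype=np.int32)),
    blocks=[block],
)
print(tensor)
```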
🚨 #machinelearning for #compchem goodies from our 🧑‍🚀 team incoming! After years of work it's time to share. Go check arxiv.org/abs/2508.15704 and/or metatensor.org to learn about #metatensor and #metatomic. What they are, what they do, why you should use them for all of your atomistic ML projects 🔍.
August 22, 2025 at 7:40 AM
We can get long-stride geometry-conserving integration by learning the Hamilton-Jacobi action. This fixes the instability of direct MD prediction for good rather than just patching it up, although it's not as fast. And it also works for serious simulations, like glassy relaxation in deeply supercooled GeTe!
August 8, 2025 at 5:46 AM
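For context (textbook classical mechanics, not necessarily the exact formulation used in the preprint): Hamilton's principal function S(q_0, q_1; Δt), the action along the classical path from q_0 to q_1 over a time Δt, is a generating function of the time-Δt map, so a propagator obtained by differentiating a learned S is symplectic by construction:

```latex
p_0 = -\frac{\partial S(q_0, q_1; \Delta t)}{\partial q_0},
\qquad
p_1 = +\frac{\partial S(q_0, q_1; \Delta t)}{\partial q_1}
```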
If you are excited about 30x longer time steps in molecular dynamics using FlashMD, but are worried about it not being symplectic, Filippo has something new cooking that should make you even more excited. Head to the #arxiv for a preview arxiv.org/html/2508.01...
August 8, 2025 at 5:46 AM
Thanks to the 🧑‍🚀🧑‍🚀🧑‍🚀 who put this together, Sofiia in particular, and thanks to the #metatrain team as this would not be so easy without their work metatensor.github.io/metatrain/la...
July 24, 2025 at 1:38 AM
Two new recipes landed in the #atomistic-cookbook 🧑‍🍳📖. One explains how to fine-tune the #PET-MAD universal model on a system-specific dataset, the other how to train a model with conservative fine-tuning. Check them out at atomistic-cookbook.org/examples/pet... and atomistic-cookbook.org/examples/pet...
July 24, 2025 at 1:38 AM
Basically, you just need to define a `torch.nn.Module` with a specific API, and then you can define anything you like as a CV calculator. Export it as a .pt TorchScript model, and it's just one METATOMIC action away from being read in #plumed
July 7, 2025 at 8:21 PM
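A minimal sketch of the idea (the actual metatomic model interface has specific inputs, outputs and metadata - see the recipe and metatensor.org for the real API): write the CV as a scriptable torch.nn.Module, then save it as a .pt file that the PLUMED METATOMIC action can load.

```python
import torch

class RadiusOfGyration(torch.nn.Module):
    """Toy collective variable: radius of gyration of a set of positions."""

    def forward(self, positions: torch.Tensor) -> torch.Tensor:
        center = positions.mean(dim=0)
        return torch.sqrt(((positions - center) ** 2).sum(dim=1).mean())

# compile to TorchScript and save; the resulting .pt file is what you would
# point the METATOMIC action at (after wrapping it with the metatomic metadata)
module = torch.jit.script(RadiusOfGyration())
module.save("my_cv.pt")
```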
New 🧑‍🍳📖 #recipe landed, doubling up as a @plumed.org tutorial 🐦 atomistic-cookbook.org/examples/met..., and explaining how to use the #metatomic interface in #plumed to define custom collective variables with all the flexibility and speed of torch.
July 7, 2025 at 8:21 PM
As a nice side-effect, we distribute (well, PR still underway 😆) a featurizer based on PET-MAD latent features that you can use together with `chemiscope.explore` as a universal materials cartography tool - it even works out of the box to follow the melting of Al!
June 26, 2025 at 11:41 AM
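A hedged sketch of that workflow: the featurizer below is a stand-in (the actual PET-MAD featurizer was still a PR at the time of the post), and the exact signature of chemiscope.explore's featurize callback should be checked against the chemiscope docs.

```python
import ase.io
import chemiscope
import numpy as np

frames = ase.io.read("trajectory.xyz", ":")  # e.g. a melting-Al trajectory

def petmad_features(frames, environments=None):
    # stand-in: return an (n_structures, n_features) array of PET-MAD
    # latent features, however you extract them for your structures
    return np.random.randn(len(frames), 64)

# build an interactive map of the dataset using the custom featurizer
chemiscope.explore(frames, featurize=petmad_features)
```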