Gilles Louppe
@glouppe.bsky.social

AI for Science, deep generative models, inverse problems. Professor of AI and deep learning @universitedeliege.bsky.social. Previously @CERN, @nyuniversity. https://glouppe.github.io

Pinned
<proud advisor>
Hot off the arXiv! 🦬 "Appa: Bending Weather Dynamics with Latent Diffusion Models for Global Data Assimilation" 🌍 Appa is our novel 1.5B-parameter probabilistic weather model that unifies reanalysis, filtering, and forecasting in a single framework. A thread 🧵

Reposted by Gilles Louppe

Lots of interesting LLM releases last week. My fav was actually Olmo 3 (I love the Olmo series due to their full open-sourceness and transparency).
If you are interested in reading through the architecture details, I coded it from scratch here: github.com/rasbt/LLMs-f...

For some puzzling reason, my (existing) Scholar profile cannot be found using Scholar itself; it's been like this for years 🙃 (search engines like Bing do find it, however)

... and here I thought the new Scholar Labs would finally be able to search and find my Scholar profile 😥
Man, everything is so bleak, anyone got a fun fact or little bit of trivia they want to share

Reposted by Gilles Louppe

What happens when you combine 10 years of brain data with one of the world’s fastest supercomputers?

A virtual mouse cortex simulation, thanks to a global collaboration.

🧠📈 https://alleninstitute.org/news/one-of-worlds-most-detailed-virtual-brain-simulations-is-changing-how-we-study-the-brain/

Reposted by Gilles Louppe

GDM WeatherNext 2

8x faster than v1, it can model extreme situations and game out scenarios in one minute flat on a single TPU (as opposed to hours of supercomputer time for traditional algorithms)

will be available in all of Google’s weather apps

blog.google/technology/g...
WeatherNext 2: Our most advanced weather forecasting model
The new AI model delivers more efficient, more accurate and higher-resolution global weather predictions.
blog.google
I am super happy to share that our project on training biophysical models with Jaxley is now published in Nature Methods: www.nature.com/articles/s41...
Jaxley: differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics - Nature Methods
Jaxley is a versatile platform for biophysical modeling in neuroscience. It allows efficiently simulating large-scale biophysical models on CPUs, GPUs and TPUs. Model parameters can be optimized with ...
www.nature.com
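
Not Jaxley's actual API, but for intuition, here is a minimal plain-JAX sketch of the core idea the paper builds on: simulate a toy membrane model, compare it to data, and let autodiff hand you gradients of the fit with respect to the biophysical parameters.

```python
# Illustrative only: the generic idea behind differentiable simulation, in plain JAX.
# This is NOT Jaxley's API, just the underlying principle.
import jax
import jax.numpy as jnp

def simulate(params, v0=-65.0, dt=0.1, steps=200):
    """Toy leaky membrane: dV/dt = -g_leak * (V - E_leak) + I, with I = 1 and C = 1."""
    g_leak, e_leak = params
    def step(v, _):
        v = v + dt * (-g_leak * (v - e_leak) + 1.0)
        return v, v
    _, trace = jax.lax.scan(step, v0, None, length=steps)
    return trace

# Pretend these are recorded voltages that the model should reproduce.
target = simulate(jnp.array([0.3, -70.0]))

def loss(params):
    return jnp.mean((simulate(params) - target) ** 2)

# Because the simulator is differentiable, gradients of the data-fit loss with
# respect to the biophysical parameters come for free.
params = jnp.array([0.1, -60.0])
print(loss(params), jax.grad(loss)(params))
```

Those gradients can then be fed to any gradient-based optimizer, which is the idea the paper scales up to large, detailed biophysical models.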

Yes, that was indeed me! I did think you looked familiar :-) Looking forward to crossing paths again!

Reposted by Gilles Louppe

It's remarkable how AI-generated images have gone from impressive and visually attractive to repulsive AI slop in just a few years.

Reposted by Gilles Louppe

I am incredibly honored to have received the inaugural AI in Science Research Excellence Prize from the Margot and Tom Pritzker Foundation
dsi.wisc.edu/2025/11/10/d...

Once again an interdisciplinary research project gets poor, this time even condescending, evaluations because reviewers make little effort to understand the maths. Science is not only about killing rat models and seeing whether some random drug worked. Tired of this game.

Reposted by Gilles Louppe

"stole Rosalind Franklin's work" has become the new orthodoxy. While she was certainly the victim of sexism from Watson, I think her colleague Wilkins was the real villain. Events 1951-53 well covered in Nature in 2023 www.nature.com/articles/d41...
What Rosalind Franklin truly contributed to the discovery of DNA’s structure
Franklin was no victim in how the DNA double helix was solved. An overlooked letter and an unpublished news article, both written in 1953, reveal that she was an equal player.
www.nature.com

Reposted by Gilles Louppe

New paper, with @rkhashmani.me @marielpettee.bsky.social @garrettmerz.bsky.social Hellen Qu. We introduce a framework for generating realistic, highly multimodal datasets with explicitly calculable mutual information. This is helpful for studying self-supervised learning.
arxiv.org/abs/2510.21686
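
To illustrate what "explicitly calculable mutual information" buys you (this is just the textbook bivariate-Gaussian case, not the paper's multimodal construction): with a closed-form ground truth, any estimator can be checked directly.

```python
# Not the paper's construction: the textbook case of a dataset whose mutual
# information is known in closed form (two correlated Gaussians), compared
# against a crude histogram-based estimate.
import numpy as np

rho = 0.8
# Analytic MI of a bivariate Gaussian with correlation rho (in nats).
mi_true = -0.5 * np.log(1 - rho**2)

rng = np.random.default_rng(0)
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

# Plug-in estimate from a 2D histogram (biased, for illustration only).
pxy, xe, ye = np.histogram2d(x, y, bins=60, density=True)
dx, dy = np.diff(xe)[0], np.diff(ye)[0]
pxy *= dx * dy                          # convert densities to cell probabilities
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
mask = pxy > 0
mi_est = np.sum(pxy[mask] * np.log(pxy[mask] / (px[:, None] * py[None, :])[mask]))

print(f"analytic MI = {mi_true:.3f} nats, histogram estimate = {mi_est:.3f} nats")
```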

Reposted by Gilles Louppe

"The Principles of Diffusion Models" by Chieh-Hsin Lai, Yang Song, Dongjun Kim, Yuki Mitsufuji, Stefano Ermon. arxiv.org/abs/2510.21890
It might not be the easiest intro to diffusion models, but this monograph is an amazing deep dive into the math behind them and all their nuances
The Principles of Diffusion Models
This monograph presents the core principles that have guided the development of diffusion models, tracing their origins and showing how diverse formulations arise from shared mathematical ideas. Diffu...
arxiv.org

It is only useful when the training data is noisy or incomplete. See e.g. arxiv.org/abs/2405.13712, where we train diffusion models from sparse images only.
Learning Diffusion Priors from Observations by Expectation Maximization
Diffusion models recently proved to be remarkable priors for Bayesian inverse problems. However, training these models typically requires access to large amounts of clean data, which could prove diffi...
arxiv.org

EM algorithm: 1977 vintage, 2025 relevant. New lecture notes on a classic that refuses to age. From fitting a GMM on the Old Faithful data to training modern diffusion models in incomplete data settings, the same simple math applies. 👉 glouppe.github.io/dats0001-fou...
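
For the flavor of it, here is a minimal EM loop for a two-component 1D Gaussian mixture on synthetic bimodal data (a stand-in for the Old Faithful example; not the lecture notes' code):

```python
# Minimal EM for a two-component 1D Gaussian mixture (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
# Toy bimodal data standing in for the Old Faithful eruption durations.
x = np.concatenate([rng.normal(2.0, 0.3, 100), rng.normal(4.5, 0.4, 150)])

# Initial guesses for the mixture weights, means, and standard deviations.
pi = np.array([0.5, 0.5])
mu = np.array([1.0, 5.0])
sigma = np.array([1.0, 1.0])

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    r = pi * gaussian(x[:, None], mu, sigma)      # shape (n, 2)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate the parameters from the soft assignments.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(pi, mu, sigma)
```

The E-step computes soft assignments under the current parameters, and the M-step re-estimates the parameters from those assignments; the same two-step structure carries over to the incomplete-data diffusion setting above.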

Reposted by Gilles Louppe

Fisher meets Feynman! 🤝

We use score matching and a trick from quantum field theory to make a product-of-experts family both expressive and efficient for variational inference.

To appear as a spotlight @ NeurIPS 2025.
#NeurIPS2025 (link below)
What if we did a single run and declared victory
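
For readers who haven't met score matching before (it is the workhorse in the product-of-experts post above), here is a generic denoising score matching sketch on toy 1D Gaussian data; it is not the paper's method, just the basic objective.

```python
# Background only (not the paper's method): denoising score matching on toy 1D data.
# For Gaussian data the true score is linear, so a linear model s(x) = a*x + b suffices.
import numpy as np

rng = np.random.default_rng(0)
mu, s, sigma = 3.0, 1.0, 0.5           # data mean/std and the noise level
data = rng.normal(mu, s, size=100_000)

a, b, lr = 0.0, 0.0, 1e-2
for _ in range(20_000):
    x = data[rng.integers(0, len(data), 256)]
    eps = rng.normal(size=x.shape)
    xt = x + sigma * eps
    target = -eps / sigma              # score of the Gaussian noising kernel
    err = (a * xt + b) - target        # gradient of the squared DSM loss w.r.t. the prediction
    a -= lr * np.mean(err * xt)
    b -= lr * np.mean(err)

# The learned score should approximate -(x - mu) / (s**2 + sigma**2).
print(a, b, -1 / (s**2 + sigma**2), mu / (s**2 + sigma**2))
```

The key identity is that regressing onto -eps/sigma for Gaussian-noised samples recovers the score (the gradient of log q_sigma) of the noised data distribution.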

Reposted by Gilles Louppe

Excited to share SamudrACE, the first 3D AI ocean–atm–sea-ice #climate emulator! 🚀 Simulates 800 years in 1 day on 1 GPU, ~100× faster than traditional models, straight from your laptop 👩‍💻 Collaboration with @ai2.bsky.social and GFDL, advancing #AIforScience with #DeepLearning.
tinyurl.com/Samudrace
SamudrACE: A fast, accurate, efficient 3D coupled climate AI emulator
A fast digital twin of a state-of-the-art coupled climate model, simulating 800 years in 1 day with 1 GPU. SamudrACE combines two leading…
medium.com

Reposted by Gilles Louppe

🕳️🐇 Into the Rabbit Hull – Part I (Part II tomorrow)

An interpretability deep dive into DINOv2, one of vision’s most important foundation models.

And today is Part I, buckle up, we're exploring some of its most charming features. :)
Thrilled to have two years of work out, in a pair of papers led by @gradientrider.bsky.social and @maxecharles.bsky.social.

We've built a data-driven calibration of the James Webb Interferometer to near its fundamental limits for high-res imaging - explainer at @aunz.theconversation.com!
How we sharpened the James Webb telescope’s vision from a million kilometres away
The only Australian hardware on board the legendary telescope is starting to fulfil its duties.
theconversation.com

Reposted by Gilles Louppe

In this interview with JASRAC (the Japanese equivalent of SACEM), Nobuo Uematsu gave his opinion on AI-generated music. In his own way, he passes along this new adage: if no one went to the trouble of writing it, I won't go to the trouble of listening to it.

📃 www.jasrac.or.jp/magazine/int...

Congratulations, Gaël, on this richly deserved recognition! You say that science is a team sport, and that's true, but a team needs a leader who inspires it. I'm sure the scikit-learn team, past and present, is proud of you and of what we have built together. Champagne!

Reposted by Gilles Louppe

🚀 After more than a year of work — and many great discussions with curious minds & domain experts — we’re excited to announce the public release of 𝐀𝐩𝐩𝐚, our latent diffusion model for global data assimilation!

Check the repo and the complete wiki!
github.com/montefiore-s...
GitHub - montefiore-sail/appa: Code for the publication "Appa: Bending Weather Dynamics with Latent Diffusion Models for Global Data Assimilation".
Code for the publication "Appa: Bending Weather Dynamics with Latent Diffusion Models for Global Data Assimilation". - montefiore-sail/appa
github.com

Reposted by Gilles Louppe

Had never read Jane Goodall's original 1963 article in Nat Geo on the wild chimpanzees in Tanzania until now. It's a wonderful blend of science and journalism, and well worth your time.

www.nationalgeographic.com/pdf/jane-goo...

Reposted by Gilles Louppe

Yeah, I think that "AI is the new petrol" is about right...

Reposted by Gilles Louppe

Rudolf Kalman put it nicely (and provocatively): link.springer.com/chapter/10.1...

How do you type all these long, long hyphens? — I keep seeing them everywhere. I've learned the UTF-8 code by now, but there must be something easier 🤔

Reposted by Gilles Louppe

🔭 It's paper day! Today I'm sharing the latest in a series of papers looking at the weather on other worlds, in this case bringing you the weather report from a nearby T-dwarf, SIMP-0136. 🪐

🧵 to follow...

Reposted by Gilles Louppe

We’re excited to introduce ShinkaEvolve: An open-source framework that evolves programs for scientific discovery with unprecedented sample-efficiency. It leverages LLMs to find state-of-the-art solutions, orders of magnitude faster!

Blog: sakana.ai/shinka-evolve/
Paper: arxiv.org/abs/2509.19349