Machine Learning in Science
@mackelab.bsky.social
We build probabilistic #MachineLearning and #AI tools for scientific discovery, especially in neuroscience. Probably not posted by @jakhmack.bsky.social.

📍 @ml4science.bsky.social, Tübingen, Germany
Guy Moss (@gmoss13.bsky.social) developed and applied simulation-based inference methods to solve inference problems in glaciology, in collaboration with @geophys-tuebingen.bsky.social. E.g., openreview.net/forum?id=yB5... 3/3
January 13, 2026 at 3:43 PM
Julius Vetter (@vetterj.bsky.social) worked on deep generative modeling and simulation-based Bayesian inference, with applications to (physiological) time series data. E.g., openreview.net/forum?id=kN0... 2/3
January 13, 2026 at 3:43 PM
Happy 2026 everyone! Two freshly minted PhDs 🧑‍🎓 emerged from our lab at the end of last year.
We congratulate Dr Julius Vetter (@vetterj.bsky.social) and Dr Guy Moss (@gmoss13.bsky.social)! Here seen celebrating with the lab 🎳. 1/3
January 13, 2026 at 3:43 PM
Fifth, we bring AutoML to SBI pipelines with a practical performance metric that does not require ground-truth posteriors, improving inference quality on the SBI benchmark! By @swagatam.bsky.social, @gmoss13.bsky.social, @keggensperger.bsky.social, @jakhmack.bsky.social 10/11
December 1, 2025 at 4:16 PM
Fourth, in collaboration with Ian C Tanoh and Scott Linderman, we used the Jaxley toolbox and extended Kalman filters to estimate the marginal log-likelihood of a biophysical neuron model. We showed that this enables identifying biophysical parameters given extracellular recordings. 8/11
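The filter-based likelihood idea can be sketched on the linear-Gaussian special case, where the extended Kalman filter reduces to an ordinary Kalman filter — a toy stand-in, not the Jaxley-based biophysical implementation from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D linear-Gaussian state-space model (the linear special case in which an
# extended Kalman filter reduces to an ordinary Kalman filter):
#   z_{t+1} = a * z_t + process noise,   x_t = z_t + observation noise.
a_true, q, r = 0.9, 0.1, 0.2

# Simulate a latent trajectory and noisy observations.
T = 200
z = np.zeros(T)
for t in range(1, T):
    z[t] = a_true * z[t - 1] + rng.normal(0.0, np.sqrt(q))
x = z + rng.normal(0.0, np.sqrt(r), size=T)

def marginal_log_likelihood(a):
    """Accumulate log p(x_t | x_{1:t-1}) with a Kalman filter."""
    mean, var, ll = 0.0, 1.0, 0.0
    for x_t in x:
        # Predictive distribution of the next observation.
        s = var + r
        ll += -0.5 * (np.log(2 * np.pi * s) + (x_t - mean) ** 2 / s)
        # Measurement update.
        gain = var / s
        mean, var = mean + gain * (x_t - mean), (1.0 - gain) * var
        # Time update.
        mean, var = a * mean, a**2 * var + q
    return ll

# The marginal likelihood identifies the dynamics parameter: it should be
# higher at the true value a = 0.9 than at a mismatched one.
ll_true, ll_wrong = marginal_log_likelihood(0.9), marginal_log_likelihood(0.5)
```

Comparing `marginal_log_likelihood` across candidate parameters is the same parameter-identification logic the post describes, applied here to a model with a closed-form filter.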
December 1, 2025 at 4:16 PM
Third, in collaboration with @kyrakadhim.bsky.social, @philipp.hertie.ai, and others, we built a task- and data-constrained biophysical network of the outer plexiform layer of the mouse retina. To optimize this model, we built it on top of our Jaxley toolbox for differentiable simulation. 6/11
December 1, 2025 at 4:16 PM
Second, come by to check out NPE-PF: We leverage the power of tabular foundation models for training-free and simulation-efficient SBI. SBI has never been so effortless! By @vetterj.bsky.social, Manuel Gloeckler, @danielged.bsky.social, @jakhmack.bsky.social 4/11
December 1, 2025 at 4:16 PM
First, we introduce FNOPE, a new simulation-based inference approach for efficiently and flexibly inferring function-valued parameters. By @gmoss13.bsky.social, @leahsmuhle.bsky.social, Reinhard Drews, @jakhmack.bsky.social and @coschroeder.bsky.social 2/11
December 1, 2025 at 4:16 PM
We provide three examples and code to demonstrate the complete workflow: gravitational wave parameter estimation (astrophysics), psychophysical model fitting (cognitive science), and ion channel inference (neuroscience).
November 21, 2025 at 3:08 PM
We present a structured workflow with practical guidelines for each step: simulator setup, prior specification, method selection, network training, and validation. We illustrate each stage with concrete implementation details and common pitfalls.
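As a minimal illustration of those stages — using a toy one-parameter Gaussian simulator and plain rejection sampling in place of the neural methods the workflow covers:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Simulator setup: a toy one-parameter Gaussian simulator.
def simulator(theta):
    return theta + rng.normal(0.0, 0.5, size=theta.shape)

# 2. Prior specification: uniform over a plausible parameter range.
def sample_prior(n):
    return rng.uniform(-2.0, 2.0, size=n)

# 3. Method selection + inference: here, simple rejection sampling --
#    keep parameters whose simulations land close to the observation.
x_o = 0.7
theta = sample_prior(100_000)
x = simulator(theta)
posterior_samples = theta[np.abs(x - x_o) < 0.05]

# 4. Validation: a posterior predictive check -- simulations from accepted
#    parameters should cluster around the observation.
x_pred = simulator(rng.choice(posterior_samples, size=1_000))
coverage = np.mean(np.abs(x_pred - x_o) < 1.0)
```

The same four stages carry over unchanged when the rejection step is replaced by a trained neural density estimator.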
November 21, 2025 at 3:08 PM
Congrats to Dr Michael Deistler @deismic.bsky.social, who defended his PhD!

Michael worked on "Machine Learning for Inference in Biophysical Neuroscience Simulations", focusing on simulation-based inference and differentiable simulation.

We wish him all the best for the next chapter! 👏🎓
October 2, 2025 at 11:28 AM
The Macke lab is well-represented at the @bernsteinneuro.bsky.social conference in Frankfurt this year! We have lots of exciting new work to present with 7 posters (details👇) 1/9
September 30, 2025 at 2:06 PM
But does it scale to complex real-world problems? We tested it on two challenging Hodgkin-Huxley-type models:
🧠 single-compartment neuron
🦀 31-parameter crab pyloric network

NPE-PF delivers tight posteriors & accurate predictions with far fewer simulations than previous methods.
July 23, 2025 at 2:28 PM
What you get with NPE-PF:
🚫 No need to train inference nets or tune hyperparameters.
🌟 Competitive or superior performance vs. standard SBI methods.
🚀 Especially strong performance for smaller simulation budgets.
🔄 Filtering to handle large datasets + support for sequential inference.
July 23, 2025 at 2:28 PM
The key idea:
TabPFN, originally trained for tabular regression and classification, can estimate posteriors by autoregressively modeling one parameter dimension after the other.
It’s remarkably effective, even though TabPFN was not designed for SBI.
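The autoregressive scheme can be illustrated with a toy stand-in: below, a crude k-nearest-neighbour sampler plays the role of the tabular model, drawing theta_1 given x, then theta_2 given (x, theta_1). This sketches only the factorization, not TabPFN itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator: the observation mixes both parameter dimensions.
def simulator(theta):
    mixing = np.array([[1.0, 0.5], [0.5, 1.0]])
    return theta @ mixing + rng.normal(0.0, 0.1, size=theta.shape)

theta = rng.uniform(-1.0, 1.0, size=(20_000, 2))
x = simulator(theta)

def knn_conditional_sample(features, target, query, k=50):
    """Draw target | features ≈ query from the k nearest training rows --
    a crude stand-in for one autoregressive step of a tabular model."""
    d = np.linalg.norm(features - query, axis=1)
    idx = np.argpartition(d, k)[:k]
    return target[rng.choice(idx)]

x_o = np.array([0.3, 0.3])

def sample_posterior():
    # Step 1: draw theta_1 ~ p(theta_1 | x_o).
    t1 = knn_conditional_sample(x, theta[:, 0], x_o)
    # Step 2: draw theta_2 ~ p(theta_2 | x_o, theta_1), conditioning on t1.
    t2 = knn_conditional_sample(
        np.column_stack([x, theta[:, 0]]), theta[:, 1], np.append(x_o, t1)
    )
    return np.array([t1, t2])

samples = np.stack([sample_posterior() for _ in range(200)])
```

For this mixing matrix the noiseless solution of x_o = (0.3, 0.3) is theta = (0.2, 0.2), so the samples should concentrate there.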
July 23, 2025 at 2:28 PM
We obtain posterior distributions over ice accumulation and melting rates for Ekström Ice Shelf over the past several hundred years. This allows us to make quantitative statements about the history of the atmospheric and oceanic conditions.
June 11, 2025 at 11:47 AM
Thanks to great data collection efforts from @geophys-tuebingen.bsky.social and @awi.de, we can apply this approach to Ekström Ice Shelf, Antarctica.
June 11, 2025 at 11:47 AM
We develop a simulation-based inference workflow for inferring the accumulation and melting rates from measurements of the internal layers.
June 11, 2025 at 11:47 AM
Radar measurements have long been used to map the internal layer structure of Antarctic ice shelves. This structure carries information about the history of the ice shelf, including the past rate of snow accumulation at the surface and of ice melting at the base.
June 11, 2025 at 11:47 AM
In FNSE, we only have to solve a smaller, easier inverse problem, so it scales relatively easily to high-dimensional simulators.

We validate this on a high-dimensional Kolmogorov flow simulator with around one million data dimensions.
April 25, 2025 at 8:53 AM
We apply this approach to various SBI methods (e.g. FNLE/FNRE), focusing on FNSE.

Compared to NPE with embedding nets, it’s more simulation-efficient and accurate across time series of varying lengths.
April 25, 2025 at 8:53 AM
We propose an SBI approach that can exploit Markovian simulators by locally identifying parameters consistent with individual state transitions.

We then compose these local results to obtain a posterior over parameters that align with the entire time series observation.
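The composition step can be illustrated on a Gaussian toy model, where each local posterior is available in closed form and can be multiplied in natural parameters — a hand-worked sketch, not the neural estimators from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Markovian toy simulator: a random walk with drift, x_{t+1} = x_t + theta + noise.
sigma, tau = 0.5, 1.0          # transition noise std, prior std
theta_true = 0.8
T = 50
steps = theta_true + rng.normal(0.0, sigma, size=T)
x = np.concatenate([[0.0], np.cumsum(steps)])

# Each local posterior p(theta | x_t, x_{t+1}) is Gaussian here; represent it
# by its natural parameters (precision, precision * mean).
def local_posterior(x_t, x_next):
    prec = 1.0 / tau**2 + 1.0 / sigma**2
    mean = ((x_next - x_t) / sigma**2) / prec
    return prec, prec * mean

# Compose: multiply the T local posteriors, then divide out the prior T-1
# times so it is counted exactly once in the global posterior.
prec_total, eta_total = 0.0, 0.0
for t in range(T):
    p, e = local_posterior(x[t], x[t + 1])
    prec_total += p
    eta_total += e
prec_total -= (T - 1) / tau**2

posterior_mean = eta_total / prec_total
posterior_std = prec_total ** -0.5
```

In this conjugate case the composed precision equals the exact global posterior precision, 1/tau² + T/sigma², and the composed mean recovers the drift parameter from the full trajectory.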
April 25, 2025 at 8:53 AM
Does any of this sound interesting to you? You might be excited to know that we are hiring PhD students and Postdocs!

Go to www.mackelab.org/jobs/ for details on more projects and how to contact us, or just find one of us (Auguste, Matthijs, Richard, Zina) at the conference—we're happy to chat!
March 27, 2025 at 2:03 PM
Finally, @rdgao.bsky.social is in another Tuesday workshop: “What biological details matter at mesoscopic scales?”

He will tell you about the good, the bad, and the ugly of getting more than what you had asked for using mechanistic models and probabilistic machine learning.

17:00, Corriveau/Sateux
March 27, 2025 at 2:03 PM
We're also at the workshops!

Tuesday at 11:20, @auschulz.bsky.social will give a talk about deep generative models (VAEs and DDPMs) for linking neural activity and behavior, at the workshop on

"Building a foundation model for the brain" (Soutana 1).
March 27, 2025 at 2:03 PM