Guy Moss
@gmoss13.bsky.social
180 followers 110 following 9 posts
PhD student at @mackelab.bsky.social - machine learning & geoscience.
Reposted by Guy Moss
mackelab.bsky.social
Congrats to Dr Michael Deistler @deismic.bsky.social, who defended his PhD!

Michael worked on "Machine Learning for Inference in Biophysical Neuroscience Simulations", focusing on simulation-based inference and differentiable simulation.

We wish him all the best for the next chapter! 👏🎓
Reposted by Guy Moss
mackelab.bsky.social
The Macke lab is well-represented at the @bernsteinneuro.bsky.social conference in Frankfurt this year! We have lots of exciting new work to present with 7 posters (details👇) 1/9
Reposted by Guy Moss
rdgao.bsky.social
I've been waiting some years to make this joke and now it’s real:

I conned somebody into giving me a faculty job!

I’m starting as a W1 Tenure-Track Professor at Goethe University Frankfurt in a week (lol), in the Faculty of CS and Math

and I'm recruiting PhD students 🤗
a man wearing a white shirt and tie smiles in front of a window
Reposted by Guy Moss
sbi-devs.bsky.social
From hackathon to release: sbi v0.25 is here! 🎉

What happens when dozens of SBI researchers and practitioners collaborate for a week? New inference methods, new documentation, lots of new embedding networks, a bridge to pyro and a bridge between flow matching and score-based methods 🤯

1/7 🧵
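(For readers new to simulation-based inference: the core idea the sbi toolbox automates can be sketched in a few lines of plain numpy using rejection ABC, the simplest SBI scheme. The toy simulator, tolerance, and variable names below are illustrative only, not the sbi API.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator: the observation is the parameter plus Gaussian noise.
def simulator(theta):
    return theta + rng.normal(0.0, 0.5, size=theta.shape)

x_obs = 1.0                                    # "observed" data point
theta = rng.uniform(-2.0, 2.0, size=200_000)   # samples from a uniform prior
x = simulator(theta)                           # one simulation per prior sample

# Rejection ABC: keep parameters whose simulation lands near the observation.
# The kept samples approximate the posterior p(theta | x_obs).
posterior_samples = theta[np.abs(x - x_obs) < 0.1]
posterior_mean = posterior_samples.mean()      # close to 1.0 for this toy setup
```

Neural SBI methods (like those in the toolbox) replace the wasteful accept/reject step with a learned density estimator, but the simulate-then-infer loop is the same.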
Reposted by Guy Moss
mackelab.bsky.social
New preprint: SBI with foundation models!
Tired of training or tuning your inference network, or waiting for your simulations to finish? Our method NPE-PF can help: It provides training-free simulation-based inference, achieving competitive performance with orders of magnitude fewer simulations! ⚡️
Reposted by Guy Moss
jakhmack.bsky.social
I have been genuinely amazed how well tabpfn works as a density estimator, and how helpful this is for SBI ... Great work by @vetterj.bsky.social, Manuel and @danielged.bsky.social!!
mackelab.bsky.social
New preprint: SBI with foundation models!
Reposted by Guy Moss
danielged.bsky.social
My first paper on simulation-based inference (SBI) as part of @mackelab.bsky.social!

Exciting work on adapting state-of-the-art foundation models for posterior estimation. Almost plug-and-play, and surprisingly effective.

Paper/code in thread below 🧵
mackelab.bsky.social
New preprint: SBI with foundation models!
gmoss13.bsky.social
Have I been to Antarctica? No. But my colleagues have, and we can learn a lot from the data they collected! Really happy to share that our work is now published!
Reposted by Guy Moss
sbi-devs.bsky.social
More great news from the SBI community! 🎉
Two projects have been accepted for Google Summer of Code under the NumFOCUS umbrella, bringing new methods and general improvements to sbi. Big thanks to @numfocus.bsky.social, GSoC and our future contributors!
Reposted by Guy Moss
sbi-devs.bsky.social
Great news! Our March SBI hackathon in Tübingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. 🥁 🎉
A wide shot of approximately 30 individuals standing in a line, posing for a group photograph outdoors. The background shows a clear blue sky, trees, and a distant cityscape or hills.
Reposted by Guy Moss
mackelab.bsky.social
🎓Hiring now! 🧠 Join us at the exciting intersection of ML and Neuroscience! #AI4science
We’re looking for PhDs, Postdocs and Scientific Programmers who want to use deep learning to build, optimize and study mechanistic models of neural computations. Full details: www.mackelab.org/jobs/ 1/5
Jobs - mackelab
The MackeLab is a research group at the Excellence Cluster Machine Learning at Tübingen University!
www.mackelab.org
Reposted by Guy Moss
mackelab.bsky.social
Excited to present our work on compositional SBI for time series at #ICLR2025 tomorrow!

If you're interested in simulation-based inference for time series, come chat with Manuel Gloeckler or Shoji Toyota

at Poster #420, Saturday 10:00–12:00 in Hall 3.

📰: arxiv.org/abs/2411.02728
Compositional simulation-based inference for time series
Amortized simulation-based inference (SBI) methods train neural networks on simulated data to perform Bayesian inference. While this strategy avoids the need for tractable likelihoods, it often requir...
arxiv.org
Reposted by Guy Moss
psteinb.bsky.social
It's been a blast, thanks to @sbi-devs.bsky.social! This week's hackathon was phenomenal! 🙏 😍 The sbi hackathon welcomed about 25 people in Tübingen, with contributions spanning the globe, e.g. 🇺🇸🇯🇵🇧🇪🇩🇪. Wanna see what we did? Check out the PRs👇
github.com/sbi-dev/sbi/...
Pull requests · sbi-dev/sbi
sbi is a Python package for simulation-based inference, designed to meet the needs of both researchers and practitioners. Whether you need fine-grained control or an easy-to-use interface, sbi has ...
github.com
Reposted by Guy Moss
sbi-devs.bsky.social
🙏 Please help us improve the SBI toolbox! 🙏

In preparation for the upcoming SBI Hackathon, we’re running a user study to learn what you like, what we can improve, and how we can grow.

👉 Please share your thoughts here: forms.gle/foHK7myV2oaK...

Your input will make a big difference—thank you! 🙌
Reposted by Guy Moss
sbi-devs.bsky.social
🚀 Join the 4th SBI Hackathon! 🚀
The last SBI hackathon was a fantastic milestone in forming a collaborative open-source community around SBI. Be part of it this year as we build on that momentum!

📅 March 17–21, 2025
📍 Tübingen, Germany or remote
👉 Details: github.com/sbi-dev/sbi/...

More Info:🧵👇
Reposted by Guy Moss
sbi-devs.bsky.social
🎉 Just in time for the end of the year, we’ve released a new version of sbi!

📦 v0.23.3 comes packed with exciting features, bug fixes, and docs updates to make sbi smoother and more robust. Check it out! 👇

🔗 Full changelog: github.com/sbi-dev/sbi/...
Reposted by Guy Moss
arnauddoucet.bsky.social
The slides of my NeurIPS lecture "From Diffusion Models to Schrödinger Bridges - Generative Modeling meets Optimal Transport" can be found here
drive.google.com/file/d/1eLa3...
BreimanLectureNeurIPS2024_Doucet.pdf
drive.google.com
Reposted by Guy Moss
auschulz.bsky.social
1) With our @neuripsconf.bsky.social poster happening tomorrow, it's about time to introduce our Spotlight paper 🔦, co-lead with @jkapoor.bsky.social:

Latent Diffusion for Neural Spiking data (LDNS), a latent variable model (LVM) which addresses 3 goals simultaneously:
Reposted by Guy Moss
matthijspals.bsky.social
How to find all fixed points in piecewise-linear recurrent neural networks (RNNs)?
A short thread 🧵

In RNNs with N units and ReLU(x-b) activations, the phase space is partitioned into 2^N regions by hyperplanes at x=b 1/7
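(The region-enumeration idea in this thread can be sketched directly: within each of the 2^N on/off patterns of the ReLUs the update h ← ReLU(Wh − b) is affine, so each candidate fixed point is the solution of a linear system, and it counts only if it actually lies in its assumed region. A minimal numpy sketch on a made-up 2-unit network, not the authors' code:)

```python
import itertools
import numpy as np

def fixed_points_relu_rnn(W, b, tol=1e-9):
    """Enumerate all fixed points of the map h <- ReLU(W h - b).

    Each of the 2^N on/off patterns of the ReLUs defines a linear
    region where the map is affine: h = D (W h - b) with D a 0/1
    diagonal mask. Solve (I - D W) h = -D b in each region and keep
    the solution only if its pre-activations match the assumed pattern.
    """
    N = len(b)
    points = []
    for pattern in itertools.product([0, 1], repeat=N):
        D = np.diag(pattern).astype(float)
        A = np.eye(N) - D @ W
        if abs(np.linalg.det(A)) < tol:
            continue  # degenerate region: no isolated fixed point here
        h = np.linalg.solve(A, -D @ b)
        pre = W @ h - b  # pre-activations at the candidate point
        active = np.array(pattern, dtype=bool)
        # keep the candidate only if it is consistent with its region
        if np.all(pre[active] >= -tol) and np.all(pre[~active] <= tol):
            points.append(h)
    return points

# Tiny 2-unit example: with W = 0 the unique fixed point is ReLU(-b).
W = np.zeros((2, 2))
b = np.array([1.0, -0.5])
fps = fixed_points_relu_rnn(W, b)  # one fixed point, [0.0, 0.5]
```

Brute-force enumeration is exponential in N, of course; the point of the thread's method is presumably to avoid visiting all 2^N regions.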
gmoss13.bsky.social
Interested to learn more? Come visit our poster at #Neurips2024, or simply get in touch! Huge thanks again to @vetterj.bsky.social , Cornelius Schröder, @rdgao.bsky.social , and @jakhmack.bsky.social
(8/8)
gmoss13.bsky.social
We apply Sourcerer to a real dataset of single-neuron recordings and the Hodgkin-Huxley model. This model is misspecified and highly nonlinear. Still, Sourcerer estimates source distributions that accurately reproduce the dataset, again achieving higher entropy “for free”!
(7/8)
gmoss13.bsky.social
With the likelihood-free loss, Sourcerer can make use of differentiable simulators to quickly estimate source distributions, even when the data is high-dimensional, such as for time series data.
(6/8)