Luca Scimeca
@lucascimeca.bsky.social
31 followers 6 following 19 posts
AI Research @ Mila | Harvard | Cambridge | Edinburgh
lucascimeca.bsky.social
We explore how to train conditional generative models to sample molecular conformations from their Boltzmann distribution — using only a reward signal.
lucascimeca.bsky.social
📌 GenBio Workshop

Torsional-GFN: A Conditional Conformation Generator for Small Molecules

👥 Authors

Lena Néhale Ezzine*, Alexandra Volokhova*, Piotr Gaiński, Luca Scimeca, Emmanuel Bengio, Prudencio Tossou, Yoshua Bengio, and Alex Hernández-García

(* equal contribution)
lucascimeca.bsky.social
• Works out-of-the-box with large priors like StyleGAN3, NVAE, Stable Diffusion 3, and FoldFlow 2.
• Unifies constrained generation, RL from human feedback, and protein design in a single framework.
• Outperforms both amortized data-space samplers and traditional MCMC across tasks.
lucascimeca.bsky.social
• We show how to turn any pretrained generator (GAN, VAE, flow) into a conditional sampler by training a diffusion model directly in noise space.
• The diffusion sampler is trained with RL.
• Noise-space posteriors are smoother, giving faster, more stable inference.
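To make the noise-space idea concrete, here is a minimal toy sketch (my own illustration, not the paper's code): the generator is frozen, and a sampler over its noise/latent space is trained purely from a reward. The paper trains a diffusion sampler with RL; below, a single Gaussian q(z) updated by REINFORCE stands in for that sampler, and a hand-picked linear map stands in for the pretrained generator — all names and values here are assumptions for illustration only.

```python
# Toy sketch: posterior inference in the noise space of a frozen generator.
# A Gaussian q(z) trained by REINFORCE stands in for the paper's RL-trained
# diffusion sampler; a fixed linear map stands in for the pretrained generator.
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[2.0, 0.0], [0.0, 0.5]])   # frozen "pretrained generator": x = G(z) = A @ z
target = np.array([1.0, -1.0])           # conditioning signal: prefer samples near this x

def log_reward(x):
    # reward peaked at `target`; the only training signal we use
    return -np.sum((x - target) ** 2, axis=-1)

mu = np.zeros(2)   # mean of the noise-space sampler q(z) = N(mu, I)
lr = 0.05

for _ in range(2000):
    z = mu + rng.standard_normal((64, 2))   # sample noise from q(z)
    x = z @ A.T                             # decode through the frozen generator
    r = log_reward(x)
    baseline = r.mean()                     # variance-reduction baseline
    # REINFORCE: grad_mu E_q[r] = E[(r - b) * grad_mu log q(z)], with
    # grad_mu log q(z) = (z - mu) for a unit-variance Gaussian
    grad = ((r - baseline)[:, None] * (z - mu)).mean(axis=0)
    mu += lr * grad

x_mean = mu @ A.T   # decoded mean; should approach `target`
print(x_mean)
```

The generator's weights are never touched: all adaptation happens in noise space, which is the design choice the post highlights — noise-space posteriors tend to be smoother than data-space ones, so a simple sampler suffices here.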
lucascimeca.bsky.social
👥 Where you’ll find our work:

📌 Main Track

Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models

👥 Authors
Siddarth Venkatraman, Mohsin Hasan, Minsu Kim, Luca Scimeca, Marcin Sendera, Yoshua Bengio, Glen Berseth, Nikolay Malkin
lucascimeca.bsky.social
I’m attending ICML in Vancouver this week!

It’s already been great to connect, chat, and hear about the amazing work happening across the community.

If you’re attending and would like to meet up, feel free to reach out!

(More details below)

#ICML2025 #MachineLearning #AI #DiffusionModels #GenAI
lucascimeca.bsky.social
🔹 Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models.
📝 Authors: Siddarth Venkatraman, Mohsin Hasan, Minsu Kim, Luca Scimeca, …, Yoshua Bengio, Nikolay Malkin
paper: arxiv.org/pdf/2502.06999
📍 To be presented at FPI-ICLR2025 & ICLR 2025 DeLTa Workshops
lucascimeca.bsky.social
🔹 Solving Bayesian Inverse Problems with Diffusion Priors and Off-Policy RL.
📝 Authors: Luca Scimeca, Siddarth Venkatraman, Moksh Jain, Minsu Kim, Marcin Sendera, Mohsin Hasan, …, Yoshua Bengio, Glen Berseth, Nikolay Malkin
📍 To be presented at ICLR 2025 DeLTa Workshop
lucascimeca.bsky.social
🔹 Mitigating Shortcut Learning with Diffusion Counterfactuals and Diverse Ensembles.
📝 Authors: Luca Scimeca, Alexander Rubinstein, Damien Teney, Seong Joon Oh, Yoshua Bengio
paper: arxiv.org/pdf/2311.16176
📍 To be presented at SCSL @ ICLR 2025 Workshop
lucascimeca.bsky.social
🔹 Shaping Inductive Bias in Diffusion Models through Frequency-Based Noise Control.
📝 Authors: Thomas Jiralerspong, Berton Earnshaw, Jason Hartford, Yoshua Bengio, Luca Scimeca
paper: arxiv.org/pdf/2502.10236
📍 To be presented at FPI-ICLR2025 & ICLR 2025 DeLTa Workshops
lucascimeca.bsky.social
Thrilled to share that we will be presenting 4 papers across 3 workshops at #ICLR2025 in Singapore this week!

If you're attending, let’s connect! Feel free to DM me for more details about the work or potential collaborations.
See you at the venue! 🇸🇬

(More info to follow)

@mila-quebec.bsky.social
Reposted by Luca Scimeca
coallaoh.bsky.social
Thank Alex for his great efforts and work ethic. Thank @damienteney.bsky.social and @lucascimeca.bsky.social for their continued help with this paper. We’ll humbly address the criticisms to improve it further for future opportunities.
lucascimeca.bsky.social
If you're attending, come check out our posters or feel free to reach out to connect during the conference!

Looking forward to insightful conversations and connecting with everyone; See you all at NeurIPS!

#NeurIPS2024 #NIPS24 #MachineLearning #DiffusionModels #Research #AI
lucascimeca.bsky.social
Amortizing Intractable Inference in Diffusion Models for Bayesian Inverse Problems. Venkatraman, S., Jain, M., Scimeca, L., Kim, M., Sendera, M.,…, Bengio, Y., Malkin, N.
lucascimeca.bsky.social
On Diffusion Models for Amortized Inference: Benchmarking and Improving Stochastic Control and Sampling. Sendera, M., Kim, M., Mittal, S., Lemos, P., Scimeca, L., Rector-Brooks, J., Adam, A., Bengio, Y., and Malkin, N.
arxiv.org/abs/2402.05098
Improved off-policy training of diffusion samplers
We study the problem of training diffusion models to sample from a distribution with a given unnormalized density or energy function. We benchmark several diffusion-structured inference methods, inclu...
lucascimeca.bsky.social
Amortizing Intractable Inference in Diffusion Models for Vision, Language, and Control. Venkatraman, S., Jain, M., Scimeca, L., Kim, M., Sendera, M.,…, Bengio, Y., Malkin, N.
arxiv.org/abs/2405.20971
lucascimeca.bsky.social
Excited to share that we will be presenting three papers at #NeurIPS2024 this week in Vancouver, pushing forward our work on Diffusion Models!
lucascimeca.bsky.social
Hi, can I be added to the pack? :)