Marta Skreta
martaowesyou.bsky.social
UofT CompSci PhD Student in Alán Aspuru-Guzik's #matterlab and Vector Institute | prev. Apple
Pinned
Super super excited to share our work SuperDiff 🦹‍♀️ for superimposing pretrained diffusion models at inference time 💪

Check out the 🧵 to see how we superimposed proteins as well as images, all thanks to a fast new density estimator. Curious to see what 🍩 & 🗺️ would produce?
🧵(1/7) Have you ever wanted to combine different pre-trained diffusion models but don't have time or data to retrain a new, bigger model?

🚀 Introducing SuperDiff 🦹‍♀️ – a principled method for efficiently combining multiple pre-trained diffusion models solely during inference!
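[Editor's note] The thread carries the actual method; purely for illustration, the gist of inference-time superposition can be sketched as follows. Everything here is a hypothetical toy, not the paper's implementation: two stand-in "models" are mixed at each denoising step, with mixing weights derived from per-model log-density estimates (which SuperDiff maintains with its density estimator; here they are just given numbers).

```python
import numpy as np

# Toy stand-ins for two pretrained diffusion models' predictions.
# In SuperDiff these would be real networks; these closures only
# make the sketch runnable.
def model_a(x, t):
    return -x          # pulls samples toward 0
def model_b(x, t):
    return -(x - 3.0)  # pulls samples toward 3

def superimpose_step(x, t, models, log_densities, temperature=1.0):
    """One reverse step mixing several models' predictions.

    Weights are a softmax over running per-model log-density
    estimates (given as plain numbers in this toy).
    """
    w = np.exp(np.array(log_densities) / temperature)
    w = w / w.sum()                        # convex combination weights
    drift = sum(wi * m(x, t) for wi, m in zip(w, models))
    dt = 0.01
    return x + dt * drift                  # toy Euler update

x = np.array([1.0])
x = superimpose_step(x, t=0.5, models=[model_a, model_b],
                     log_densities=[0.0, 0.0])
print(x)  # equal log-densities -> equal weights -> averaged drift
```

With equal log-density estimates the two models contribute equally; as one model assigns higher density to the current sample, its prediction dominates the combined step.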
Reposted by Marta Skreta
📄📄📄 The AI4Mat-NeurIPS-2025 workshop is now open for submissions until August 22, 2025 (AOE)!

Consider submitting full-length papers or shorter-length findings. We also have a special track for papers on benchmarking AI for materials design.

sites.google.com/view/ai4mat/...
August 5, 2025 at 6:56 AM
Reposted by Marta Skreta
Thanks to all the speakers and participants for the engaging discussions today, and for making the AI4Mat@ICLR 2025 workshop a great success! Thanks also to Santiago, @martaowesyou.bsky.social, and the rest of the co-organizing team for the effort putting this together. Great to be a part of it!
April 28, 2025 at 10:42 AM
Reposted by Marta Skreta
🚀 Looking for reaction conditions that work well for multiple substrates? CurryBO can help🍛

Now out on arXiv: arxiv.org/abs/2502.18966

A short explanation thread 👇
One Set to Rule Them All: How to Obtain General Chemical Conditions via Bayesian Optimization over Curried Functions
General parameters are highly desirable in the natural sciences - e.g., chemical reaction conditions that enable high yields across a range of related transformations. This has a significant practical...
arxiv.org
March 3, 2025 at 9:06 AM
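[Editor's note] The "curried functions" idea in the CurryBO post can be illustrated with a toy, entirely hypothetical yield surface. Currying fixes the reaction conditions to get a function of substrate alone; "general" conditions are the ones scoring best when aggregated across all substrates. Plain grid search stands in below for the actual Bayesian optimization loop:

```python
import numpy as np

# Hypothetical yield surface: each substrate shifts the optimal
# temperature by an offset. CurryBO targets real experimental
# objectives; this closed form only makes the sketch runnable.
def reaction_yield(temp, substrate_offset):
    return np.exp(-0.5 * (temp - 70 - substrate_offset) ** 2 / 100)

def curry(f, **conditions):
    """Fix the reaction conditions, returning a function of substrate."""
    return lambda substrate: f(**conditions, substrate_offset=substrate)

substrates = [-10.0, 0.0, 10.0]           # three related substrates
candidate_temps = np.linspace(40, 100, 61)

# Stand-in for the BO loop: score each condition set by its mean
# yield over all substrates and keep the best ("general") conditions.
def mean_yield(temp):
    g = curry(reaction_yield, temp=temp)
    return np.mean([g(s) for s in substrates])

best_temp = max(candidate_temps, key=mean_yield)
print(best_temp)  # 70.0: the compromise that works across substrates
```

The aggregation (here a mean) is the design choice that turns per-substrate objectives into a single "general conditions" objective.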
Reposted by Marta Skreta
We at @digital-discovery.bsky.social are very happy to announce a new paper type called "Commit". Inspired by version control systems such as git, the idea is that if you have an update on a short and pointed publication, you can send it as a commit. We envision commits to be co-cited with the …
January 24, 2025 at 7:37 PM
Reposted by Marta Skreta
Excited to have #selfdrivinglaboratories listed as one of the seven technologies to watch in 2025 by @nature.com. Thanks to the #matterlab, the @accelerationc.bsky.social, and of course the whole global community working on SDLs! @uoft.bsky.social @vectorinst.bsky.social

www.nature.com/articles/d41...
Self-driving laboratories, advanced immunotherapies and five more technologies to watch in 2025
Sustainability and artificial intelligence dominate our seventh annual round-up of exciting innovations.
www.nature.com
January 21, 2025 at 2:15 AM
Reposted by Marta Skreta
"The Superposition of Diffusion Models using the Îto Density Estimator" (@martaowesyou.bsky.social, @lazaratan.bsky.social et al.)
It's nice to see an easy-to-compute log-likelihood estimator for SDE sampling of diffusion models (not just ODE)

📄 arxiv.org/abs/2412.17762
🐍 github.com/necludov/sup...
January 5, 2025 at 10:22 AM
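[Editor's note] For context on what the post contrasts with: the standard way to get log-likelihoods from a diffusion model is through the probability-flow ODE, whose instantaneous change of variables needs a divergence term (in practice often approximated with a Hutchinson trace estimate). This is the ODE baseline, not the paper's Itô estimator. A toy sketch with a linear drift, where the divergence is exact and cheap:

```python
import numpy as np

# Baseline (NOT the paper's Ito estimator): track log-density along
# the probability-flow ODE via the instantaneous change of variables,
#   d/dt log p(x_t) = -div f(x_t).
# For a toy linear drift f(x) = A @ x, div f = trace(A) exactly;
# for a neural drift one would use a Hutchinson trace estimate.
A = np.array([[-1.0, 0.2],
              [0.0, -0.5]])

def f(x):
    return A @ x

def ode_log_density_change(x0, n_steps=1000, T=1.0):
    """Euler-integrate x and the accumulated log-density change."""
    dt = T / n_steps
    x, delta_logp = x0.copy(), 0.0
    div_f = np.trace(A)             # exact divergence of the linear drift
    for _ in range(n_steps):
        x = x + dt * f(x)
        delta_logp += -dt * div_f   # change-of-variables accumulation
    return x, delta_logp

x_T, dlogp = ode_log_density_change(np.array([1.0, 1.0]))
print(dlogp)  # -trace(A) * T = 1.5
```

The appeal of an SDE-side estimator, per the post, is getting comparable log-likelihoods while sampling with the SDE rather than being restricted to this ODE path.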
Reposted by Marta Skreta
New paper just dropped! How do you combine pre-trained diffusion models without having to train a new one 🤓?

Turns out you can use our all-new Itô density estimator 🔥 to compute densities under a diffusion model efficiently 🚀!
December 28, 2024 at 4:43 PM
exciting new workshop announcement!! join us in Singapore for Frontiers in Probabilistic Inference: Learning Meets Sampling 🌏⚡️😃 details below 👇 #ICLR2025
🔊 Super excited to announce the first ever Frontiers in Probabilistic Inference: Learning meets Sampling workshop at #ICLR2025 @iclr-conf.bsky.social!

🔗 website: sites.google.com/view/fpiwork...

🔥 Call for papers: sites.google.com/view/fpiwork...

more details in thread below👇 🧵
December 18, 2024 at 8:38 PM
1/3 🧵Quantum conundrum: we want expressive circuits, but current hardware only allows short coherence times, and so more parameters = more problems. Check out our #NeurIPS2024 paper “Quantum Deep Equilibrium Models”.
December 9, 2024 at 8:39 AM
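[Editor's note] The post is a teaser, but the classical "deep equilibrium" idea it builds on is easy to sketch: instead of stacking many layers, iterate one parameterized block to a fixed point z* = f(z*, x). A minimal NumPy version of that classical idea is below; the quantum part (doing this with parameterized circuits to keep circuit depth short, as the title suggests) is not shown.

```python
import numpy as np

# One reusable "layer". W is chosen small enough that the map is a
# contraction, so the fixed point exists and iteration converges.
W = np.array([[0.3, 0.1],
              [0.0, 0.2]])

def layer(z, x):
    return np.tanh(W @ z + x)

def deq_forward(x, tol=1e-8, max_iter=500):
    """Solve z = layer(z, x) by fixed-point iteration.

    A deep equilibrium model evaluates one weight-tied block
    repeatedly until convergence, standing in for an effectively
    infinitely deep network with few parameters.
    """
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = layer(z, x)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

x = np.array([0.5, -0.2])
z_star = deq_forward(x)
# At convergence, z_star satisfies z = tanh(W z + x) to tolerance.
print(np.linalg.norm(layer(z_star, x) - z_star) < 1e-6)  # True
```

The "more parameters = more problems" framing in the post maps naturally onto this design: one small block reused to convergence, rather than a deep parameter-heavy stack.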