John Ryan
@johnpryan.bsky.social
ML @ Isomorphic Labs
Reposted by John Ryan
New paper just dropped! How do you combine pre-trained diffusion models without having to train a new one 🤓?

Turns out you can use our all new Ito density estimator 🔥 to compute densities under a diffusion model efficiently 🚀!
December 28, 2024 at 4:43 PM
Reposted by John Ryan
DeMo was created in March 2024 by Bowen Peng and Jeffrey Quesnelle and was published on arXiv in collaboration with Diederik P. Kingma, co-founder of OpenAI and inventor of the Adam optimizer and the VAE.

The paper is available here: arxiv.org/abs/2411.19870

And code: github.com/bloc97/DeMo
DeMo: Decoupled Momentum Optimization
Training large neural networks typically requires sharing gradients between accelerators through specialized high-speed interconnects. Drawing from the signal processing principles of frequency decomp...
arxiv.org
December 2, 2024 at 4:46 PM
Reposted by John Ryan
Some recent discussions made me write up a short read on how I think about doing computer vision research when there's clear potential for abuse.

Alternative title: why I decided to stop working on tracking.

Curious about others' thoughts on this.

lb.eyer.be/s/cv-ethics....
November 29, 2024 at 2:51 PM
If you're interested in working as a research engineer in ML / drug discovery, come find me or the team from Isomorphic Labs at NeurIPS! We're hiring!
November 25, 2024 at 3:24 PM
Reposted by John Ryan
Anybody have a bioml starter pack?
November 11, 2024 at 7:49 PM
Nice theoretically motivated Adam variant, "ADOPT" (remains to be seen whether it stands out from the rest of the Adam zoo).
arxiv.org/abs/2411.02853
ADOPT: Modified Adam Can Converge with Any $β_2$ with the Optimal Rate
Adam is one of the most popular optimization algorithms in deep learning. However, it is known that Adam does not converge in theory unless choosing a hyperparameter, i.e., $β_2$, in a problem-depende...
arxiv.org
November 7, 2024 at 11:11 PM
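To make the ADOPT post concrete: my reading of arXiv:2411.02853 is that ADOPT fixes Adam's convergence issue with two changes to the update order — it normalizes each gradient by the *previous* second-moment estimate (so the current gradient never scales itself), and it applies that normalization *before* the momentum average rather than after. The sketch below reflects that understanding; the function name, defaults, and exact initialization step are my assumptions, not code from the paper.

```python
import numpy as np

def adopt_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.9999, eps=1e-6):
    """One ADOPT update (sketch; verify details against arXiv:2411.02853).

    Differences from Adam as I understand them:
      1. g_t is scaled by sqrt(v_{t-1}), the second moment *excluding* g_t.
      2. The momentum average is taken over already-normalized gradients,
         so the parameter step is just -lr * m with no further division.
    """
    if t == 0:
        # First step: initialize the second moment only, no parameter update.
        return theta, m, g * g
    denom = np.maximum(np.sqrt(v), eps)
    m = b1 * m + (1 - b1) * (g / denom)   # momentum of normalized gradients
    theta = theta - lr * m                # plain step, no per-step rescaling
    v = b2 * v + (1 - b2) * (g * g)       # second moment updated afterwards
    return theta, m, v
```

As a sanity check, running it on f(θ) = θ² (gradient 2θ) walks θ toward zero regardless of the β₂ chosen, which is the property the paper's title advertises.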