Dmitry Krotov
@krotov.bsky.social
210 followers 130 following 23 posts
I am a physicist working on neural networks (both artificial and biological). Find me on https://research.ibm.com/people/dmitry-krotov
krotov.bsky.social
The overlaid network illustrates a mathematical model, which represents the astrocyte as a network of processes that enables multi-synaptic communication and supports associative memory function.

Data credit: Leyka Nagendren, Cagla Eroglu.
Artist Credit: Annette Hui, IBM Research
krotov.bsky.social
The raw image was captured in October 2023 using an Olympus FV 3000 confocal microscope with a 60x objective lens and a z-step size of 0.5 μm across a 50–60 μm z-stack for high magnification and resolution.
krotov.bsky.social
A custom Imaris extension, 3D Sholl Analysis, was used to quantify the number of process intersections at 5 μm intervals from the cell soma, marked by the sphere, thereby visualizing the highly complex network of processes of the astrocyte.
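For readers curious what a 3D Sholl analysis actually computes, here is a minimal sketch in plain Python (this is not the custom Imaris extension; the segment arrays and soma coordinates are hypothetical inputs): it counts how many skeletonized process segments cross concentric spheres spaced 5 μm apart around the soma.

```python
# Minimal sketch of a 3D Sholl analysis (NOT the Imaris extension used in the paper).
# Assumes the astrocyte processes have already been skeletonized into short 3D line
# segments (coordinates in micrometers) and that the soma center is known.
import numpy as np

def sholl_counts(segments, soma_center, step_um=5.0, max_radius_um=60.0):
    """Count process intersections with concentric spheres every `step_um` microns.

    segments: array of shape (n_segments, 2, 3) -- endpoints of each skeleton segment.
    soma_center: array of shape (3,) -- center of the soma (the sphere in the figure).
    """
    radii = np.arange(step_um, max_radius_um + step_um, step_um)
    # Distance of every segment endpoint from the soma center.
    d = np.linalg.norm(segments - soma_center, axis=2)        # (n_segments, 2)
    d_near, d_far = d.min(axis=1), d.max(axis=1)
    counts = []
    for r in radii:
        # A segment counts as one intersection if the sphere of radius r separates
        # its two endpoints. (Short segments make this a reasonable proxy; a segment
        # that bows across the sphere and back would be missed.)
        counts.append(int(np.sum((d_near <= r) & (d_far >= r))))
    return radii, np.array(counts)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    soma = np.zeros(3)
    # Toy "processes": a random-walk branch radiating away from the soma.
    pts = np.cumsum(rng.normal(scale=2.0, size=(500, 3)), axis=0)
    segs = np.stack([pts[:-1], pts[1:]], axis=1)
    for r, c in zip(*sholl_counts(segs, soma)):
        print(f"r = {r:4.1f} um: {c} intersections")
```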
krotov.bsky.social
The image summarizes the main message of our paper: astrocyte = network of processes that engage in information exchange and computation. The image shows a 3D reconstruction of an astrocyte from the visual cortex using Imaris Bitplane 9.9.
krotov.bsky.social
Key takeaways:
🔷 Astrocytes compute.
🔷 Dense Associative Memories can be built using astrocytes (a standard DenseAM energy is sketched below for reference).
🔷 From a computational perspective, an astrocyte is a network of processes.
🔷 Memories can be stored, at least partially, in the biochemical machinery inside the astrocyte, as opposed to just synapses.
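For context, here is the standard Dense Associative Memory energy (Krotov & Hopfield, 2016). The astrocyte paper proposes a biophysical, neuron-astrocyte realization of this class of models, so the exact functional form used there may differ; this is only a reference point.

```latex
% Standard Dense Associative Memory energy (Krotov & Hopfield, 2016),
% quoted for context only -- the neuron-astrocyte model realizes this
% class of energies biophysically and may differ in detail.
% \xi^\mu: the K stored patterns; \sigma: the state of the N neurons;
% F: a rapidly growing separation function, e.g. F(x) = x^n or F(x) = e^x.
E(\sigma) = -\sum_{\mu=1}^{K} F\!\left( \sum_{i=1}^{N} \xi^{\mu}_{i}\, \sigma_{i} \right)
```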
krotov.bsky.social
I am very excited about our new paper just published in PNAS.

www.pnas.org/doi/abs/10.1...

It was a pleasure to work on this idea together with @leokoz8.bsky.social and Jean-Jacques Slotine.
Reposted by Dmitry Krotov
leokoz8.bsky.social
Big week for astrocyte research: 3 new Science papers link astrocytes to behavior. We're excited to add to the momentum with our new PNAS paper: a theory, grounded in biology, proposing astrocytes as key players in memory storage and recall. w/ JJ Slotine and @krotov.bsky.social
(1/6)
krotov.bsky.social
John Hopfield on pursuing interesting things: “Doesn’t matter what they are - interesting is the important one”. www.youtube.com/watch?v=iYgs...
John J. Hopfield, Nobel Prize in Physics 2024: Official interview
krotov.bsky.social
I am heading to #ICLR2025 in Singapore. If you want to chat about associative memories, energy-based models, and related topics, let’s connect! The highlight for me this year is the New Frontiers in Associative Memories workshop on Sunday, April 27. Here is the schedule ⬇️
Reposted by Dmitry Krotov
krotov.bsky.social
Associative memory is a prominent emerging paradigm in the landscape of mainstream AI and I expect exciting developments in this area in the next few years. Please take a look at the call for papers and submit by the deadline: February 14. It will be interesting! 😀
krotov.bsky.social
Now that ICML papers are submitted and we are in the midst of discussions on whether scaling is enough or new architectural/algorithmic ideas are needed, what better time to submit your best work to our workshop on New Frontiers in Associative Memory @iclr-conf.bsky.social?
Reposted by Dmitry Krotov
kylecranmer.bsky.social
I am very honored to return to my alma mater to give a distinguished lecture on AI for Science at Rice University's Ken Kennedy Institute.
events.rice.edu/event/spring...
krotov.bsky.social
I am excited to announce the call for papers for the New Frontiers in Associative Memories workshop at ICLR 2025. New architectures and algorithms, memory-augmented LLMs, energy-based models, Hopfield nets, AM and diffusion, and many other topics.

Website: nfam.vizhub.ai

@iclr-conf.bsky.social
krotov.bsky.social
Wow, what a night! A heartfelt congratulations to my dear friend and collaborator John Hopfield and all the 2024 Laureates. #NobelPrize
krotov.bsky.social
What a wonderful day of Nobel lectures and inspiring conversations at the beautiful Aula Magna, Stockholm University! #NobelPrize
krotov.bsky.social
Physics is a point of view. Wise words from John Hopfield’s Nobel lecture. #NobelPrize
krotov.bsky.social
Please check out the post in the first quote and join us at the Scientific Methods for Understanding Deep Learning workshop at #NeurIPS2024 on December 15 to learn more.
krotov.bsky.social
We demonstrate empirically that spurious states exist in conventional DMs. Their existence is a distinct prediction of DenseAM theory, yet these states have been overlooked in the DM literature. Spurious states mark the onset of the memorization-generalization transition in DMs.
krotov.bsky.social
In our latest work, we leverage the correspondence between DenseAMs and Diffusion Models (DMs) to make a theoretical prediction: a similar phenomenon must occur during the memorization-to-generalization transition in DMs.
krotov.bsky.social
DenseAM theory predicts a distinct phenomenon when the amount of data surpasses the critical memory capacity: spurious states emerge. Spurious states are local minima of the energy function that do not correspond to any single stored pattern; they arise when the network fails to give each pattern its own basin of attraction, so two or more patterns share the same basin.
krotov.bsky.social
Most of the work on Dense Associative Memory (DenseAM) thus far has focused on the regime when the amount of data (number of memories) is below the critical memory storage capacity. We are beginning to explore the opposite limit, when the data is large.
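For scale, here are the known capacity results from the DenseAM literature, quoted only as background for this thread (they are not results of the new work):

```latex
% Known storage-capacity scalings for Dense Associative Memory with N neurons,
% quoted as background (Krotov & Hopfield 2016; Demircigil et al. 2017) --
% not results of the new work discussed in this thread.
F(x) = x^{n}: \quad K_{\max} \sim \alpha_n \, N^{\,n-1}
\qquad\qquad
F(x) = e^{x}: \quad K_{\max} \sim 2^{N/2}
```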
baopham.bsky.social
Diffusion models create beautiful novel images, but they can also memorize samples from the training set. How does this blending of features allow creating novel patterns? Our new work in Sci4DL workshop #neurips2024 shows that diffusion models behave like Dense Associative Memory networks.
The left figure showcases the behavior of Hopfield models: given a query (the initial point of energy descent), a Hopfield model retrieves the memory (local minimum) closest to that query by minimizing the energy function. A perfect Hopfield model stores each pattern in a distinct minimum (basin). In contrast, the right figure illustrates a poor associative memory system, where stored patterns share a single basin. This enables the creation of spurious patterns, which appear as mixtures of stored patterns. Because of this overlap, spurious patterns have lower energy than the memories themselves.
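As a concrete, runnable illustration of the energy-landscape picture described above, here is a minimal classical-Hopfield sketch (this is not the diffusion-model code from the paper): a query near a stored pattern is retrieved cleanly, while a query that mixes several patterns typically settles into a spurious state with partial overlap with each of them.

```python
# Minimal classical-Hopfield sketch of retrieval vs. spurious (mixture) states.
# This illustrates the energy-landscape picture in the figure, not the
# diffusion-model experiments from the paper.
import numpy as np

rng = np.random.default_rng(1)
N, K = 400, 3                                   # neurons, stored patterns
xi = rng.choice([-1, 1], size=(K, N))           # stored +/-1 patterns
W = (xi.T @ xi) / N                             # Hebbian weights
np.fill_diagonal(W, 0.0)

def descend(s, sweeps=50):
    """Asynchronous sign updates; each accepted flip lowers the Hopfield energy."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

def overlaps(s):
    return xi @ s / N                           # overlap with each stored pattern

# Query close to pattern 0: flip 10% of its bits -> clean retrieval of pattern 0.
q = xi[0].copy()
q[rng.choice(N, size=N // 10, replace=False)] *= -1
print("near-pattern query ->", np.round(overlaps(descend(q)), 2))

# Query that mixes all three patterns -> typically a spurious mixture state,
# with comparable partial overlap with each pattern instead of one clean retrieval.
mix = np.sign(xi.sum(axis=0)).astype(int)
print("mixture query      ->", np.round(overlaps(descend(mix)), 2))
```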
krotov.bsky.social
Check out our new #NeurIPS2024 paper, where we develop a distributed representation for a broad class of Dense Associative Memories. As in the traditional Hopfield Network, the number of weights can be kept fixed, independent of the number of memories, as new patterns are introduced.
bhoov.bsky.social
Excited to share "Dense Associative Memory through the Lens of Random Features" accepted to #neurips2024🎉

DenseAMs need new weights for each stored pattern–hurting scalability. Kernel methods let us add memories without adding weights!

Distributed memory for DenseAMs, unlocked🔓
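A minimal sketch of the "memories without new weights" idea, using generic random Fourier features for a Gaussian similarity kernel (the paper's construction for the DenseAM energy differs in detail, so the feature map, dimensions, and kernel below are illustrative assumptions): all stored patterns are folded into a single fixed-size feature vector, and a query's total similarity to the memory set is read out from that one vector.

```python
# Illustrative sketch of a distributed (fixed-size) associative memory via
# random features. Uses standard random Fourier features for a Gaussian kernel
# (Rahimi & Recht); the NeurIPS paper's construction for the DenseAM energy
# differs in detail, so this only conveys the idea of
# "adding memories without adding weights".
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 64, 4096, 4.0       # pattern dim, feature dim (fixed), kernel width

W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def phi(x):
    """Random Fourier features: phi(x) @ phi(y) ~ exp(-||x - y||^2 / (2 sigma^2))."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

# "Store" memories by summing their feature vectors into one fixed-size vector T.
memories = rng.normal(size=(10, d))
T = sum(phi(m) for m in memories)             # shape (D,), independent of #memories

# Adding another memory does not grow T -- it stays D-dimensional. This is the
# sense in which the memory is distributed across a fixed set of weights.
new_memory = rng.normal(size=d)
T = T + phi(new_memory)

# Score a noisy query: phi(q) @ T approximates the kernel sum over all memories,
# i.e. the total attraction of the query toward the stored patterns.
query = memories[3] + 0.1 * rng.normal(size=d)
approx = phi(query) @ T
exact = sum(np.exp(-np.sum((query - m) ** 2) / (2 * sigma ** 2))
            for m in np.vstack([memories, new_memory[None]]))
print(f"kernel sum: exact = {exact:.3f}, random-feature approx = {approx:.3f}")
```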