Sarthak Chandra
@sarthakc.bsky.social
120 followers 49 following 25 posts
Interested in neuroscience, development and dynamical systems | Faculty member @ ICTS | Previously: @MIT, @UMD, @IITK
Pinned
sarthakc.bsky.social
1/ Our paper appeared in @Nature today! www.nature.com/articles/s41... w/ Fiete Lab and @khonamikail.bsky.social .
Explains the emergence of multiple grid cell modules, w/ an excellent match to data! A novel mechanism that applies broadly, across systems from development to ecosystems. 🧵👇
Global modules robustly emerge from local interactions and smooth gradients - Nature
The principle of peak selection is described, by which local interactions and smooth gradients drive self-organization of discrete global modules.
www.nature.com
Reposted by Sarthak Chandra
sarthakc.bsky.social
8/8
TL;DR: Peak Selection is a novel mechanism for the emergence of modularity in a variety of systems. Applied to grid cells, it makes testable predictions at molecular, circuit, and functional levels, and matches observed period ratios better than any existing model!
sarthakc.bsky.social
7/
Peak Selection applies broadly for module emergence:
The same mechanism can also explain:
🌱 Emergent ecological niches
🐠 Coral spawning synchrony
🤖 Modularity in optimization & learning
sarthakc.bsky.social
6b/ (cont'd)
Central results and predictions:
•Self-scaling with organism size
•Topologically robust: insensitive to almost all param variations, activity perturbations; also robust to weight heterogeneity! (no need for exactly symmetric interactions in CANs)
sarthakc.bsky.social
6/
Central results and predictions:
•Nearly **any** interaction shape can form grid cell patterning (Mexican-hat kernels not needed!)
•Grid cells involve two scales of interactions, one spatially varying and one fixed.
•Functional modularity can emerge without molecular modularity.
sarthakc.bsky.social
5b/ (cont’d)
Grid modularity from Peak Selection!
•Discrete jumps in grid period without discrete precursors.
•Novel period ratio prediction: adjacent period ratios vary as ratios of successive integers (3/2, 4/3, 5/4, …).
•Excellent agreement with data (R^2 ~0.99)!
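The integer-ratio prediction can be written out explicitly. A minimal sketch (my illustration, not the paper's code): if module periods are proportional to successive integers n = 2, 3, 4, …, then adjacent ratios are (n+1)/n, giving exactly the sequence above.

```python
# Toy check of the integer-ratio prediction (my illustration, not the
# paper's code): if module periods scale as successive integers n,
# adjacent period ratios are (n+1)/n = 3/2, 4/3, 5/4, ...
from fractions import Fraction

periods = [2, 3, 4, 5]  # hypothetical integer period indices
ratios = [Fraction(b, a) for a, b in zip(periods, periods[1:])]
print(ratios)  # [Fraction(3, 2), Fraction(4, 3), Fraction(5, 4)]
```

Note the ratios shrink toward 1 for larger modules, so successive modules become relatively closer in scale.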
sarthakc.bsky.social
5/
Grid modularity from Peak Selection!
•Two forms of local interactions: one whose scale varies smoothly across space, the other held fixed.
•These spontaneously generate local patterning and global modularity!
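A cartoon of how discreteness can fall out of a smooth gradient (purely my sketch of the qualitative idea, not the paper's model): suppose each location carries a smoothly varying preferred scale, but the fixed second interaction only admits periods commensurate with an integer. Snapping the smooth scale to the nearest allowed value then yields discrete plateaus with abrupt jumps, even though the underlying gradient is continuous.

```python
# Cartoon of peak selection (my sketch, not the paper's model): a smooth
# gradient of preferred scale gets snapped to the nearest integer period
# allowed by a second, fixed interaction, producing discrete plateaus.
N = 100
preferred = [2.0 + 3.0 * i / (N - 1) for i in range(N)]  # smooth gradient, 2 -> 5
selected = [round(p) for p in preferred]                  # commensurability snap
modules = sorted(set(selected))
print(modules)  # [2, 3, 4, 5]: a few discrete modules
jumps = sum(1 for a, b in zip(selected, selected[1:]) if a != b)
print(jumps)    # 3 discrete jumps along a smooth gradient
```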
sarthakc.bsky.social
4/
2 classic ideas for structure emergence in biology:
•Positional hypothesis: smooth gradients are read out via discrete thresholds, but this requires discrete gene expression
•Turing hypothesis: local interactions drive pattern formation, but only at a single scale
But grid modules are multiscale, and arise from presumably continuous precursors
sarthakc.bsky.social
3/
Various measured cellular and circuit properties vary smoothly across grid cells. Yet, grid cells are organized into discrete modules with different spatial periods. How does discrete organization arise from smooth gradients?
sarthakc.bsky.social
2/ The work introduces “Peak Selection”: a general mechanism by which local interactions and smooth gradients give rise to global modules. We first focus on a classic example of modularity, grid cells in the brain.
sarthakc.bsky.social
Thanks! Yes, in its current form it doesn't have recency or primacy effects. We have some thoughts on including recency with some weight decay to reduce the importance of older memories. But how to build in primacy and other forms of memory salience in this model is something to think more about!
sarthakc.bsky.social
9b/ (cont’d)
Hippocampal cells remap by direction/context 📍➡️⬅️
Memory consolidation of multiple memory traces 📚
Model thus bridges experiments and theory!
sarthakc.bsky.social
9/ Experimental alignment: 🧠🔬
VectorHaSH mirrors entorhinal-hippocampal phenomena:
Grid cells demonstrate stable periodicity, rapid phase resets, robust velocity integration 🌐
Recreates correlation statistics of grid cells and place cells 📊
sarthakc.bsky.social
8/ 🏰 Memory palaces explained!
Why does imagining a spatial walk supercharge memory?
VectorHaSH shows how recall of familiar locations acts as a secondary scaffold.
Result: Even approximate recall of locations reliably supports one-shot arbitrary, high-fidelity memories. 💡
sarthakc.bsky.social
7/ How does VectorHaSH implement efficient episodic/sequence memory? Conventional models recall entire high-dim states ➡️ fail quickly. VectorHaSH reduces the problem to recalling low-dim velocity vectors on a scaffold. Result: Long sequences stored & recalled with precision! 🔥
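The velocity-recall idea can be made concrete with a toy (my sketch, not VectorHaSH itself): rather than memorizing each high-dimensional state, store only the small velocity at every step, then re-integrate those velocities on a modular grid scaffold to replay the sequence.

```python
# Toy of sequence recall via low-dim velocities (my sketch, not VectorHaSH
# itself): store only the 1D velocity at each step; replay by integrating
# from the start state on a modular grid scaffold.
periods = (3, 4, 5)                # toy grid-module periods
start = (0, 0, 0)
velocities = [1, 1, -1, 2, 1, 1]   # low-dimensional velocity sequence

def step(state, v):
    """Advance each module's phase by v, wrapping at its period."""
    return tuple((s + v) % p for s, p in zip(state, periods))

states = [start]
for v in velocities:
    states.append(step(states[-1], v))
print(states[-1])  # (2, 1, 0): final scaffold state after replay
```

The point of the toy: the stored objects (scalars here) are far smaller than the states they regenerate, which is why sequence capacity scales so well.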
sarthakc.bsky.social
6/ Spatial memory at scale? VectorHaSH links scaffold states to sensory cues via the hippocampus. This leads to independent non-interfering learned maps (landmark-location associations) in multiple rooms. Metric grid structure supports zero-shot inference along novel paths 🚶‍♀️
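The zero-shot claim rests on the metric (vector-additive) structure of grid phases. A minimal illustration (mine, not the paper's): the displacement for a never-traversed path A→C is just the sum of the learned legs A→B and B→C.

```python
# Toy of zero-shot shortcuts via metric structure (my illustration): grid
# phases update by vector addition, so the displacement for a novel path
# A->C equals the sum of the learned legs A->B and B->C.
AB = (2, 1)   # learned displacement A -> B
BC = (1, 3)   # learned displacement B -> C
AC = tuple(a + b for a, b in zip(AB, BC))
print(AC)     # (3, 4): inferred without ever traversing A -> C
```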
sarthakc.bsky.social
5a/ (cont’d)
VectorHaSH then stores memories by heteroassociating inputs with these scaffold states, enabling graceful degradation of memory detail as the number of stored memories grows, across a vast number of inputs
sarthakc.bsky.social
5/ Memory without cliffs? Hopfield and other models crash 📉 after reaching capacity, completely losing all previous memories. VectorHaSH avoids this by first using grid cells to create a scaffold of exponentially many large-basin fixed points
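A back-of-envelope for "exponentially many" (my illustration, using toy 1D periods): with M grid modules of coprime periods, the number of distinct joint phase states is the product of the periods, i.e., exponential in M.

```python
# Back-of-envelope (my illustration): with M grid modules of coprime
# periods, the number of distinct joint phase states is the product of
# the periods -- exponential in the number of modules M.
import math

periods = [3, 4, 5]              # toy coprime 1D module periods
n_states = math.prod(periods)
print(n_states)  # 60 distinct joint phases from just three small modules
```

Adding one more module multiplies the count by that module's period, which is the sense in which scaffold capacity grows exponentially with M.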
sarthakc.bsky.social
4a/ (cont’d)

(3) Episodic memory, using low-dimensional transitions in the grid space to support massive sequence capacity 🎞️
(4) Method of Loci, explaining the paradox of why adding to the memory task (associating items with spatial locations) boosts performance 🏰
sarthakc.bsky.social
4/
VectorHaSH supports:
(1) Item memory, avoiding the memory cliffs of Hopfield nets
(2) Spatial memory, learning landmark-location associations over many maps 🌍 & minimizing catastrophic forgetting
sarthakc.bsky.social
3/ Key ideas 🔑Hippocampal and grid cells create a fixed "scaffold" that serves as a robust, error-correcting memory foundation. External inputs are "hooked" onto the scaffold through heteroassociation. Low-dimensional transitions in grid space enable large sequence memory.
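The "hooking" step can be sketched as a Hebbian outer-product heteroassociation (my cartoon, not the paper's circuit). With orthogonal scaffold states (one-hot here for simplicity), each input pattern gets filed against its own scaffold state and recall is exact.

```python
# Toy heteroassociation (my cartoon, not the paper's circuit): hook sensory
# inputs onto orthogonal scaffold states via a Hebbian outer-product rule.
# One-hot scaffold states make recall exact in this simplified setting.
n_scaffold, n_input = 4, 5
inputs = [
    [1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 1, 1],
]

# W[j][k] += input[j] * scaffold[k]; with one-hot scaffolds this files
# each input pattern into its own column of W.
W = [[0] * n_scaffold for _ in range(n_input)]
for k, pattern in enumerate(inputs):          # scaffold state k is one-hot at k
    for j, x in enumerate(pattern):
        W[j][k] += x

def recall(k):
    """Recall the input hooked to scaffold state k (one-hot at index k)."""
    return [W[j][k] for j in range(n_input)]

print(recall(2) == inputs[2])  # True: exact recall in this orthogonal cartoon
```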
sarthakc.bsky.social
2/ Why are spatial & episodic memory co-localized in the hippocampus? How do memory palaces allow memorization of decks of cards?
Our model, VectorHaSH, shows how the hippocampus, along with grid cells, integrates these roles for memory storage, sequence recall, and memory palaces 🏰