Thiparat Chotibut
@thipchotibut.bsky.social
Physicist, Dog-lover, Guitarist / Stat Mech + Machine Learning + Quantum Info = Research Interests / In the land of smiles 🇹🇭🤠😬
[n/n] Check out the paper for the full derivation and discussion! arxiv.org/abs/2512.12767 Happy to chat more about sparse, non-Hermitian random matrices, attractors, or why physics needs to pay more attention to timescale heterogeneity. ⏳⌛️⏲️
Random matrix theory of sparse neuronal networks with heterogeneous timescales
Training recurrent neuronal networks consisting of excitatory (E) and inhibitory (I) units with additive noise for working memory computation slows and diversifies inhibitory timescales, leading to im...
arxiv.org
December 16, 2025 at 9:18 AM
[8/n] Takeaway: By adding realistic complexity (sparse, structured interactions and timescale heterogeneity), new functional mechanisms emerge that were invisible to standard RMT results. It’s a messy math problem, but the SUSY tools make it solvable.
December 16, 2025 at 9:18 AM
[7/n] With this technique, we can also reproduce the classic Rajan-Abbott distribution economically; see Appendix C.
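For anyone who wants to poke at it numerically, here's a minimal sketch of the Rajan-Abbott setup (Rajan & Abbott 2006), not our SUSY derivation: a dense Gaussian matrix whose excitatory and inhibitory columns carry different variances, with row sums constrained to zero. The fraction and variances below are illustrative.

import numpy as np

rng = np.random.default_rng(1)
N, fE = 1000, 0.8                     # network size, excitatory fraction (illustrative)
NE = int(fE * N)
sigma = np.empty(N)
sigma[:NE] = 1.0 / np.sqrt(N)         # std of excitatory columns
sigma[NE:] = 2.0 / np.sqrt(N)         # std of inhibitory columns (broader)
J = rng.normal(0.0, 1.0, (N, N)) * sigma[None, :]
J -= J.mean(axis=1, keepdims=True)    # zero row sums: balanced input to each neuron
eigs = np.linalg.eigvals(J)
print("spectral radius:", np.abs(eigs).max())  # support deviates from the circular law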
December 16, 2025 at 9:18 AM
[6/n] To prove this, we had to solve a sparse, non-Hermitian random-matrix problem, modulated by heterogeneous timescales.
.
Standard RMT tools fail here.
.
We adapted techniques from high-energy physics, using a SUSY-based approach to analytically solve for the spectral edge.
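A minimal numerical sketch of the object in question (not the SUSY calculation itself, and the parameters are illustrative): linearizing tau_i dx_i/dt = -x_i + sum_j J_ij x_j gives the stability matrix diag(1/tau) (J - I), whose rightmost eigenvalue sets the spectral edge.

import numpy as np

rng = np.random.default_rng(0)
N, p, g = 1000, 0.1, 1.0                 # size, connection probability, gain (illustrative)
mask = rng.random((N, N)) < p            # sparse support
J = mask * rng.normal(0.0, g / np.sqrt(p * N), (N, N))
tau = rng.lognormal(0.0, 0.5, N)         # broad, heterogeneous timescales
A = (J - np.eye(N)) / tau[:, None]       # stability matrix diag(1/tau) (J - I)
print("spectral edge:", np.linalg.eigvals(A).real.max())  # edge of chaos when this hits 0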
December 16, 2025 at 9:18 AM
[5/n] We found that the emergent "inhibitory core-excitatory periphery" architecture, coupled with a broad distribution of inhibitory timescales, pushes the spectrum closer to criticality. No weight fine-tuning required!
December 16, 2025 at 9:18 AM
[4/n] For those working on critical phenomena, this seems impossible: achieving criticality usually requires precise fine-tuning. So, how does the cortical network do this?
.
Our proposed solution: Sparsity + Structure + Timescale Heterogeneity 💡
December 16, 2025 at 9:18 AM
[3/n] In this new preprint, we have some answers. Here’s the physics problem:

The working memory computation reported in PNAS relies on a "discrete attractor hopping" mechanism. To encode memory robustly, each discrete attractor needs to operate near the "edge-of-chaos."
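Concretely, "edge of chaos" here is the usual linear-stability statement: linearize the rate dynamics around an attractor and check whether the rightmost Jacobian eigenvalue sits near zero. A generic toy check (standard Gaussian ensemble, not our trained E/I network):

import numpy as np

rng = np.random.default_rng(2)
N, g = 500, 1.0                          # network size, coupling gain (illustrative)
J = rng.normal(0.0, g / np.sqrt(N), (N, N))
x = np.zeros(N)                          # fixed point of dx/dt = -x + J tanh(x)
jac = -np.eye(N) + J / np.cosh(x)[None, :] ** 2   # tanh'(x) = sech^2(x)
print("rightmost eigenvalue:", np.linalg.eigvals(jac).real.max())  # ~0 at g = 1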
December 16, 2025 at 9:18 AM
[2/n] It's a follow-up to our PNAS paper reporting a paradoxical finding: neuronal networks actually perform working memory tasks better when driven by additive noise. Back then, we reported that it worked, but we didn’t really understand why or how.
www.pnas.org/doi/10.1073/...
December 16, 2025 at 9:18 AM
🧵 4/4
To the stat-mech crowd: think of congestion games as out-of-equilibrium many-body active matter.
.
This is an exactly solvable active system (multi-agent RL) where microscopic chaos coexists with macroscopic ergodic convergence - check it out!
www.pnas.org/doi/10.1073/...
July 1, 2025 at 2:51 PM
🧵 3/4
✨ Remarkably, the long-run average number of agents on route 1 settles at the social optimum / Nash equilibrium (bottom right) ⛳️, even though the day-to-day head-count on route 1 is provably chaotic (bottom left)! 🌪️
July 1, 2025 at 2:51 PM
🧵 2/4

Results: When some agents learn (adapt) very fast, their individual strategies turn chaotic 🌪️. Top panel: x-axis, agent types with different learning rates; y-axis, the fraction of times each agent selects route 1.
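A toy version of the setup, in case you want to play (my illustrative parameters, not the paper's): n agents repeatedly choose between two routes with symmetric linear congestion costs, each running deterministic multiplicative-weights updates with its own learning rate. Fast learners' choice probabilities can oscillate chaotically, yet the running time-average of the route-1 load can still settle near the equal-split equilibrium.

import numpy as np

rng = np.random.default_rng(3)
n, T = 50, 20000
eta = rng.uniform(0.5, 8.0, n)           # heterogeneous learning rates (fast ones go chaotic)
p = rng.uniform(0.2, 0.8, n)             # each agent's prob. of picking route 1
avg = 0.0
for t in range(1, T + 1):
    load1 = p.mean()                     # expected fraction of agents on route 1
    c1, c2 = load1, 1.0 - load1          # symmetric linear congestion costs
    w1 = p * np.exp(-eta * c1)           # multiplicative-weights update
    w2 = (1.0 - p) * np.exp(-eta * c2)
    p = w1 / (w1 + w2)
    avg += (load1 - avg) / t             # running time-average of the load
print("time-averaged route-1 load:", avg)  # should hover near the 0.5 equal split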
July 1, 2025 at 2:51 PM
Kudos to the team (especially Tang and Teerachote) for this computational feat, powered by thousands of NVIDIA GPU hours and loads of trial and error! Eventually, they succeeded in rivaling state-of-the-art models!

Feedback is welcome!

Paper --> arxiv.org/abs/2501.08998
arxiv.org
January 16, 2025 at 11:46 AM
We show that CrystalGRW yields stable, unique, and novel structures (S.U.N. materials) close to their DFT ground states. The fun part for me was revisiting the theory of random walks on Riemannian manifolds and making it work for generative modeling.
January 16, 2025 at 11:46 AM
If you’re interested in materials discovery or generative modeling, CrystalGRW might cut down the guesswork, skip expensive ab initio calculations, and let you specify, say, a target crystallographic point group or composition right off the bat.
January 16, 2025 at 11:46 AM
The coolest part (in my humble opinion) is how it balances crystal symmetry requirements, periodicity, compositional constraints, and training stability in a single, unified generative-modeling framework: diffusion models on natural Riemannian manifolds that suitably represent crystal properties.
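As a toy illustration of the geometric idea (mine, not CrystalGRW's actual code): fractional atomic coordinates live on a flat 3-torus, so a forward diffusion step is just a Gaussian step followed by wrapping back into [0, 1) - the wrapped-normal heat kernel that a generative model then learns to reverse.

import numpy as np

rng = np.random.default_rng(4)

def torus_walk(x, n_steps=100, step_std=0.02):
    """Isotropic random walk on the flat torus [0, 1)^d."""
    for _ in range(n_steps):
        x = (x + rng.normal(0.0, step_std, x.shape)) % 1.0  # Gaussian step, then wrap
    return x

frac = rng.random((8, 3))                # 8 atoms, fractional coordinates (toy crystal)
print(torus_walk(frac).min() >= 0.0)     # coordinates stay on the torus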
January 16, 2025 at 11:46 AM