Memming Park
@memming.bsky.social
Computational Neuroscientist & Neurotechnologist
Reposted by Memming Park
📢 Applications open on 19 Jan for the 7-week #Mathematics #SummerSchool in London. You will develop the maths skills and intuition necessary to enter the #TheoreticalNeuroscience / #MachineLearning field.

Find out more & register for the information webinar 👉 www.ucl.ac.uk/life-science...
January 15, 2026 at 2:37 PM
Back in 2014, I co-organized a #COSYNE workshop on scalable modeling. scalablemodels.wordpress.com #timeflies
Scalable models for high-dimensional neural data
A COSYNE 2014 Workshop
scalablemodels.wordpress.com
January 13, 2026 at 1:51 PM
Reposted by Memming Park
🚨📜+🧵🚨 Very excited about this work showing that people with no hand function following a spinal cord injury can control the activity of motor units from those muscles to perform 1D, 2D and 3D tasks, play video games, or navigate a virtual wheelchair

By a wonderful team co-mentored with Dario Farina
January 7, 2026 at 10:36 PM
Reposted by Memming Park
We have reached a situation where (1) the time/resources spent by people applying for grant X often outweighs (2) the time/resources awarded.

For these grants, society loses net time/resources.

www.nature.com/articles/d41...
January 13, 2026 at 9:44 AM
How can I accelerate the breakdown of caffeine in my body? I will need to increase CYP1A2 (P450) activity (without smoking). Vigorous exercise over 30 days was shown to increase it by up to 70%? pubmed.ncbi.nlm.nih....
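For a rough sense of what a 70% boost would buy: caffeine is cleared mostly by CYP1A2 and follows roughly first-order elimination, so half-life scales inversely with enzyme activity. A back-of-the-envelope Python sketch (the ~5 h baseline half-life and the direct clearance–activity scaling are illustrative assumptions, not numbers from the linked study):

```python
import math

# First-order (exponential) elimination model of plasma caffeine.
# Assumptions: ~5 h baseline half-life (a typical adult value; it varies a lot
# between people) and clearance scaling directly with CYP1A2 activity.
baseline_half_life_h = 5.0
cyp1a2_boost = 1.7  # hypothetical 70% increase in enzyme activity

k_base = math.log(2) / baseline_half_life_h   # elimination rate constant (1/h)
k_boost = k_base * cyp1a2_boost               # faster clearance
boosted_half_life_h = math.log(2) / k_boost   # = 5 / 1.7 ≈ 2.9 h

dose_mg = 100.0  # roughly one small cup of coffee
for hours in (3, 6, 9):
    base = dose_mg * math.exp(-k_base * hours)
    boosted = dose_mg * math.exp(-k_boost * hours)
    print(f"after {hours} h: {base:.0f} mg remaining vs {boosted:.0f} mg with boosted CYP1A2")

print(f"half-life: {baseline_half_life_h:.1f} h -> {boosted_half_life_h:.1f} h")
```

So even the optimistic 70% figure only takes a ~5 h half-life down to roughly 3 h.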
January 5, 2026 at 3:08 PM
Learning a lot while preparing for a lecture on RNNs for neuroscience.
January 2, 2026 at 5:34 PM
according to last.fm, my favourite artist of 2025 was #LauraThorn, scrobbled 493 of my 5728 scrobbles (just one song: La poupée monte le son) out of the 2672 unique tracks I listened to, which puts me in the top 0.01% of listeners worldwide for that song. Also #1: Beatrice Rana's Goldberg Variations album.
January 2, 2026 at 3:16 PM
Highlights of 2025

Ayesha Vermani defended her PhD thesis this year. She helped jump-start a new direction in integrative neuroscience:

Vermani et al. (2025), Meta-dynamical state space models for integrative neural data analysis. ICLR
👉 openreview.net/forum?id=SRp...
👉 youtu.be/SiXxPmkpYF8
December 30, 2025 at 5:12 PM
I admire all who have donated and will donate to OpenReview. Thank you.
December 23, 2025 at 11:15 AM
Reposted by Memming Park
Today, the NeurIPS Foundation is proud to announce a $500,000 donation to OpenReview, supporting the infrastructure that makes modern ML research possible.

blog.neurips.cc/2025/12/15/s...
blog.neurips.cc
December 15, 2025 at 1:00 PM
Fully-funded International Neuroscience Doctoral Programme🧠 Champalimaud Foundation, Lisbon, Portugal 🇵🇹

Deadline: Jan 31, 2026
fchampalimaud.org/champalimaud...

Research program spans systems/computational/theoretical/clinical/sensory/motor neuroscience, neuroethology, intelligence, and more!!
December 16, 2025 at 7:20 PM
One advantage of a monosemantic, sharply tuned, grandmother-cell, axis-aligned, neuron-centric representation, as opposed to a polysemantic, mixed-selective, oblique population code, is that it can benefit from evolution. Genes are good at operating at the level of individual cells. #neuroscience
December 12, 2025 at 1:56 PM
Some of my favorites from #NeurIPS2025

A more negative maximal Lyapunov exponent => faster convergence of parallelized RNN evaluation
Gonzalez, X., Kozachkov, L., Zoltowski, D. M., Clarkson, K. L., & Linderman, S. Predictability Enables Parallelization of Nonlinear State Space Models.
Predictability Enables Parallelization of Nonlinear State Space Models | OpenReview
The rise of parallel computing hardware has made it increasingly important to understand which nonlinear state space models can be efficiently parallelized. Recent advances have shown that evaluating a state space model can be recast as solving a parallelizable optimization problem, and sometimes this approach yields dramatic speed-ups in evaluation time. However, the factors that govern the difficulty of these optimization problems remain unclear, limiting the larger adoption of the technique. In this work, we establish a precise relationship between the dynamics of a nonlinear system and the conditioning of its corresponding optimization formulation. We show that the predictability of a system, defined as the degree to which small perturbations in state influence future behavior, directly governs the number of optimization steps required for evaluation. In predictable systems, the state trajectory can be computed in $\mathcal{O}((\log T)^2)$ time, where $T$ is the sequence length, a major improvement over the conventional sequential approach. In contrast, chaotic or unpredictable systems exhibit poor conditioning, with the consequence that parallel evaluation converges too slowly to be useful. Importantly, our theoretical analysis demonstrates that for predictable systems, the optimization problem is always well-conditioned, whereas for unpredictable systems, the conditioning degrades exponentially as a function of the sequence length. We validate our claims through extensive experiments, providing practical guidance on when nonlinear dynamical systems can be efficiently parallelized, and highlighting predictability as a key design principle for parallelizable models.
openreview.net
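To get a feel for why predictability matters, here is a minimal numpy sketch of the simplest parallel-in-time scheme: a Jacobi-style fixed-point sweep that updates every time step of a guessed trajectory simultaneously. This is not the paper's quasi-Newton formulation; the tanh update, the contraction parameter a, and the tolerance are all illustrative assumptions. The more contractive (more predictable) the dynamics, the fewer sweeps are needed, roughly independent of the sequence length T.

```python
import numpy as np

def f(x, u, a):
    # Toy nonlinear state update; |a| < 1 keeps the dynamics contractive (predictable).
    return np.tanh(a * x + u)

def parallel_eval(us, a, max_sweeps=500, x0=0.0, tol=1e-8):
    """Evaluate x_{t+1} = f(x_t, u_t) by sweeping over a guessed trajectory."""
    T = len(us)
    xs = np.zeros(T)
    for k in range(max_sweeps):
        prev = np.concatenate(([x0], xs[:-1]))  # guessed predecessor state for every t
        new = f(prev, us, a)                    # all T updates applied at once ("in parallel")
        if np.max(np.abs(new - xs)) < tol:
            return new, k + 1
        xs = new
    return xs, max_sweeps

rng = np.random.default_rng(0)
us = 0.5 * rng.standard_normal(2000)
for a in (0.3, 0.9):  # strongly vs. weakly contractive dynamics
    _, sweeps = parallel_eval(us, a)
    print(f"a={a}: converged in {sweeps} sweeps (a sequential scan would take {len(us)} steps)")
```

With a = 0.3 the sweep count stays small even for T = 2000, whereas a = 0.9 needs many more; for chaotic dynamics the number of sweeps would approach T, which is the conditioning argument the paper makes precise.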
December 7, 2025 at 12:02 AM
This was a fantastic poster presentation!
At #NeurIPS? Curious about how RNNs learn differently in closed-loop (RL) vs. open-loop (supervised) settings?
Come by Poster #2107 on Thursday at 4:30 PM!

neurips.cc/virtual/2025...
December 5, 2025 at 6:56 PM
Melanie Mitchell's keynote reminds us that it is not easy to evaluate intelligence (AI, babies, animals, etc) and benchmarks can be VERY misleading. #NeurIPS2025
December 4, 2025 at 11:19 PM
#NeurIPS2025 For diffusion models, full optimization leads to memorization, but an intermediate amount of optimization (easier to do when you have more data) leads to generalization!
This is a clear example where more optimization on the training objective is demonstrably worse.
December 3, 2025 at 7:32 PM
Reposted by Memming Park
And that’s a wrap for the ⚪#CRSy25!
We welcomed nearly 3️⃣0️⃣ presenters, including 4️⃣ keynote speakers, and over 3️⃣0️⃣0️⃣ participants from across the 🌍 to explore the theme “Neuro-Cybernetics at Scale”.
📹 Get a glimpse of this year’s edition: tinyurl.com/2y9yprjn
👉 Read on:
tinyurl.com/2vttzwez
Champalimaud Research Symposium 2025 - Wrap up
Over three days from 15-17 October, the Champalimaud Foundation hosted the 2025 edition of the international Champalimaud Research Symposium (CRSy25), under the theme “Neuro-Cybernetics at Scale”. This symposium brought together leading researchers in neuroscience, robotics, machine learning, control theory, and theoretical neuroscience to explore how behaviour and intelligence emerge from complex feedback loops linking brains, bodies, and environments. Watch the wrap up video to get a glimpse of this year’s edition!
tinyurl.com
October 24, 2025 at 4:19 PM
Reposted by Memming Park
Kabir Dabholkar @ Technion - Israel Institute of Technology introduced a 🤖 deep-learning framework using Koopman eigenfunctions to map decision boundaries in complex dynamical systems - a tool to trace how brains and models transition between states.
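Not the deep-learning framework from the talk, but a classical baseline for the same object: the infinite-time average of an observable along trajectories is a Koopman eigenfunction with eigenvalue 1, it is constant on each basin of attraction, and its jumps mark the decision boundary between basins. A small numpy sketch on a toy bistable system (the system, observable, and horizon are all assumptions):

```python
import numpy as np

# Toy bistable map x_{t+1} = x_t + dt*(x_t - x_t**3): attractors at ±1, repeller at 0.
dt = 0.1
F = lambda x: x + dt * (x - x**3)

def koopman_average(x0, g=lambda x: x, n_steps=500):
    """Finite-horizon approximation of the infinite-time average of g along trajectories.
    In the limit this average is invariant under the dynamics (a Koopman eigenfunction
    with eigenvalue 1) and takes a constant value on each basin of attraction."""
    x = np.array(x0, dtype=float)
    total = np.zeros_like(x)
    for _ in range(n_steps):
        total += g(x)
        x = F(x)
    return total / n_steps

grid = np.linspace(-1.5, 1.5, 601)
phi = koopman_average(grid)

# The decision boundary shows up as a jump between the two plateaus (one per attractor).
jumps = grid[:-1][np.abs(np.diff(phi)) > 0.5]
print("estimated basin boundary near:", jumps)  # expect values right around x = 0
```

Presumably a learned eigenfunction plays a similar role in the neural-network setting, where dense grids of trajectories are infeasible in high dimensions.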
October 17, 2025 at 4:48 PM
I’m having a blast!
🚀 The Champalimaud Research Symposium 2025 (#CRSy25) kicked off today in Lisbon under the theme “#NeuroCybernetics at Scale”, exploring #intelligence and the future of brain–AI research!
October 15, 2025 at 10:19 AM
Virtual live-streaming tickets are 20 euros!
🌍 Over 300 participants from around the globe are joining us in Lisbon for the #CRSy25. ⚪ Can't join in person? Get your virtual ticket to follow the talks on “Neuro-Cybernetics at Scale” 🔎➰ 🤖 through the Live Streaming of the event.

🎟️ symposium.fchampalimaud.science/registration-1
October 13, 2025 at 6:32 AM
I'm very proud of @ayeshavermani.bsky.social!!
#PhDone! Congratulations to our new #Doctor, Ayesha Vermani, from the Neural Dynamics Lab at @champalimaudf.bsky.social.

We wish you all the best for your future, Ayesha 👏
October 13, 2025 at 6:32 AM
Reposted by Memming Park
Our new paper is out in Current Opinion in Behavioral Sciences: a perspective on the DMN, titled "Embodying the default mode network: self-related processing from an embodied perspective"
www.sciencedirect.com/science/arti...
Embodying the default mode network: self-related processing from an embodied perspective
Self-related processes in the default mode network (DMN) have been viewed predominantly through a cognitive lens, often overlooking the embodied dimen…
www.sciencedirect.com
October 8, 2025 at 5:21 AM
Excellent list of speakers! Come join the symposium at the Champalimaud Centre for the Unknown. Register now. #neurocybernetics #computationalNeuroscience
September 23, 2025 at 11:42 AM
I spent less than a day vibe coding. I didn't read or touch any code, yet the outcome is remarkably good. I had never coded in TypeScript (I used ECMA/JavaScript back in the 90s), and I made a usable Chrome browser extension using only natural language. I just submitted it to the Chrome Web Store.
September 15, 2025 at 5:47 PM