John Pearson
jmxpearson.bsky.social
Computational neuroscience, neuroML, natural behavior. I charge more for miracles. PI @ pearsonlab.github.io. @dukemedschool.bsky.social.
Pinned
Okay, long-overdue introduction. I’m a computational neuroscientist at Duke, where my lab (pearsonlab.github.io) does theory “bottom up”: we try to start by modeling data and build toward principles.
Pearson Lab at Duke University
pearsonlab.github.io
Reposted by John Pearson
We are hiring!! Biological Sciences at UNC Charlotte has a broad search for a new assistant professor.
Application review begins 12/15.

Please spread the word! 💚⛏️
#PlantSci 🧪🦠🧬
jobs.charlotte.edu/postings/65141
Assistant Professor
Applicants should possess a Ph.D. in the biological sciences or related fields. Candidates are expected to document expertise in their specialty by a record of postdoctoral training, peer reviewed publ...
jobs.charlotte.edu
November 17, 2025 at 2:52 PM
Reposted by John Pearson
By mapping connections among researchers, Neurotree makes it possible to trace how the field has evolved and to visualize how shifts in lab size, training and other factors can shape its direction, writes founder @stephenvdavid.bsky.social.

bit.ly/4od0SiL

#neuroskyence #StateOfNeuroscience
Tracing neuroscience’s family tree to track its growth
By mapping connections among researchers, Neurotree makes it possible to see how the field has evolved and what factors shape its direction.
www.thetransmitter.org
November 25, 2025 at 2:08 PM
Reposted by John Pearson
Y’all are reading this paper in the wrong way.

We love to trash dominant hypotheses, but we need to look for evidence against the manifold hypothesis elsewhere:

This elegant work doesn't show neural dynamics are high D, nor that we should stop using PCA

It’s quite the opposite!

(thread)
“Our findings challenge the conventional focus on low-dimensional coding subspaces as a sufficient framework for understanding neural computations, demonstrating that dimensions previously considered task-irrelevant and accounting for little variance can have a critical role in driving behavior.”
Neural dynamics outside task-coding dimensions drive decision trajectories through transient amplification
Most behaviors involve neural dynamics in high-dimensional activity spaces. A common approach is to extract dimensions that capture task-related variability, such as those separating stimuli or choice...
www.biorxiv.org
November 25, 2025 at 4:16 PM
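To make the "transient amplification" idea in the abstract concrete, here is a minimal toy sketch (not from the paper): a stable but nonnormal linear system can transiently amplify activity seeded along a dimension that carries little variance, before the dynamics ultimately decay. All parameter values are illustrative assumptions.

```python
import numpy as np

# Toy illustration of transient amplification: both eigenvalues of A are -1
# (the system is stable), but A is nonnormal, so activity started in the
# "feeder" dimension grows transiently before decaying.
A = np.array([[-1.0, 8.0],
              [0.0, -1.0]])

dt, n_steps = 0.01, 500
x = np.array([0.0, 1.0])      # unit activity in the low-variance feeder dimension
norms = []
for _ in range(n_steps):
    x = x + dt * (A @ x)      # Euler integration of dx/dt = A x
    norms.append(np.linalg.norm(x))

print(f"initial norm: 1.00, peak norm: {max(norms):.2f}, final norm: {norms[-1]:.2f}")
```

Analytically, the norm here peaks near t = 1 at roughly 8/e before decaying, even though every eigenvalue of A is strictly stable; this is the sense in which low-variance dimensions can drive large downstream trajectories.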
Reposted by John Pearson
This has made me think that it’s just as important to ask what an ideal actor would do without imposing structure, nonlinearities, or data-driven training on the networks in the first place...this can allow us to interpret empirical observations in a rule-based framework (a useful complementary approach!!)
November 25, 2025 at 4:15 PM
Reposted by John Pearson
I like using ANNs for prediction! But one challenge is that we can engineer them in multiple ways to generate similar data. E.g., M1-like patterns emerge from both sensory and autonomous networks with different optimization rules. They depend on the training method, and it's tricky to infer organizing rules.
November 25, 2025 at 4:12 PM
Reposted by John Pearson
🔥🔥SAVE THE DATE!!🔥🔥

Join us at Yale for the 2026 Computational Psychiatry Conference July 14-16, 2026 www.cpconf.org

@xiaosigu.bsky.social @yiplab.bsky.social @shirleybwang.bsky.social @alpowers7.bsky.social

See you in New Haven!
Computational Psychiatry Conference
New Haven, USA (July 14-16, 2026)
www.cpconf.org
November 25, 2025 at 3:12 PM
Reposted by John Pearson
Using models (like MotorNet) is a super way of generating predictions under hypothesis-driven manipulation of task, training, architecture, neuromuscular properties, etc., that cannot easily be done empirically
November 25, 2025 at 12:18 PM
Reposted by John Pearson
Dynamics is a very cool framework for examining how things unfold in time and characterizing the constraints on that temporal unfolding
November 25, 2025 at 12:15 PM
Reposted by John Pearson
Low-D / high-D has never really struck me as interesting on its own; rather, PCA, dPCA, and related methods can be interesting to test what info is present at what time in a neural population.
November 25, 2025 at 12:15 PM
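A small sketch of why variance captured and task relevance can come apart, which is the tension running through this thread. The data here are entirely hypothetical: most population variance lives in a few shared background modes, while the condition signal sits on one low-variance axis that top-k PCA effectively ignores.

```python
import numpy as np

# Hypothetical toy population (not from any specific paper): 3 shared
# background modes dominate variance; the task signal occupies a single
# low-variance axis (the last neuron) orthogonal to them.
rng = np.random.default_rng(0)
n_trials, n_neurons = 400, 50

W = rng.normal(size=(3, n_neurons))
W[:, -1] = 0.0                                     # background avoids last neuron
X = rng.normal(size=(n_trials, 3)) @ (10.0 * W)    # high-variance background
X += 0.1 * rng.normal(size=(n_trials, n_neurons))  # small private noise

labels = rng.integers(0, 2, size=n_trials)
X[:, -1] += 2.0 * (labels - 0.5)                   # task signal on one axis only

Xc = X - X.mean(axis=0)
_, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_top3 = (S[:3] ** 2).sum() / (S ** 2).sum()

def dprime(v, labels):
    """Discriminability of a 1-D projection between the two conditions."""
    a, b = v[labels == 1], v[labels == 0]
    return abs(a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

pc_scores = Xc @ Vt[:3].T
best_pc_dprime = max(dprime(pc_scores[:, k], labels) for k in range(3))
axis_dprime = dprime(X[:, -1], labels)

print(f"top-3 PCs capture {var_top3:.4f} of variance; "
      f"best PC d' = {best_pc_dprime:.2f} vs. coding-axis d' = {axis_dprime:.1f}")
```

Here the top three PCs explain essentially all the variance yet are nearly useless for decoding the condition, while the tiny-variance coding axis separates conditions cleanly; variance-ranked methods answer "where is the variance," not "where is the information."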
Reposted by John Pearson
Inviting apps for a workshop to develop a project focused on a mechanistic understanding of canonical cortical computations at the circuit level. Deadline is 1/5/2026: https://www.simonsfoundation.org/simons-foundation-now-accepting-applications-for-workshop-on-canonical-cortical-computations
Simons Foundation Now Accepting Applications for Workshop on Canonical Cortical Computations
Simons Foundation Now Accepting Applications for Workshop on Canonical Cortical Computations on Simons Foundation
www.simonsfoundation.org
November 24, 2025 at 6:47 PM
Reposted by John Pearson
Everything about this is crazy. To mate (and avoid being eaten by the female) the male octopus of this species bites and injects tetrodotoxin to paralyze her first. www.sciencedirect.com/science/arti...
November 25, 2025 at 1:09 AM
Reposted by John Pearson
The Sensorimotor Superlab with @gribblelab.org and @andpru.bsky.social is a unique place to work and learn. We are now accepting MSc and PhD applications for Fall 2026. Join our awesome team at Western University... For application instructions see diedrichsenlab.org and gribblelab.org/join.html!
Diedrichsenlab
diedrichsenlab.org
November 24, 2025 at 10:50 PM
Reposted by John Pearson
Congrats to @robertslab.bsky.social, the version of record of this paper is now published in @elife.bsky.social

Nice use of optimal transport methods to measure acoustic distance, if you're into that sort of thing 😉

elifesciences.org/articles/101...

#prattle 💬
#bioacoustics
A deep learning approach for the analysis of birdsong
An open-source deep learning toolkit performs accurate annotation and similarity scoring of zebra finch song, along with comprehensive feature extraction, enabling consistent, interpretable comparison...
elifesciences.org
November 24, 2025 at 6:23 PM
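For readers unfamiliar with optimal transport as a distance between sounds, here is an illustrative sketch (not the paper's actual pipeline): treating two normalized spectral envelopes as distributions over frequency and comparing them with the 1-D Wasserstein (earth mover's) distance. The frequency grid and Gaussian envelopes are made-up placeholders.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Illustrative only: two identical spectral envelopes (hypothetical), one
# shifted 1 kHz higher. The Wasserstein distance reports how much "mass"
# must move, in kHz, to turn one spectrum into the other.
freqs = np.linspace(0.0, 8.0, 256)                  # frequency bins in kHz
spec_a = np.exp(-0.5 * ((freqs - 3.0) / 0.5) ** 2)  # envelope peaked at 3 kHz
spec_b = np.exp(-0.5 * ((freqs - 4.0) / 0.5) ** 2)  # same shape, peaked at 4 kHz

d = wasserstein_distance(freqs, freqs, u_weights=spec_a, v_weights=spec_b)
print(f"acoustic distance: {d:.2f} kHz")
```

Because the two envelopes differ only by a 1 kHz shift, the distance comes out to about 1.0; unlike bin-wise comparisons, it degrades gracefully as spectra slide past each other rather than dropping to zero overlap.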
Reposted by John Pearson
📍Excited to share that our paper was selected as a Spotlight at #NeurIPS2025!

arxiv.org/pdf/2410.03972

It started from a question I kept running into:

When do RNNs trained on the same task converge/diverge in their solutions?
🧵⬇️
November 24, 2025 at 4:43 PM
Reposted by John Pearson
Absolutely! The credit for that term goes to Hasok Chang. His book changed how I think about science. How did we traverse from a sense of hot/cold to thermodynamics across 250 years? We jumped in and refined from there.

academic.oup.com/book/5530?lo...
Inventing Temperature: Measurement and Scientific Progress
Abstract. This book presents the concept of “complementary science” which contributes to scientific knowledge through historical and philosophical investig
academic.oup.com
November 24, 2025 at 4:04 PM
Reposted by John Pearson
In principle I would have hopefully mentioned that it assumes rate coding, that it's only true within a task, asked whether or not the 'theoretical upper bound' was actually feasible, etc. With all those caveats it would be seen as an intriguing puzzle not an established result.
November 24, 2025 at 3:22 PM
Reposted by John Pearson
Great test case. Given my 'extraordinary claims' point, I'd have summarised this as 'our statistical measures only seem to cover a low-D subspace, raising questions about the validity of these measures or suggesting neural dynamics are very different from expected'. Issue is with jump to latter.
November 24, 2025 at 3:22 PM
To Dan’s point: I once had a physics prof point out to me that thermodynamics, which is empirical and not disprovable, is different from statistical mechanics, which is a theory of how thermodynamics arises. A lot of us in neuro would do well to periodically remind ourselves of the distinction.
Great test case. Given my 'extraordinary claims' point, I'd have summarised this as 'our statistical measures only seem to cover a low-D subspace, raising questions about the validity of these measures or suggesting neural dynamics are very different from expected'. Issue is with jump to latter.
November 24, 2025 at 3:46 PM
Reposted by John Pearson
Without having read it, I do think the OP paper takes the right approach. Have a network perform the task of interest and explore it as a means of generating hypotheses about how the brain does it and finding which tools would help you find out. "How does MotorNet do it" comes up a lot in our lab.
November 24, 2025 at 2:44 PM
Reposted by John Pearson
I think Mark Churchland's recent review does a nice job of motivating how we got to where we are now in terms of population-level descriptions with respect to motor cortex. It's not clear to me that this should generalize across the brain and across all behaviors. www.nature.com/articles/s41...
Preparatory activity and the expansive null-space - Nature Reviews Neuroscience
How does motor-cortex activity well before movement not drive motor outputs? In this Review, Churchland and Shenoy detail how searching for answers transitioned the understanding of neural activi...
www.nature.com
November 24, 2025 at 2:03 PM
Reposted by John Pearson
Yeah, I remember hearing about this by @suryaganguli.bsky.social a few years before their preprint, I think from @sasolla.bsky.social

And I think the picture we have now that (at least for embedding) dimensionality = f ( task "complexity", brain area )
November 24, 2025 at 1:20 PM
Reposted by John Pearson
I would say sensory versus "everything else" (cognitive flexibility, memory, emotion, motor). Great Q - how do all these connect? Likewise, where do @sueyeonchung.bsky.social's manifolds and @stefanofusi.bsky.social's abstractions fit in?

One key difference: static versus dynamic descriptions.
November 24, 2025 at 1:08 PM
Reposted by John Pearson
I might be misremembering, but I think the ideas were known a few years before the 2017 preprint because of their 2014 COSYNE presentation.
ganguli-gang.stanford.edu/pdf/14.Cosyn...
ganguli-gang.stanford.edu
November 24, 2025 at 1:12 PM