Jenelle Feather
@jfeather.bsky.social
1.2K followers 510 following 51 posts
Flatiron Research Fellow #FlatironCCN. PhD from #mitbrainandcog. Incoming Asst Prof #CarnegieMellon in Fall 2025. I study how humans and computers hear and see.
Reposted by Jenelle Feather
cmuscience.bsky.social
Elizabeth Lee, a first-year Ph.D. student in Neural Computation, has been awarded CMU’s 2025 Sutherland-Merlino Fellowship. Her work bridges neuroscience and machine learning, and she’s passionate about advancing STEM access for underrepresented groups.
www.cmu.edu/mcs/news-eve...
Reposted by Jenelle Feather
dataonbrainmind.bsky.social
📢 10 days left to submit to the Data on the Brain & Mind Workshop at #NeurIPS2025!

📝 Call for:
• Findings (4 or 8 pages)
• Tutorials

If you’re submitting to ICLR or NeurIPS, consider submitting here too—and highlight how to use a cog neuro dataset in our tutorial track!
🔗 data-brain-mind.github.io
jfeather.bsky.social
So excited for CCN2026!!! 🧠🤔🤖🗽
neurograce.bsky.social
The rumors are true! #CCN2026 will be held at NYU. @toddgureckis.bsky.social and I will be executive-chairing. Get in touch if you want to be involved!
Reposted by Jenelle Feather
rdgao.bsky.social
arguably the most important component of AI for neuroscience:

data, and its usability
dataonbrainmind.bsky.social
🚨 Excited to announce our #NeurIPS2025 Workshop: Data on the Brain & Mind

📣 Call for: Findings (4- or 8-page) + Tutorials tracks

🎙️ Speakers include @dyamins.bsky.social @lauragwilliams.bsky.social @cpehlevan.bsky.social

🌐 Learn more: data-brain-mind.github.io
jfeather.bsky.social
Join us at #NeurIPS2025 for our Data on the Brain & Mind workshop! We aim to connect machine learning researchers and neuroscientists/cognitive scientists, with a focus on emerging datasets.

More info: data-brain-mind.github.io
dataonbrainmind.bsky.social
🚨 Excited to announce our #NeurIPS2025 Workshop: Data on the Brain & Mind

📣 Call for: Findings (4- or 8-page) + Tutorials tracks

🎙️ Speakers include @dyamins.bsky.social @lauragwilliams.bsky.social @cpehlevan.bsky.social

🌐 Learn more: data-brain-mind.github.io
Reposted by Jenelle Feather
hadivafaii.bsky.social
Announcing the new "Sensorimotor AI" Journal Club — please share/repost!

w/ Kaylene Stocking, Tommaso Salvatori, and @elisennesh.bsky.social

Sign up link: forms.gle/o5DXD4WMdhTg...

More details below 🧵[1/5]

🧠🤖🧠📈
jfeather.bsky.social
Topics include but are not limited to:
• Optimal and adaptive stimulus selection for fitting, developing, testing or validating models
• Stimulus ensembles for model comparison
• Methods to generate stimuli with “naturalistic” properties
• Experimental paradigms and results using model-optimized stimuli
jfeather.bsky.social
Consider submitting your recent work on stimulus synthesis and selection to our special issue at JOV!
eerosim.bsky.social
Submissions now accepted for a special issue of the Journal of Vision:

Choose your stimuli wisely: Advances in stimulus synthesis and selection

Submission deadline: Dec 12, 2025
Further details: jov.arvojournals.org/ss/synthetic...
Reposted by Jenelle Feather
florentinguth.bsky.social
What is the probability of an image? What do the highest and lowest probability images look like? Do natural images lie on a low-dimensional manifold?
In a new preprint with Zahra Kadkhodaie and @eerosim.bsky.social, we develop a novel energy-based model in order to answer these questions: 🧵
Reposted by Jenelle Feather
irisgroen.bsky.social
Just a few months until Cognitive Computational Neuroscience comes to Amsterdam! Check out our now-complete schedule for #CCN2025, with descriptions of each of the Generative Adversarial Collaborations (GACs), Keynotes-and-Tutorials (K&Ts), Community Events, Keynote Speakers, and social activities!
cogcompneuro.bsky.social
A detailed schedule for CCN2025 is available on our website now, including the specific keynotes / GACs / community events taking place:
2025.ccneuro.org/schedule-of-...

The early bird registration deadline is coming up this Friday (23rd May)!
Reposted by Jenelle Feather
lisik.bsky.social
I’m happy to be at #VSS2025 and share what our lab has been up to this year!

I’m also honored to receive this year’s young investigator award and will give a short talk at the awards ceremony Monday
jfeather.bsky.social
The symposium also serves to kick off a special issue of JOV!

"Choose your stimuli wisely: Advances in stimulus synthesis and selection"
jov.arvojournals.org/ss/synthetic...
Paper Deadline: Dec 12th

For those not able to attend tomorrow, I will strive to post some of the highlights here 👀 👀 👀
jfeather.bsky.social
Super excited for our #VSS2025 symposium tomorrow, "Model-optimized stimuli: more than just pretty pictures".
Join us to talk about designing and using synthetic stimuli for testing properties of visual perception!

May 16th @ 1-3PM in Talk Room #2

More info: www.visionsciences.org/symposia/?sy...
jfeather.bsky.social
This is joint work with fantastic co-authors from @flatironinstitute.org Center for Computational Neuroscience: @lipshutz.bsky.social (co-first) @sarah-harvey.bsky.social @itsneuronal.bsky.social @eerosim.bsky.social
jfeather.bsky.social
These examples demonstrate how our framework can be used to probe for informative differences in local sensitivities between complex models, and suggest how it could be used to compare model representations with human perception.
jfeather.bsky.social
In a second example, we apply our method to a set of deep neural network models and reveal differences in the local geometry that arise due to architecture and training types, illustrating the method's potential for revealing interpretable differences between computational models.
jfeather.bsky.social
As an example, we use this framework to compare a set of simple models of the early visual system, identifying a novel set of image distortions that allow immediate comparison of the models by visual inspection.
jfeather.bsky.social
This provides an efficient method to generate stimulus distortions that discriminate image representations. These distortions can be used to test which model is closest to human perception.
jfeather.bsky.social
We then extend this work to show that the metric can be used to optimally differentiate a set of *many* models by finding a pair of “principal distortions” that maximize the variance of the models' sensitivities under this metric.
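A hedged sketch of that idea, not the paper's algorithm: with trace-normalized toy FIMs and per-model sensitivities s_m(e) = eᵀ F_m e, one crude way to look for a high-variance distortion is projected gradient ascent on the variance of s_m over unit-norm e. The post describes a pair of principal distortions; this toy searches for a single direction, and all models and dimensions below are hypothetical.

```python
# Hedged sketch, NOT the paper's algorithm: search for a single unit-norm distortion e
# that maximizes the variance across models of the sensitivities s_m(e) = e^T F_m e,
# using trace-normalized toy FIMs and projected gradient ascent on the unit sphere.
import numpy as np

rng = np.random.default_rng(0)
dim, n_models = 48, 5

def toy_fim(seed):
    """Hypothetical FIM = J^T J for a model with a random Jacobian at the base image."""
    J = np.random.default_rng(seed).standard_normal((96, dim))
    F = J.T @ J
    return F / np.trace(F)          # normalize so models live on a comparable scale

fims = [toy_fim(s) for s in range(1, n_models + 1)]

e = rng.standard_normal(dim)
e /= np.linalg.norm(e)
step = 0.1
for _ in range(300):
    s = np.array([e @ F @ e for F in fims])                   # per-model sensitivities
    grad = (4.0 / n_models) * sum((si - s.mean()) * (F @ e)   # d Var_m[s_m] / d e
                                  for si, F in zip(s, fims))
    e = e + step * grad / (np.linalg.norm(grad) + 1e-12)      # normalized ascent step
    e /= np.linalg.norm(e)                                    # project back to unit norm

print("variance of sensitivities:", np.var([e @ F @ e for F in fims]))
```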
jfeather.bsky.social
We use the FIM to define a metric on the local geometry of an image representation near a base image. This metric can be related to previous work investigating the sensitivities of one or two models.
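For the one- or two-model case mentioned here, a sketch under the same toy assumptions (random FIMs standing in for real models, not the paper's code): the extreme generalized eigenvectors of the pair (F_A, F_B) extremize the sensitivity ratio eᵀ F_A e / eᵀ F_B e, giving distortions one model is far more sensitive to than the other, in the spirit of earlier eigendistortion-style comparisons.

```python
# Hedged sketch with toy FIMs, not the paper's code: for two models, the extreme
# generalized eigenvectors of (F_A, F_B) extremize the sensitivity ratio
# e^T F_A e / e^T F_B e, i.e. distortions one model "cares about" far more than the other.
import numpy as np
from scipy.linalg import eigh

dim = 48

def toy_fim(seed):
    """Hypothetical FIM = J^T J for a model with a random Jacobian at the base image."""
    J = np.random.default_rng(seed).standard_normal((64, dim))
    return J.T @ J

F_A, F_B = toy_fim(2), toy_fim(3)
ridge = 1e-6 * np.eye(dim)              # small ridge keeps F_B positive definite

vals, vecs = eigh(F_A, F_B + ridge)     # generalized symmetric eigenproblem
e_rel_A = vecs[:, -1]                   # distortion A is most sensitive to, relative to B
e_rel_B = vecs[:, 0]                    # distortion B is most sensitive to, relative to A
print("sensitivity ratios:", vals[-1], vals[0])
```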
jfeather.bsky.social
We propose a framework for comparing a set of image representations in terms of their local geometries. We quantify the local geometry of a representation using the Fisher information matrix (FIM), a standard statistical tool for characterizing the sensitivity to local stimulus distortions.
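A minimal sketch of the FIM quantity itself, not code from the paper: assuming additive Gaussian observation noise on the model output, the FIM of a representation f at a base image x reduces to J(x)ᵀJ(x), with J the Jacobian of f, and the quadratic form eᵀ FIM e scores the model's sensitivity to a small distortion e. The toy model and finite-difference Jacobian below are purely illustrative.

```python
# Minimal sketch, not code from the paper: under a Gaussian-noise assumption,
# the FIM of a representation f at a base image x is J(x)^T J(x), with J the Jacobian
# of f; the quadratic form e^T FIM e scores sensitivity to a small distortion e.
import numpy as np

def jacobian(f, x, eps=1e-4):
    """Finite-difference Jacobian of f at a flattened image x."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

rng = np.random.default_rng(0)
W = rng.standard_normal((32, 64))       # hypothetical toy model: filter bank + nonlinearity
model = lambda x: np.tanh(W @ x)

x0 = rng.standard_normal(64)            # base image, flattened
J = jacobian(model, x0)
fim = J.T @ J                           # Fisher information matrix at x0

e = rng.standard_normal(64)
e /= np.linalg.norm(e)
print("sensitivity to distortion e:", e @ fim @ e)
```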