Avery HW Ryoo
@averyryoo.bsky.social
1.1K followers 300 following 49 posts
i like generative models, science, and Toronto sports teams
phd @ mila/udem, prev. @ uwaterloo
averyryoo.github.io 🇨🇦🇰🇷
Pinned
averyryoo.bsky.social
New preprint! 🧠🤖

How do we build neural decoders that are:
⚡️ fast enough for real-time use
🎯 accurate across diverse tasks
🌍 generalizable to new sessions, subjects, and even species?

We present POSSM, a hybrid SSM architecture that optimizes for all three of these axes!

🧵1/7
Reposted by Avery HW Ryoo
charlottevolk.bsky.social
🚨 New preprint alert!

🧠🤖
We propose a theory of how learning curriculum affects generalization through neural population dimensionality. Learning curriculum is a determining factor of neural dimensionality - where you start from determines where you end up.
🧠📈

A 🧵:

tinyurl.com/yr8tawj3
The curriculum effect in visual learning: the role of readout dimensionality
Generalization of visual perceptual learning (VPL) to unseen conditions varies across tasks. Previous work suggests that training curriculum may be integral to generalization, yet a theoretical explan...
Reposted by Avery HW Ryoo
nandahkrishna.bsky.social
Excited to share that POSSM has been accepted to #NeurIPS2025! See you in San Diego 🏖️
averyryoo.bsky.social
New preprint! 🧠🤖

How do we build neural decoders that are:
⚡️ fast enough for real-time use
🎯 accurate across diverse tasks
🌍 generalizable to new sessions, subjects, and even species?

We present POSSM, a hybrid SSM architecture that optimizes for all three of these axes!

🧵1/7
Reposted by Avery HW Ryoo
davidevaleriani.bsky.social
I'm very excited to announce the publication of our new book Neural Interfaces, published by Elsevier. The book is a comprehensive resource for everyone interested in neural interfaces and brain-computer interfaces (BCIs).

shop.elsevier.com/books/neural...
Neural Interfaces
Neural Interfaces is a comprehensive book on the foundations, major breakthroughs, and most promising future developments of neural interfaces.
Reposted by Avery HW Ryoo
mirandrom.bsky.social
Step 1: Understand how scaling improves LLMs.
Step 2: Directly target underlying mechanism.
Step 3: Improve LLMs independent of scale. Profit.

In our ACL 2025 paper we look at Step 1 in terms of training dynamics.

Project: mirandrom.github.io/zsl
Paper: arxiv.org/pdf/2506.05447
Reposted by Avery HW Ryoo
majhas.bsky.social
(1/n) 🚨 Train a model that solves DFT for any geometry with almost no training data
Introducing Self-Refining Training for Amortized DFT: a variational method that predicts ground-state solutions across geometries and generates its own training data!
📜 arxiv.org/abs/2506.01225
💻 github.com/majhas/self-...
Reposted by Avery HW Ryoo
dvnxmvlhdf5.bsky.social
Preprint Alert 🚀

Multi-agent reinforcement learning (MARL) often assumes that agents know when other agents cooperate with them. But for humans, this isn't always the case. For example, Plains Indigenous groups used to leave resources for others to use at effigies called Manitokan.
1/8
Manitokan are images set up where one can bring a gift or receive a gift. 1930s Rocky Boy Reservation, Montana, Montana State University photograph. Colourized with AI
averyryoo.bsky.social
Finally, we show POSSM's performance on speech decoding - a long-context task that can quickly become expensive for Transformers. In the unidirectional setting, POSSM beats the GRU baseline, achieving a phoneme error rate (PER) of 27.3 while being more robust to variation in preprocessing.

🧵6/7
averyryoo.bsky.social
Cross-species transfer! 🐵➡️🧑

Excitingly, we find that POSSM pretrained solely on monkey reaching data achieves SOTA performance when decoding imagined handwriting in human subjects! This shows the potential of leveraging non-human primate (NHP) data to bootstrap human BCI decoding in low-data clinical settings.

🧵5/7
averyryoo.bsky.social
By pretraining on 140 monkey reaching sessions, POSSM effectively transfers to new subjects and tasks, matching or outperforming several baselines (e.g., GRU, POYO, Mamba) across sessions.

✅ High R² across the board
✅ 9× faster inference than Transformers
✅ <5ms latency per prediction

🧵4/7
averyryoo.bsky.social
POSSM combines the real-time inference of an RNN with the tokenization, pretraining, and finetuning abilities of a Transformer!

Using POYO-style tokenization, we encode spikes in 50ms windows and stream them to a recurrent model (e.g., Mamba, GRU) for fast, frequent predictions over time.

🧵3/7
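For intuition, here is a minimal sketch of that pipeline. It assumes a POYO-style tokenizer that embeds each spike by unit identity and time-within-window; the GRU core, mean-pooling of the window, and all dimensions are illustrative stand-ins, not the actual POSSM implementation.

```python
import torch
import torch.nn as nn

class StreamingSpikeDecoder(nn.Module):
    """Illustrative hybrid decoder: tokenize the spikes in each 50 ms window,
    pool them into one feature vector, and update a recurrent state so every
    window yields a prediction in constant time."""

    def __init__(self, n_units, d_model=128, d_out=2):
        super().__init__()
        self.unit_embed = nn.Embedding(n_units, d_model)  # per-unit identity embedding
        self.time_proj = nn.Linear(1, d_model)            # spike-time-within-window feature
        self.rnn = nn.GRUCell(d_model, d_model)           # recurrent core (e.g. GRU or an SSM)
        self.readout = nn.Linear(d_model, d_out)          # e.g. 2-D cursor velocity

    def init_state(self, batch_size=1):
        return torch.zeros(batch_size, self.rnn.hidden_size)

    def step(self, unit_ids, spike_times, state):
        # unit_ids: (n_spikes,) int unit indices; spike_times: (n_spikes,) seconds within the window
        tokens = self.unit_embed(unit_ids) + self.time_proj(spike_times.unsqueeze(-1))
        window_feat = tokens.mean(dim=0, keepdim=True)    # crude pooling stand-in for cross-attention
        state = self.rnn(window_feat, state)
        return self.readout(state), state

# usage: stream 50 ms windows of spikes and decode each one as it arrives
decoder = StreamingSpikeDecoder(n_units=96)
state = decoder.init_state()
unit_ids = torch.tensor([3, 17, 42])                # hypothetical spikes in one window
spike_times = torch.tensor([0.004, 0.021, 0.047])
pred, state = decoder.step(unit_ids, spike_times, state)
```

Because the state is fixed-size, each new 50ms window costs the same amount of compute no matter how long the session has been running.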
averyryoo.bsky.social
The problem with existing decoders?

😔 RNNs offer efficient, causal inference, but rely on rigid, binned input formats - limiting generalization to new neurons or sessions.

😔 Transformers enable generalization via tokenization, but have high computational costs due to the attention mechanism.

🧵2/7
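To make the cost argument concrete: a causal Transformer's newest token attends over its entire history, so per-session work grows quadratically with the number of windows, while a recurrent or SSM decoder only updates a fixed-size state. The operation counts below are a simplified back-of-the-envelope sketch with hypothetical dimensions, not measurements from the paper.

```python
# Rough per-session cost comparison (illustrative assumptions, not a benchmark):
# a causal Transformer's new token attends to all t previous tokens (~t*d multiply-adds),
# while a recurrent/SSM decoder does a fixed-size state update (~d*d) per token.

def attention_ops(T, d=128):
    # total attention-score work over a session of T tokens
    return sum(t * d for t in range(1, T + 1))   # ~ d * T^2 / 2

def recurrent_ops(T, d=128):
    # one dense state update per token
    return T * d * d                             # ~ d^2 * T

for T in (100, 1_000, 10_000):  # e.g. number of 50 ms windows in a session
    print(f"T={T:>6}: attention ~{attention_ops(T):.2e}, recurrent ~{recurrent_ops(T):.2e}")
```

At short horizons the two are comparable, but the quadratic term is what makes long sessions (and long-context settings like speech decoding) expensive for attention.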
averyryoo.bsky.social
New preprint! 🧠🤖

How do we build neural decoders that are:
⚡️ fast enough for real-time use
🎯 accurate across diverse tasks
🌍 generalizable to new sessions, subjects, and even species?

We present POSSM, a hybrid SSM architecture that optimizes for all three of these axes!

🧵1/7
Reposted by Avery HW Ryoo
btolooshams.bsky.social
I am joining @ualberta.bsky.social as a faculty member and @amiithinks.bsky.social!

My research group is recruiting MSc and PhD students at the University of Alberta in Canada. Research topics include generative modeling, representation learning, interpretability, inverse problems, and neuroAI.
Reposted by Avery HW Ryoo
mehdiazabou.bsky.social
Scaling models across multiple animals was a major step toward building neuro-foundation models; the next frontier is enabling multi-task decoding to expand the scope of training data we can leverage.

Excited to share our #ICLR2025 Spotlight paper introducing POYO+ 🧠

poyo-plus.github.io

🧵
POYO+
POYO+: Multi-session, multi-task neural decoding from distinct cell-types and brain regions
Reposted by Avery HW Ryoo
tyrellturing.bsky.social
Interested in foundation models for #neuroscience? Want to contribute to the development of the next generation of multi-modal models? Come join us at IVADO in Montreal!

We're hiring a full-time machine learning specialist for this work.

Please share widely!

#NeuroAI 🧠📈 🧪
ivado.bsky.social
🔍 [Job Offer] #MachineLearning Specialist.

Join the IVADO Research Regroupement - AI and Neuroscience (R1) to develop foundation models in the field of neuroscience.

More info: ivado.ca/2025/04/08/s...

#JobOffer #AI #Neuroscience #Research #MachineLearning
Machine Learning Specialist | IVADO
Reposted by Avery HW Ryoo
satpreetsingh.bsky.social
📽️ Recordings from our @cosynemeeting.bsky.social #COSYNE2025 workshop on "Agent-Based Models in Neuroscience: Complex Planning, Embodiment, and Beyond" are now online: neuro-agent-models.github.io
🧠🤖
Reposted by Avery HW Ryoo
nandahkrishna.bsky.social
Talk recordings from our COSYNE Workshop on Neuro-foundation Models 🌐🧠 are now up on the workshop website!

neurofm-workshop.github.io
averyryoo.bsky.social
Very late, but had a 🔥 time at my first Cosyne presenting my work with @nandahkrishna.bsky.social, Ximeng Mao, @mattperich.bsky.social, and @glajoie.bsky.social on real-time neural decoding with hybrid SSMs. Keep an eye out for a preprint (hopefully) soon 👀

#Cosyne2025 @cosynemeeting.bsky.social
Reposted by Avery HW Ryoo
charlottevolk.bsky.social
Excited to be at #Cosyne2025 for the first time! I'll be presenting my poster [2-104] during the Friday session. E-poster here: www.world-wide.org/cosyne-25/se...
Reposted by Avery HW Ryoo
shahabbakht.bsky.social
We'll be presenting two projects at #Cosyne2025, representing two main research directions in our lab:

🧠🤖 🧠📈

1/3
averyryoo.bsky.social
@oliviercodol.bsky.social my opportunity to lose to scientists in a different field
averyryoo.bsky.social
Just a couple days until Cosyne - stop by [3-083] this Saturday and say hi! @nandahkrishna.bsky.social
Reposted by Avery HW Ryoo
tyrellturing.bsky.social
This will be a more difficult Cosyne than normal, due to both the travel restrictions for people coming from the US and the strike that may be happening at the hotel in Montreal.

But, we can still make this an awesome meeting as usual, y'all. Let's pull together and make it happen!

🧠📈
#Cosyne2025