Mozes Jacobs
@mozesjacobs.bsky.social
48 followers 7 following 14 posts
PhD student at the Kempner Institute at Harvard University. Interested in Computer Vision and Theoretical Neuroscience. Advised by Demba Ba.
Pinned
mozesjacobs.bsky.social
Traveling waves of neural activity are observed all over the brain. Can they be used to augment neural networks?

I am thrilled to share our new work, "Traveling Waves Integrate Spatial Information Through Time" with @andykeller.bsky.social!

1/13
Reposted by Mozes Jacobs
mozesjacobs.bsky.social
A massive thank you to all those involved in this work: Lyle Muller, Roberto Budzinski, and Demba Ba!
Reposted by Mozes Jacobs
andykeller.bsky.social
In the physical world, almost all information is transmitted through traveling waves -- why should it be any different in your neural network?

Super excited to share recent work with the brilliant @mozesjacobs.bsky.social: "Traveling Waves Integrate Spatial Information Through Time"

1/14
mozesjacobs.bsky.social
Here are some examples of the wave dynamics used to segment Multi-MNIST images:

11/13
mozesjacobs.bsky.social
We also compared our model to U-Nets, which have global receptive fields via skip connections and bottlenecks.

Incredibly, on Multi-MNIST, wave-based models outperformed similarly sized U-Nets, despite having fewer parameters and only local connectivity.

10/13
mozesjacobs.bsky.social
Notably, CNNs with small receptive fields (few layers) are unable to segment these images, while deeper models - with large receptive fields - can sometimes solve the task, but are generally less stable, yielding lower average performance and significantly higher variance.

9/13
mozesjacobs.bsky.social
We see that richer linear transformations of the hidden-state timeseries extract the global information best, with a learned linear transformation performing strongest (better than the Fourier transform or the common technique of using only the last RNN hidden state).

8/13
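The three readout styles compared above can be sketched as follows. This is an illustrative sketch only: the shapes, the random stand-in hidden states, and the feature dimension `D` are assumptions, and in the actual work the learned map is trained end to end rather than left random.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H = 64, 8                                # timesteps, hidden units
h = rng.standard_normal((T, H))             # stand-in hidden-state timeseries

# 1) last hidden state: the common RNN readout, sees only the final step
last_state = h[-1]                          # shape (H,)

# 2) Fourier transform: a fixed linear map exposing each unit's
#    oscillation frequencies over the whole timeseries
spectrum = np.abs(np.fft.rfft(h, axis=0))   # shape (T // 2 + 1, H)

# 3) learned linear transformation of the full timeseries (weights here
#    are random placeholders; in practice they would be trained)
D = 16                                      # assumed readout dimension
W = rng.standard_normal((D, T * H)) / np.sqrt(T * H)
learned = W @ h.reshape(-1)                 # shape (D,)

print(last_state.shape, spectrum.shape, learned.shape)
```

Unlike the last-state readout, both Fourier and learned readouts see the entire trajectory, which is what lets them pick up information carried by wave dynamics over time.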
mozesjacobs.bsky.social
We then studied both our wave-biased model and a standard ConvLSTM (with no wave-based inductive bias). Incredibly, we found that both models learned to generate waves. The ConvLSTM’s emergent waves (shown below on a Tetrominoes image) suggest a degree of optimality for a wave-based solution.

7/13
mozesjacobs.bsky.social
To test this, we built a trainable RNN (the Neural Wave Machine/NWM) that generates traveling waves in its hidden states. We began by testing it on segmenting simple polygons.

We find that wave-based models produce unique dynamics for each shape, resulting in distinct Fourier spectra.

6/13
mozesjacobs.bsky.social
We found that we could actually predict the area of the drums analytically by looking at the frequency of oscillations of each neuron (see below).

This finding led us to wonder: can we actually learn (via trainable parameters) dynamics for more complex shapes?

5/13
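For intuition on how area can fall out of oscillation frequency analytically: for the fundamental mode of a square membrane the dispersion relation can be inverted in closed form. A minimal sketch, assuming a square drum of side L and a known wave speed c (the paper's setting and derivation may differ):

```python
import numpy as np

def area_from_fundamental(f1, c=1.0):
    """Invert the square-membrane dispersion relation.
    Fundamental mode (m, n) = (1, 1) of an L x L membrane:
        f_11 = (c / 2) * sqrt((1/L)**2 + (1/L)**2) = c / (sqrt(2) * L)
    so L = c / (sqrt(2) * f1) and area = L**2."""
    L = c / (np.sqrt(2) * f1)
    return L ** 2

# round trip: a 10 x 10 drum with unit wave speed
L = 10.0
f1 = 1.0 / (np.sqrt(2) * L)        # its fundamental frequency
area = area_from_fundamental(f1)   # recovers ~100.0 (up to float rounding)
print(area)
```

Reading a frequency off a neuron's spectrum and plugging it into such a relation is the sense in which the area is predictable "analytically" from the dynamics.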
mozesjacobs.bsky.social
The problem "Can One Hear the Shape of a Drum?", posed by Mark Kac, is a classic example of spatial integration. Strike a drumhead, and its vibrations encode the shape of its boundary.

We can see (with fixed RNNs that simulate drums) that different sized drumheads have different dynamics:

4/13
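The fixed "RNNs that simulate drums" idea can be sketched by integrating the 2D wave equation on grids of different sizes and reading out one unit's timeseries. This is a minimal illustrative simulation, not the authors' code; grid sizes, the Gaussian strike, and the recording site are all assumptions.

```python
import numpy as np

def simulate_drum(n, steps=2048, c=0.5, dt=1.0):
    """Leapfrog integration of the 2D wave equation on an n x n grid
    with clamped (zero) boundaries -- a discrete 'drumhead'."""
    yy, xx = np.mgrid[0:n, 0:n]
    # strike: a smooth bump in the center, so low modes dominate
    u = np.exp(-((xx - n / 2) ** 2 + (yy - n / 2) ** 2) / (2 * (n / 6) ** 2))
    u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0
    u_prev = u.copy()                       # zero initial velocity
    trace = []
    for _ in range(steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
               + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
        u_next = 2 * u - u_prev + (c * dt) ** 2 * lap
        u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0
        u_prev, u = u, u_next
        trace.append(u[n // 4, n // 4])     # record one 'neuron'
    return np.array(trace)

def dominant_freq(trace):
    spec = np.abs(np.fft.rfft(trace - trace.mean()))
    return np.fft.rfftfreq(len(trace))[spec.argmax()]

f_small = dominant_freq(simulate_drum(16))
f_large = dominant_freq(simulate_drum(32))
print(f_small, f_large)   # the smaller drumhead rings at a higher frequency
```

The dominant frequency at a single site already separates the two drum sizes, which is the sense in which a local unit's dynamics carry global (boundary) information.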
mozesjacobs.bsky.social
Spatial integration means that a neuron at one location can access signals from distant points. This could mean linking information together across an image to classify objects or linking words together in a sentence to derive meaning.

3/13
mozesjacobs.bsky.social
The act of vision is a coordinated activity involving millions of neurons in the visual cortex. How is information shared over these large distances?

Evidence suggests traveling waves could carry this information across space, allowing neurons to “know” what’s happening far away.

2/13
Reposted by Mozes Jacobs
kempnerinstitute.bsky.social
New research shows neurons learn to encode and transmit information to other spatially distant neurons through traveling waves. Read more in the #KempnerInstitute’s blog: bit.ly/3DrIPEq