Maarten C. Ottenhoff
@mottenhoff.bsky.social
64 followers 190 following 22 posts
Postdoctoral researcher | Motor decoding | Intracranial Brain-computer interfaces | @BrainGate
mottenhoff.bsky.social
Just learned I was nominated for this year's #BCIAward! 🏆
We created a bimanual iBCI that enabled simultaneous neural control of two cursors on the first day of use.

www.bci-award.com/Home
BCI Award: Submit now!
The Annual BCI Award, endowed with 3,000 USD, is one of the top accolades in BCI research.
www.bci-award.com
mottenhoff.bsky.social
Check out our latest speech work!
mverwoert.bsky.social
Can we move beyond the motor cortex for speech neuroprosthetics? We've explored just this question in our paper published today in Cell Reports!🧠
Check it out! 👉https://sciencedirect.com/science/article/pii/S2211124725010125
Moving beyond the motor cortex: A brain-wide evaluation of target locations for intracranial speech neuroprostheses
Speech brain-computer interfaces (BCIs) offer a solution for those affected by speech impairments by decoding brain activity into speech. Current neur…
sciencedirect.com
Reposted by Maarten C. Ottenhoff
ki.se
Applications are now open! We are recruiting 20 Assistant Professors in a wide range of subject areas. We're looking for early-career researchers with strong scientific merits and future potential.
🔗 All positions: ki.se/en/about-ki/...
Reposted by Maarten C. Ottenhoff
mushtaqbilalphd.bsky.social
Elsevier's profit margin compared to Apple, Google, and Microsoft

Apple: 28%
Google: 25%
Microsoft: 34%

Elsevier: 37% with a revenue of $3.9 billion.

Elsevier's payment to academic authors and reviewers: $0
mottenhoff.bsky.social
In any case, I have officially awarded myself with best poster🏆
mottenhoff.bsky.social
Can we stop handing out useless awards like poster awards? When was the last time you heard any criteria for the award, or an explanation of why a poster won? Let's stop pretending an arbitrary decision is a measure of excellence.
Reposted by Maarten C. Ottenhoff
cherff.bsky.social
Maxime Verwoert's paper on different representations of speech production in distributed intracranial recordings of neural activity is now published:
www.nature.com/articles/s42...
Reposted by Maarten C. Ottenhoff
andreashorn.org
I am hiring: PhD students, postdocs, admin personnel. Email me if you know somebody that may be interested in joining us in Cologne! 👇👇

Please RT for reach! 🙏🙏
andreashorn.org
I am excited to announce that I will be founding an institute for network stimulation at the University Hospital Cologne – already this coming May!
mottenhoff.bsky.social
Thanks for your kind response. Do you think this brain-wide activation could be related to the action-mode network? The AMN seems to be a global state for planning goal-directed behavior.
mottenhoff.bsky.social
We'd love to hear your thoughts!
mottenhoff.bsky.social
Glad you like it! Note that this work is from my PhD, which I did at neuralinterfacinglab.github.io . I am currently a postdoc at BrainGate, working on new projects on movement decoding using microelectrode arrays in the motor cortex!
Neural Interfacing Lab at Maastricht University
Welcome to the Neural Interfacing Lab in the Department for Neurosurgery at Maastricht Univers...
neuralinterfacinglab.github.io
mottenhoff.bsky.social
And finally, science is a team effort. We are incredibly thankful to all collaborators for their input and expertise, to the invaluable staff at Kempenhaeghe, and to our participants for their time and effort.
mottenhoff.bsky.social
We are excited about this work and hope you enjoy reading it. As we always do in our lab, we share code and data publicly. Please find the code here: github.com/mottenhoff/d.... We'll make the data available at publication.
GitHub - mottenhoff/decoding-continuous-goal-directed-movements
Contribute to mottenhoff/decoding-continuous-goal-directed-movements development by creating an account on GitHub.
github.com
mottenhoff.bsky.social
Moreover, our results suggest that goal-directed movement is represented in the brain relative to that goal. Current brain-computer interfaces already use this goal-directed reference frame to calibrate decoders; however, it requires knowledge of the goal's location.
mottenhoff.bsky.social
Our results provide a comprehensive overview of the decodable movement-related information in brain-wide neural activity, and strengthen our previous results that the whole brain may be involved in generating movement.
mottenhoff.bsky.social
In this work, the decoder reached its highest performance using low-frequency activity. The phase information is important: performance was substantially lower when we used only the power. This neural signal shares similarities with the local motor potential described in M1.
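A minimal sketch of the distinction between these two feature types: a low-pass filtered signal keeps the phase (sign) of the slow oscillation, while its amplitude envelope discards it. The 1 kHz sampling rate and function names are assumptions for illustration, not from the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

FS = 1000  # Hz; assumed sampling rate, not stated in the thread

def low_freq_activity(x, fs=FS):
    """Low-pass filtered signal (< 5 Hz): keeps amplitude AND phase."""
    sos = butter(4, 5, btype="lowpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=0)

def low_freq_power(x, fs=FS):
    """Amplitude envelope of the same band: phase is discarded."""
    return np.abs(hilbert(low_freq_activity(x, fs), axis=0))
```

Feeding `low_freq_activity` to the decoder preserves when the slow potential swings up or down; `low_freq_power` only says how large the swing is.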
mottenhoff.bsky.social
Next, the decoding of the directional kinematics: it turns out that if we change the reference frame from the sensor to a goal-directed reference frame, the decoder can decode the position w.r.t. the target above chance!
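The reference-frame change amounts to re-expressing hand position relative to the current target instead of the fixed sensor origin. A minimal sketch (the function name is illustrative; the actual implementation is not shown in the thread):

```python
import numpy as np

def to_goal_frame(hand_pos, target_pos):
    """Re-express hand position relative to the current target.

    hand_pos, target_pos: (n_samples, 3) arrays in the sensor
    (room-fixed) frame. The returned positions have their origin
    at the goal, so identical reaches toward different targets
    look alike to the decoder.
    """
    return hand_pos - target_pos
```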
mottenhoff.bsky.social
If we look at the electrode contact locations that were significantly correlated with hand movement speed, we observe that these contacts are located throughout the brain, including cortical and subcortical structures.
mottenhoff.bsky.social
We found that non-directional hand movement speed can be decoded using low-, mid-, and high-frequency information. In particular, low-frequency activity reached substantial decoding performance (up to 0.76 ± 0.03). Directional hand kinematics did not seem decodable.
mottenhoff.bsky.social
We extracted low-frequency activity (< 5 Hz), mid-frequency power (8–30 Hz), and high-frequency power (55–200 Hz), windowed the data (300 ms windows, 50 ms shift), and used preferential subspace identification (PSID) to decode 12 kinematic variables.
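The feature-extraction and windowing steps above can be sketched as follows. This is a rough illustration, not the paper's code: the 1 kHz sampling rate and all function names are assumptions, and the PSID fitting step is omitted.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 1000  # Hz; assumed sampling rate, not stated in the thread

def band(x, lo, hi, fs=FS):
    """Zero-phase band-pass filter along the time axis."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=0)

def extract_features(x, fs=FS):
    """x: (n_samples, n_channels) -> three feature streams."""
    sos_low = butter(4, 5, btype="lowpass", fs=fs, output="sos")
    lfa = sosfiltfilt(sos_low, x, axis=0)   # low-frequency activity (< 5 Hz)
    mid = band(x, 8, 30) ** 2               # mid-frequency power (8-30 Hz)
    high = band(x, 55, 200) ** 2            # high-frequency power (55-200 Hz)
    return lfa, mid, high

def window(x, fs=FS, win_s=0.3, step_s=0.05):
    """Average features in 300 ms windows shifted by 50 ms."""
    win, step = int(win_s * fs), int(step_s * fs)
    return np.stack([x[s:s + win].mean(axis=0)
                     for s in range(0, len(x) - win + 1, step)])
```

The windowed features would then be paired with the kinematic variables and passed to a PSID model.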
mottenhoff.bsky.social
To investigate, we recorded 3D hand movement trajectories while our participants played a custom game, from which we retrieved position, velocity, speed, and acceleration. At the same time, we recorded neural activity from cortical and subcortical brain areas.
3D figure showing the movement trajectory as blue line and captured targets as gold circles
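Deriving the kinematic variables from the tracked positions can be sketched as below. The 90 Hz tracker rate and the function name are assumptions for illustration; the paper's actual pipeline may differ.

```python
import numpy as np

FS = 90.0  # Hz; hypothetical motion-tracker rate, not stated in the thread

def kinematics_from_position(pos):
    """Derive velocity, speed, and acceleration from 3D positions.

    pos: array of shape (n_samples, 3) with x, y, z coordinates.
    """
    vel = np.gradient(pos, 1.0 / FS, axis=0)    # (n, 3) velocity
    speed = np.linalg.norm(vel, axis=1)         # (n,) non-directional speed
    acc = np.gradient(vel, 1.0 / FS, axis=0)    # (n, 3) acceleration
    return vel, speed, acc
```

Speed is the norm of the velocity vector, which is why it carries no directional information — matching the distinction drawn between speed and directional kinematics in the results.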
mottenhoff.bsky.social
In previous work, we described how movement can be decoded from brain-wide neural activity, across tasks and across participants. However, that was only decoding movement versus rest, so naturally we wanted to know more about the neural content of these brain-wide dynamics.