Lucy Dowdall
@lucydowdall.bsky.social
200 followers 130 following 18 posts
Cognitive Neuroscience PhD Student 🧠 | Plasticity Lab, University of Cambridge 🦾 | sensory feedback, sensorimotor learning, and neurotech | she/her
Reposted by Lucy Dowdall
actlab.bsky.social
Happy to announce that my lab @ Yale Psychology (actcompthink.org) will be accepting PhD applications this year (for start in Fall '26)!

Come for the fun experiments on human learning, memory, & skilled behavior, stay for the best 🍕 in the US.

Please reach out if you have any questions!
Homepage of the Action, Computation, & Thinking (ACT) Lab, Yale department of psychology
actcompthink.org
lucydowdall.bsky.social
Excited to see this collaborative work out! A representation of a robotic limb that’s both effector- and controller-independent? We take a deep dive into motor learning generalisation with the Third Thumb. Congrats @mariamolinasan.bsky.social!
🔗 www.biorxiv.org/content/10.1...
Reposted by Lucy Dowdall
plasticity-lab.bsky.social
Our PhD students @maggieszymanska.bsky.social and Julien Russ ready to present their posters today at #BRNet2025! Find them in the poster room at 2pm, talking about phantom limb pain and using EMG to control the Third Thumb 🧠
lucydowdall.bsky.social
Catching up with the community at UK Sensorimotor is always a highlight of the year! Thanks for having us @uksensorimotor25.bsky.social @ox.ac.uk !
lucydowdall.bsky.social
Check out the full pre-print now: doi.org/10.1101/2025.06.16.658246. This work was truly a collaborative and interdisciplinary effort, and we’d like to thank all of our collaborators, and the over 100 participants that made this work possible! @plasticity-lab.bsky.social
lucydowdall.bsky.social
Together, these results demonstrate that our somatosensory system supports an immediate and accessible sensory representation of the Third Thumb, which is then refined through experience, allowing the Third Thumb to be integrated into the hand representation across the sensorimotor hierarchy
lucydowdall.bsky.social
Finally, we found not only increased subjective somatosensory embodiment following Third Thumb training, but also that this increase correlated with increased similarity between the Third Thumb and the biological fingers in S1
lucydowdall.bsky.social
Within the S1 hand representation, we saw experience-dependent refinement of the Third Thumb’s sensory representation following training, as it became more similar to the biological fingers in a way not seen for our control group
lucydowdall.bsky.social
Following Third Thumb training, markerless hand tracking revealed reduced co-usage amongst the biological fingers as the Third Thumb became integrated into the hand’s coordination patterns
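A minimal sketch of how a finger "co-usage" measure could be computed from markerless hand-tracking data, assuming per-finger movement time series (e.g. fingertip speeds or joint angles); the function name and the choice of averaging pairwise correlations are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def co_usage(finger_traces):
    """Mean pairwise Pearson correlation across finger movement time series.

    finger_traces: array of shape (n_fingers, n_timepoints).
    Higher values indicate fingers that tend to move together (more co-usage).
    """
    corr = np.corrcoef(finger_traces)           # n_fingers x n_fingers correlation matrix
    iu = np.triu_indices_from(corr, k=1)        # off-diagonal (unique pairs) only
    return corr[iu].mean()

# Placeholder data: 5 biological fingers, 1000 tracked frames, pre- vs post-training
pre = np.random.rand(5, 1000)
post = np.random.rand(5, 1000)
print(co_usage(pre), co_usage(post))            # reduced co-usage would show post < pre
```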
lucydowdall.bsky.social
To explore how sensorimotor experience refines this representation, participants then underwent 7 days of motor training involving collaboration between the Third Thumb and the biological hand, while our control group instead trained to play a piano keyboard
lucydowdall.bsky.social
Using RSA, we found the immediate emergence of a topographically organised Third Thumb sensory representation within S1. This representation is also distinct from the biological palm representation
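A minimal sketch of representational similarity analysis (RSA) of the kind referenced above, assuming a (conditions x voxels) array of S1 activity patterns per participant; the correlation-distance metric and variable names are illustrative assumptions rather than the study's exact analysis.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson r between condition patterns."""
    return squareform(pdist(patterns, metric="correlation"))

# Placeholder data: e.g. rows = D1-D5 stimulation sites plus the Third Thumb, columns = S1 voxels
patterns = np.random.rand(6, 500)
dissim = rdm(patterns)

def compare_rdms(rdm_a, rdm_b):
    """Spearman correlation between the off-diagonal entries of two RDMs
    (e.g. pre- vs post-training, or Third Thumb vs biological fingers)."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return spearmanr(rdm_a[iu], rdm_b[iu]).correlation
```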
lucydowdall.bsky.social
But how does the brain then represent this natural sensory information? We next stimulated the Third Thumb and the biological hand, and used fMRI to map their sensory representations in primary somatosensory cortex
lucydowdall.bsky.social
Despite the tasks being designed for the artificial systems, participants performed equally well with the natural feedback, and in some cases even better, demonstrating that meaningful, interpretable information can be extracted from the natural feedback across versatile task demands
lucydowdall.bsky.social
To put this natural feedback to the test, we created 2 new Third Thumb systems: one integrating artificial skin-stretch feedback, the other artificial vibrotactile feedback. We then compared the artificial to the 'natural' feedback across perceptual discrimination tasks
lucydowdall.bsky.social
This is possible through the previously neglected role of the ‘natural’ feedback received as a by-product of how artificial limbs interface with the biological body. For example, the Third Thumb is worn on and moves against the hand, which naturally generates sensory information felt across the palm
lucydowdall.bsky.social
However, despite having no artificial feedback, we have previously demonstrated that people can perform a coordination task with the Third Thumb when blindfolded, relying on a sense of proprioceptive awareness of the robotic digit
lucydowdall.bsky.social
Somatosensory feedback is crucial for motor learning, yet artificial limbs are thought to lack such feedback. Research has therefore focused on creating artificial sensory feedback, but these signals do not replicate the rich, multimodal information available from natural touch (see shorturl.at/2PMyU)
lucydowdall.bsky.social
Excited to share our new interdisciplinary work exploring the sensory representation of an artificial body part, combining datasets and methodologies to explore sensorimotor integration of the Third Thumb, a 2 DoF hand augmentation device (@daniclode.bsky.social) doi.org/10.1101/2025.06.16.658246
Reposted by Lucy Dowdall
plasticity-lab.bsky.social
Neurotech is rapidly evolving, but user-centred technologies can only succeed if their development actively involves the end user. We’ve published a basic roadmap for supporting inclusive design testing - highlighting key strategies, stakeholders, representation, and measurable outcomes rb.gy/ng1jus
lucydowdall.bsky.social
New essential reading for anybody working in sensory feedback!
plasticity-lab.bsky.social
Very proud to share a new review (more like a long perspective), just out in Science Advances: rebrand.ly/mdbi8q8. Ilana Nisky and Tamar @cambridgeuni.bsky.social joined heads (brains) to reimagine how to make the most out of artificial haptic interfaces. A thread (1/8)
A neurocognitive pathway for engineering artificial touch
Neurocognitive congruence is key to designing artificial touch that best integrates with human perception and action.
rebrand.ly
Reposted by Lucy Dowdall
plasticity-lab.bsky.social
Now you’ve met the lab, come work with us! We’re hiring a Research Assistant to join the Plasticity lab 🦾 in early 2025 at the MRC Cognition and Brain Sciences Unit, University of Cambridge. Find out more information here: www.jobs.cam.ac.uk/job/49393/. Closing date 8th of January! 🧠
Reposted by Lucy Dowdall
plasticity-lab.bsky.social
We’re thrilled to share our new research article, "Shaping the developing homunculus: the roles of deprivation and compensatory behaviour in sensory remapping", on bioRxiv: rebrand.ly/qbz6nal with @dorothycowie.bsky.social
lucydowdall.bsky.social
Come meet our lab and find out about our work!
plasticity-lab.bsky.social
I’m Tamar Makin, leader of the Plasticity Lab @CBU. As an intro to Bluesky, I asked my team to choose their favourite lab paper. My pick sums up a decade of the lab’s work: John Krakauer @JHU & I argue that one brain area can’t simply take over another, challenging remapping claims: rb.gy/n4dr03