Will Turner
@renrutmailliw.bsky.social
110 followers 160 following 5 posts
cognitive neuroscience postdoc at stanford https://bootstrapbill.github.io/ he/him
Pinned
renrutmailliw.bsky.social
New paper out in @plosbiology.org w/ Charlie, @phil-johnson.bsky.social, Ella, and Hinze 🎉

We track moving stimuli via EEG, find evidence that motion is extrapolated across distinct stages of processing + show how this effect may emerge from a simple synaptic learning rule!

tinyurl.com/2szh6w5c
Reposted by Will Turner
lauragwilliams.bsky.social
really fun getting to think about the "time to consciousness" with this dream team! we discuss interesting parallels between vision and language processing on phenomena like postdictive perceptual effects, among other things! check it out 😄
renrutmailliw.bsky.social
New BBS article w/ @lauragwilliams.bsky.social and Hinze Hogendoorn, just accepted! We respond to a thought-provoking article by @smfleming.bsky.social & @matthiasmichel.bsky.social, and argue that it's premature to conclude that conscious perception is delayed by 350-450ms: bit.ly/4nYNTlb
Reposted by Will Turner
quining.bsky.social
🚨 Out now in @commspsychol.nature.com 🚨
doi.org/10.1038/s442...

Our #RegisteredReport tested whether the order of task decisions and confidence ratings biases #metacognition.

Some have argued that the decisions → confidence order enhances metacognition. If true, decades of findings would be affected.
A picture of our paper's abstract and title: The order of task decisions and confidence ratings has little effect on metacognition.

Task decisions and confidence ratings are fundamental measures in metacognition research, but using these reports requires collecting them in some order. Only three orders exist and are used in an ad hoc manner across studies. Evidence suggests that when task decisions precede confidence, this report order can enhance metacognition. If verified, this effect pervades studies of metacognition and will lead the synthesis of this literature to invalid conclusions. In this Registered Report, we tested the effect of report order across popular domains of metacognition and probed two factors that may underlie why order effects have been observed in past studies: report time and motor preparation. We examined these effects in a perception experiment (n = 75) and memory experiment (n = 50), controlling task accuracy and learning. Our registered analyses found little effect of report order on metacognitive efficiency, even when timing and motor preparation were experimentally controlled. Our findings suggest the order of task decisions and confidence ratings has little effect on metacognition, and need not constrain secondary analysis or experimental design.
renrutmailliw.bsky.social
Thanks to Steve and Matthias for writing this interesting and ambitious theoretical perspective: bit.ly/4jF4kRp.

Although we don’t (yet) agree w/ one of their foundational claims, we think this perspective is valuable and should spawn lots of important discussions and follow-up work :)
Sensory Horizons and the Functions of Conscious Vision | Behavioral and Brain Sciences | Cambridge Core
renrutmailliw.bsky.social
New BBS article w/ @lauragwilliams.bsky.social and Hinze Hogendoorn, just accepted! We respond to a thought-provoking article by @smfleming.bsky.social & @matthiasmichel.bsky.social, and argue that it's premature to conclude that conscious perception is delayed by 350-450ms: bit.ly/4nYNTlb
Reposted by Will Turner
bryanlimy.bsky.social
We present our preprint on ViV1T, a transformer for dynamic mouse V1 response prediction. We reveal novel response properties and confirm them in vivo.

With @wulfdewolf.bsky.social, Danai Katsanevaki, @arnoonken.bsky.social, @rochefortlab.bsky.social.

Paper and code at the end of the thread!

🧵1/7
Reposted by Will Turner
malcolmgcampbell.bsky.social
🚨Our preprint is online!🚨

www.biorxiv.org/content/10.1...

How do #dopamine neurons perform the key calculations in reinforcement #learning?

Read on to find out more! 🧵
Reposted by Will Turner
mellwoodlowe.bsky.social
I’m hiring!! 🎉 Looking for a full-time Lab Manager to help launch the Minds, Experiences, and Language Lab at Stanford. We’ll use all-day language recording, eye tracking, & neuroimaging to study how kids & families navigate unequal structural constraints. Please share:
phxc1b.rfer.us/STANFORDWcqUYo
Research Coordinator, Minds, Experiences, and Language Lab in Graduate School of Education, Stanford, California, United States
The Stanford Graduate School of Education (GSE) seeks a full-time Research Coordinator (acting lab manager) to help launch and coordinate the Minds,.....
Reposted by Will Turner
nadinedijkstra.bsky.social
Looking forward to #ICON2025 next week! We will have several presentations on mental imagery, reality monitoring and expectations:

To kick us off, on Tuesday at 15:30, Martha Cottam will present:

P2.12 | Presence Expectations Modulate the Neural Signatures of Content Prediction Errors
Reposted by Will Turner
kriesjill.bsky.social
In August I had the pleasure of presenting a poster at the Cognitive Computational Neuroscience (CCN) conference in Amsterdam. My poster was about the developmental trajectory and neuroanatomical correlates of speech comprehension 🧒➡️🧑 🧠
Reposted by Will Turner
brainboyben.bsky.social
🚨Pre-print of some cool data from my PhD days!
doi.org/10.1101/2025...

☝️Did you know that visual surprise is (probably) a domain-general signal and/or operates at the object-level?
✌️Did you also know that the timing of this response depends on the specific attribute that violates an expectation?
The Latency of a Domain-General Visual Surprise Signal is Attribute Dependent
Predictions concerning upcoming visual input play a key role in resolving percepts. Sometimes input is surprising, under which circumstances the brain must calibrate erroneous predictions so that perc...
Reposted by Will Turner
gretatuckute.bsky.social
Humans largely learn language through speech. In contrast, most LLMs learn from pre-tokenized text.

In our #Interspeech2025 paper, we introduce AuriStream: a simple, causal model that learns phoneme, word & semantic information from speech.

Poster P6, tomorrow (Aug 19) at 1:30 pm, Foyer 2.2!
Reposted by Will Turner
lauragwilliams.bsky.social
looking forward to seeing everyone at #CCN2025! here's a snapshot of the work from my lab that we'll be presenting on speech neuroscience 🧠 ✨
Reposted by Will Turner
manikyaalister.bsky.social
We know that a consensus of opinions is persuasive, but how reliable is this effect across people and types of consensus, and are there any kinds of claims where people care less about what other people think? This is what we tested in our new(ish) paper in @psychscience.bsky.social
Screenshot of the article "How Convincing Is a Crowd? Quantifying the Persuasiveness of a Consensus for Different Individuals and Types of Claims"
Reposted by Will Turner
brainboyben.bsky.social
I really like this paper. I fear that people think the authors are claiming that the brain isn’t predictive though, which this study cannot (and does not) address. As the title says, the data purely show that evoked responses are not necessarily prediction errors, which makes sense!
tyrellturing.bsky.social
1/3) This may be a very important paper. It suggests that there are no prediction-error-encoding neurons in sensory areas of cortex:

www.biorxiv.org/content/10.1...

I personally am a big fan of the idea that cortical regions (allo and neo) are doing sequence prediction.

But...

🧠📈 🧪
Sensory responses of visual cortical neurons are not prediction errors
Predictive coding is theorized to be a ubiquitous cortical process to explain sensory responses. It asserts that the brain continuously predicts sensory information and imposes those predictions on lo...
Reposted by Will Turner
plosbiology.org
It takes time for the #brain to process information, so how can we catch a flying ball? @renrutmailliw.bsky.social &co reveal a multi-stage #motion #extrapolation occurring in the #HumanBrain, shifting the represented position of moving objects closer to real time @plosbiology.org 🧪 plos.io/3Fm83Fc
Mapping the position of moving stimuli. The top three panels show the three events of interest: stimulus onset, stimulus offset, and stimulus reversal (left to right). The bottom three panels show group-level probabilistic spatio-temporal maps centered around these three events. Diagonal black lines mark the true position of the stimulus. Horizontal dashed lines mark the time of the event of interest (stimulus onset, offset, or reversal). Red indicates high probability regions and blue indicates low probability regions (‘position evidence’ gives the difference between the posterior probability and chance). Note: these maps were generated from recordings at posterior/occipital sites.
renrutmailliw.bsky.social
Thanks Henry! All kudos really go to Charlie for the modelling! Hope all is well in Brissy :)
Reposted by Will Turner
gretatuckute.bsky.social
What are the organizing dimensions of language processing?

We show that voxel responses during comprehension are organized along 2 main axes: processing difficulty & meaning abstractness—revealing an interpretable, topographic representational basis for language processing shared across individuals
Reposted by Will Turner
ayeletlandau.bsky.social
📣cog neuro postdoc opportunity! Interested in studying attention & exploration w/ cutting edge M/EEG? 🧠care about making vision science a bit more naturalistic? 🌱LandauLab is hiring! We seek resourceful, curious and creative researchers who can join the newly forming London-based lab at UCL! ...
UCL – University College London
UCL is consistently ranked as one of the top ten universities in the world (QS World University Rankings 2010-2022) and is No.2 in the UK for research power (Research Excellence Framework 2021).
Reposted by Will Turner
jeanremiking.bsky.social
I'm very pleased to share our latest study:
‘Emergence of Language in the Developing Brain’,
by L Evanson, P Bourdillon et al:
- Paper: ai.meta.com/research/pub...
- Blog: ai.meta.com/blog/meta-fa...
- Thread below 👇
Reposted by Will Turner
matthiasmichel.bsky.social
Very happy to announce that our paper “Sensory Horizons and the Functions of Conscious Vision” is now out as a target article in BBS!! @smfleming.bsky.social and I present a new theory of the evolution and functions of visual consciousness. Article here: doi.org/10.1017/S014.... A (long) thread 🧵
Sensory Horizons and the Functions of Conscious Vision | Behavioral and Brain Sciences | Cambridge Core