Christoph Strauch
@cstrauch.bsky.social
Assistant professor @utrechtuniversity.bsky.social studying spatial attention, eye-movements, pupillometry, and more. Co-PI @attentionlab.bsky.social
cstrauch.bsky.social
I'm still waiting for you to write that package that recovers pupil size from MRI recordings - gotta find a way to make all that fMRI data useful ;-)
Reposted by Christoph Strauch
matthiasnau.bsky.social
Incredible study by Raut et al.: by tracking a single measure (pupil size), you can model slow, large-scale dynamics in neuronal calcium, metabolism, and brain blood oxygen through a shared latent space! www.nature.com/articles/s41...
cstrauch.bsky.social
I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP - Color II, Atrium Maximum, 9:15, Thursday.

Say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!
cstrauch.bsky.social
#ECVP2025 starts with a fully packed room!

I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning, Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!
Reposted by Christoph Strauch
sebastiaanmathot.bsky.social
A good reason to make it to the early #visualcognition session of today's #ecvp2025 👉 @anavili.bsky.social will talk about how attending to fuzzy bright/dark patches that have faded from awareness (through adaptation-like processes) still affects pupil size! ⚫👀⚪ Paper: dx.doi.org/10.1016/j.co...
Reposted by Christoph Strauch
eemstewart.bsky.social
Eye movements are cheap, right? Not necessarily! 💰 In our review just out in @natrevpsychol.nature.com, Alex Schütz and I discuss the different costs associated with making an eye movement, how these costs affect behaviour, and the challenges of measuring this… rdcu.be/eAm69 #visionscience #vision
A review of the costs of eye movements
Nature Reviews Psychology - Eye movements are the most frequent movements that humans make. In this Review, Schütz and Stewart integrate evidence regarding the costs of eye movements and...
cstrauch.bsky.social
I think there is a lot one doesn't think of intuitively. Lossy compression of audio files is directly built on psychophysics, for instance (no (hardcore experimental) psychology, no Spotify!). Or take all the work foundational to artificial neural networks that comes from cognitive psychology & modeling.
cstrauch.bsky.social
Together with @ajhoogerbrugge.bsky.social, Roy Hessels and Ignace Hooge - thanks all!
cstrauch.bsky.social
Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help navigate this in an informed way.

Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
Data saturation for gaze heatmaps: initially, each additional participant brings NSS or AUC, as measures of heatmap similarity, much closer to the full sample. However, the returns diminish increasingly at higher n.
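For intuition, here is a minimal sketch of that saturation logic in Python (not the paper's code; the fixation format, screen resolution, smoothing width, and function names are my assumptions): heatmaps built from growing random subsamples of participants are scored against the pooled fixations of the full sample with NSS, and the curve climbs steeply at small n before flattening.

```python
# Sketch only: data saturation of gaze heatmaps via NSS.
# Assumes `fixations` is a list with one (n_fix, 2) array of pixel
# coordinates (x, y) per participant on a 1920 x 1080 display.
import numpy as np
from scipy.ndimage import gaussian_filter

def heatmap(fix_arrays, shape=(1080, 1920), sigma=30):
    """Accumulate fixation counts into an image and smooth with a Gaussian."""
    h = np.zeros(shape)
    for fix in fix_arrays:
        xs = np.clip(fix[:, 0].astype(int), 0, shape[1] - 1)
        ys = np.clip(fix[:, 1].astype(int), 0, shape[0] - 1)
        np.add.at(h, (ys, xs), 1)
    return gaussian_filter(h, sigma)

def nss(candidate_map, reference_fix):
    """Mean z-scored value of the candidate map at the reference fixations."""
    z = (candidate_map - candidate_map.mean()) / candidate_map.std()
    xs = np.clip(reference_fix[:, 0].astype(int), 0, z.shape[1] - 1)
    ys = np.clip(reference_fix[:, 1].astype(int), 0, z.shape[0] - 1)
    return z[ys, xs].mean()

def saturation_curve(fixations, seed=0):
    """NSS of heatmaps from n = 1..N randomly ordered participants vs. all fixations pooled."""
    rng = np.random.default_rng(seed)
    all_fix = np.vstack(fixations)
    order = rng.permutation(len(fixations))
    return [nss(heatmap([fixations[i] for i in order[:n]]), all_fix)
            for n in range(1, len(fixations) + 1)]
```

In practice one would presumably average such curves over many random orderings and read off the n where similarity to the full sample exceeds whatever criterion the application demands.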
Reposted by Christoph Strauch
elkanakyurek.bsky.social
Jelmer Borst and I are looking for a PhD candidate to build an EEG-based model of human working memory! This is a really cool project that I've wanted to kick off for a while, and I can't wait to see it happen. Please share and I'm happy to answer any Qs about the project!
www.rug.nl/about-ug/wor...
Reposted by Christoph Strauch
rademaker.bsky.social
Curious about the human visual brain, a vibrant and collaborative lab, and pursuing a PhD in the heart of Europe? My lab is recruiting for a 3-year PhD position. More details: www.rademakerlab.com/job-add
PhD position — Rademaker lab
cstrauch.bsky.social
So nice, they are lucky to have you over there!
cstrauch.bsky.social
We had a splendid day: great weather, got to wear peculiar/special clothes, and then Alex even defended his PhD (and nailed it!).

Congratulations Dr. Alex, super proud of your achievements!!!
ajhoogerbrugge.bsky.social
Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!

The dissertation is available here: doi.org/10.33540/2960
Reposted by Christoph Strauch
dkoevoet.bsky.social
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...
Reposted by Christoph Strauch
attentionlab.bsky.social
@vssmtg.bsky.social
presentations today!

R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict

R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention
cstrauch.bsky.social
Cool new preprint by Damian. Among other findings, pupillometry, EEG & IEMs show that the premotor theory of attention can't be the full story: eye movements are associated with an additional, separable, spatially tuned process compared with covert attention, hundreds of ms before shifts happen.
dkoevoet.bsky.social
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
Reposted by Christoph Strauch
elife.bsky.social
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
A move you can afford
cstrauch.bsky.social
Let me know if you're still unconvinced and, if so, why. I'm also happy to present it in more detail at a lab meeting or online.
Cheers!
cstrauch.bsky.social
Altogether, it's certainly correct that pupillometry requires care, as it's just two output systems, and in many (but not all) ways just one with multiple inputs. But they are well understood (shameless plug for my TINS papers here):
doi.org/10.1016/j.ti...
doi.org/10.1016/j.ti...
cstrauch.bsky.social
Lastly, are there other physiological measures that point to similar effects? Yes. Saccade latencies show similar effects, providing convergent evidence for our bottom line that effort drives saccade selection. Latencies are just not as nice, as they are not separate from the movement itself.