AttentionLab Utrecht University
@attentionlab.bsky.social
160 followers 91 following 7 posts
AttentionLab is the research group headed by Prof. Stefan Van der Stigchel at Experimental Psychology | Helmholtz Institute | Utrecht University
Pinned
attentionlab.bsky.social
Want to keep track of AttentionLab's members? We have our own starter pack!

Follow all or make a selection, up to you! Will update this list whenever more lab members join Bluesky 🙂
go.bsky.app/4yHiToK
Reposted by AttentionLab Utrecht University
suryagayet.bsky.social
Very happy to see this preprint out! The amazing @danwang7.bsky.social was on fire sharing this work at #ECVP2025, gathering loads of attention, and here you can find the whole thing!
Using RIFT, we reveal how the competition between top-down goals and bottom-up saliency unfolds within visual cortex.
Reposted by AttentionLab Utrecht University
cstrauch.bsky.social
I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP - Color II, atrium maximum, 9:15, Thursday.

say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!
cstrauch.bsky.social
#ECVP2025 starts with a fully packed room!

I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning…, Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!
Reposted by AttentionLab Utrecht University
danwang7.bsky.social
🧠 Excited to share that our new preprint is out!🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).

📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...
Dynamic competition between bottom-up saliency and top-down goals in early visual cortex
Task-irrelevant yet salient stimuli can elicit automatic, bottom-up attentional capture and compete with top-down, goal-directed processes for neural representation. However, the temporal dynamics und...
www.biorxiv.org
Reposted by AttentionLab Utrecht University
suryagayet.bsky.social
And now without Bluesky making the background black...
Reposted by AttentionLab Utrecht University
arora-borealis.bsky.social
Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social
Reposted by AttentionLab Utrecht University
danwang7.bsky.social
Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!

🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging (RIFT)

@attentionlab.bsky.social @ecvp.bsky.social
Reposted by AttentionLab Utrecht University
cstrauch.bsky.social
Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help you navigate this in an informed way.

Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
Data saturation for gaze heatmaps: initially, each additional participant brings the total NSS or AUC (measures of heatmap similarity) much closer to the full sample, but the returns diminish increasingly at higher n.
Reposted by AttentionLab Utrecht University
ajhoogerbrugge.bsky.social
Thrilled to share that I successfully defended my PhD dissertation on Monday, June 16th!

The dissertation is available here: doi.org/10.33540/2960
Reposted by AttentionLab Utrecht University
dkoevoet.bsky.social
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...
attentionlab.bsky.social
And Monday morning:

@suryagayet.bsky.social
has a poster (pavilion) on:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.

Happy @vssmtg.bsky.social #VSS2025 everyone, enjoy the meeting and the very nice coffee mugs!
attentionlab.bsky.social
@vssmtg.bsky.social
presentations today!

R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict

R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention
attentionlab.bsky.social
and tomorrow, Monday:

Surya Gayet in the Pavilion in the morning session:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.

Enjoy VSS everyone!
Reposted by AttentionLab Utrecht University
dkoevoet.bsky.social
We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?

In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.

OA paper here:
doi.org/10.3758/s134...
Reposted by AttentionLab Utrecht University
dkoevoet.bsky.social
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
Reposted by AttentionLab Utrecht University
dkoevoet.bsky.social
In our latest paper @elife.bsky.social we show that we choose to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.

eLife's digest:
elifesciences.org/digests/9776...

The paper:
elifesciences.org/articles/97760

#VisionScience
Reposted by AttentionLab Utrecht University
ajhoogerbrugge.bsky.social
New preprint!

We present two very large eye-tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.

We invite you to (re)use the dataset and provide suggestions for future versions 📋

osf.io/preprints/os...
Heat map of gaze locations overlaid on top of a feature-rich collage image. There is a seascape with a kitesurfer, mermaid, turtle, and more.
Reposted by AttentionLab Utrecht University
cstrauch.bsky.social
Out in Psychophysiology (OA):

Typically, pupillometry struggles with complex stimuli. We introduced a method to study covert attention allocation in complex video stimuli:
effects of top-down attention, bottom-up attention, and pseudoneglect could all be recovered.

doi.org/10.1111/psyp.70036
Psychophysiology | SPR Journal | Wiley Online Library
Previous studies have shown that the pupillary light response (PLR) can physiologically index covert attention, but only with highly simplistic stimuli. With a newly introduced technique that models ....
doi.org
attentionlab.bsky.social
Congrats to Luzi @luzixu.bsky.social! We're very proud of you! 🎊
suryagayet.bsky.social
Last Friday the irreplaceable @luzixu.bsky.social successfully defended her PhD (at @utrechtuniversity.bsky.social). This has been an incredibly productive 3+ years, and we are sad to see her leave, but are very proud of her accomplishments (with @attentionlab.bsky.social, @chrispaffen.bsky.social)!
Reposted by AttentionLab Utrecht University
dkoevoet.bsky.social
Presaccadic attention facilitates visual continuity across eye movements. However, recent work may suggest that presaccadic attention doesn't shift upward. What's going on?

Using the pupil light response, our paper shows that presaccadic attention shifts both upward and downward.

doi.org/10.1111/psyp.70047
Psychophysiology | SPR Journal | Wiley Online Library
Dominant theories posit that attentional shifts prior to saccades enable a stable visual experience despite abrupt changes in visual input caused by saccades. However, recent work may challenge this ...
onlinelibrary.wiley.com
Reposted by AttentionLab Utrecht University
cstrauch.bsky.social
New pop-science piece on why pupil size changes are so cool. psyche.co/ideas/the-pu...
Included: an assignment that lets you measure pupil size. In my classes, this replicates Hess & Polt's 1964 effort finding without an eye tracker. Feel free to use it!

#VisionScience #neuroscience #psychology 🧪
Psyche | on the human condition
Psyche is a digital magazine from Aeon Media that illuminates the human condition through psychology, philosophy and the arts.
psyche.co
Reposted by AttentionLab Utrecht University
danwang7.bsky.social
In conclusion, observers can flexibly de-prioritize and re-prioritize VWM contents based on current task demands, allowing them to exert control over the extent to which VWM contents influence concurrent visual processing.
Reposted by AttentionLab Utrecht University
attentionlab.bsky.social
As always, excellent work by Damian et al.!