Visual Attention Lab (Wolfe Lab)
@visualattentionlab.bsky.social
76 followers 92 following 10 posts
Welcome to the VAL (also known as the Wolfe Lab)! Our lab specializes in Visual Search and is run by Dr. Jeremy Wolfe. We are affiliated with Brigham and Women's Hospital and Harvard Medical School.
visualattentionlab.bsky.social
Our RA Cailey just got back from Sensory Neuroscience Summer School at The University of Pisa!

She got to hear from speakers like Dr. Martin Rolfs and Dr. Paola Binda, as well as attend a bootcamp on 7-Tesla fMRI.
visualattentionlab.bsky.social
Meet our visiting Grad Student Ecem!

She is a Ph.D. candidate in Neuroscience at the University of Würzburg, studying attentional and learning processes in fear conditioning. At VAL, she focuses on visual foraging under threat, manipulating the spatial distribution of threat and the value of targets.
visualattentionlab.bsky.social
Suganuma, M., & Yokosawa, K. (2006). Grouping and trajectory storage in multiple object tracking: Impairments due to common item motions. Perception, 35, 483–495. doi:10.1068/p5487

- Courtesy of Dr. Todd Horowitz and Dr. Jeremy Wolfe
visualattentionlab.bsky.social
Meet our visiting med student Rana!

Visiting from İstinye University in Istanbul, her interests are in understanding the connection between the eye and brain through a neuroscience-driven approach. She is focused on deepening her understanding of visual sciences and developing research skills.
visualattentionlab.bsky.social
Meet our RA since 2024 Cailey Tennyson!

She got her Bachelor's degree in Computational Cognitive Science from UC Davis, where she was an RA with Dr. Joy Geng. She is interested in studying memory and perception and how/where these happen in the brain. Outside of research she loves to bowl and read.
visualattentionlab.bsky.social
Congratulations to Injae Hong on her new paper about foraging!

Hong, I., & Wolfe, J. M. (2025). Mixed hybrid visual foraging is near optimal. Attention, Perception, & Psychophysics. doi.org/10.3758/s134...

In the current moment, it is nice to know that something is “near optimal”
visualattentionlab.bsky.social
Meet our postdoc since 2024 Victoria!

She got her Ph.D. from UCLA where she worked on optimizing learning in real-world categorization, like skin cancer classification. She is interested in visual search within medical image perception and exploring the role of categories in search tasks.
visualattentionlab.bsky.social
Meet Dr. Jeremy Wolfe!

He is a Professor of Ophthalmology and Radiology at Harvard Med. He received an AB in Psychology from Princeton and his PhD in Psychology from MIT. His research focuses on visual search and visual attention in areas such as medical image perception, security, and intelligence.
visualattentionlab.bsky.social
Meet the Visual Attention Lab!

We are a lab researching Visual Search directed by PI Dr. Jeremy Wolfe, affiliated with Brigham and Women's Hospital and Harvard Medical School.