Chong Zhao
@zhaochong.bsky.social
PhD student at UChicago Psych. Interested in visual memory and its individual differences.
Reposted by Chong Zhao
The ADAM lab is hiring a Research Specialist to join us! This role involves conducting human subjects research (EEG experiments on attention + working memory) and assisting with the execution and administration of ongoing projects.

Job posting: emdz.fa.us2.oraclecloud.com/hcmUI/Candid...
Research Specialist
The Attention, Distractions, and Memory (ADAM) Lab at Rice University is recruiting a full-time Research Specialist (Research Specialist I). The ADAM Lab (PI: Kirsten Adam) conducts cognitive neurosci...
emdz.fa.us2.oraclecloud.com
January 2, 2026 at 3:21 PM
Isn’t that just 24 more hours, Prof? Happy New Year!!!
December 31, 2025 at 7:17 AM
Reposted by Chong Zhao
New pre-print day! Distributed and drifting signals for working memory load in human cortex 🧠 (with Ed Awh & @serences.bsky.social)

www.biorxiv.org/content/10.1...
Distributed and drifting signals for working memory load in human cortex
Increasing working memory (WM) load incurs behavioral costs, and whether the neural constraints on behavioral costs are localized (i.e., emanating from the intraparietal sulcus) or distributed across ...
www.biorxiv.org
September 16, 2025 at 1:18 PM
Reposted by Chong Zhao
🌟 We’re hiring! 🌟 Are you interested in memory, cognitive training, & healthy ageing? We’re looking for a Research Assistant to join our lab! www.jobs.ac.uk/job/DOZ814/r...
👉 0.5 FTE (2.5 days/week), 4 months (likely from 01/26)
👉 Annual salary £32,080 to £33,002 (pro-rata)
👉 Based in Sheffield, UK
Research Assistant - Cognitive Ability & Plasticity Lab at University of Sheffield
Searching for an academic job? Explore this Research Assistant - Cognitive Ability & Plasticity Lab opening on jobs.ac.uk! Click to view more details and browse other academic jobs.
www.jobs.ac.uk
October 17, 2025 at 8:08 AM
Reposted by Chong Zhao
How does the visual system track moving objects while remembering the color of those objects? My latest research article (co-first with Piotr @styrkowiec.bsky.social) exploring this question using EEG is out in JoCN! @jocn.bsky.social #workingmemory #cognition #cogneuro #cogsci #neuro
Item-based Parsing of Dynamic Scenes in a Combined Attentional Tracking and Working Memory Task
Abstract. Human visual processing is limited—we can only track a few moving objects at a time and store a few items in visual working memory (WM). A shared mechanism that may underlie these performanc...
doi.org
September 18, 2025 at 2:56 PM
Interestingly, midfrontal theta power could tell errors apart (Hit < Miss, FA < CR) regardless of whether the images had been learned before. We therefore proposed that the brain detects memory-based errors with the ERN and general recognition errors with midfrontal theta power. 3/3
July 31, 2025 at 2:48 PM
First, the error-related negativity (ERN) could tell whether the brain had made a recognition error, but only for images that had been learned (i.e., hits versus misses). It couldn't tell false alarms and correct rejections apart, even though FAs were also errors. 2/3
July 31, 2025 at 2:48 PM
🚨NEW paper alert at JEP:G: when we make recognition memory decisions, are we aware of our recognition errors? The answer is YES, with two error-detection systems serving different purposes (w/ Geoff Woodman & @keisukefukuda.bsky.social): psycnet.apa.org/record/2026-... 1/3
July 31, 2025 at 2:48 PM
In Study 4, we generalized our findings from WM to attentional control (AC) by including the Square tasks (from the Engle lab) as well as the change localization and filtering change localization tasks used in our lab. See more here: sciencedirect.com/science/arti.... 5/5
Working memory and attentional control abilities predict individual differences in visual long-term memory tasks
Working memory predicts cognitive abilities like fluid intelligence (gF) and source memory. This suggests these abilities depend on working memory and…
sciencedirect.com
June 18, 2025 at 3:56 PM
In Study 3, we kept the presentation time at 250 ms but gave participants a longer ISI, and we observed the same WM-LTM relationship as in Study 1. This suggests that WM matters for LTM when people get more time (during the ISI) to form spatio-temporal bindings. 4/5
June 18, 2025 at 3:56 PM
However, in Study 2, when we sped up the presentation rate of the recognition memory task (250 ms/image), WM stopped predicting recognition memory differences. This suggests that WM affects the spatio-temporal binding of items, which is formed only at slower presentation rates. 3/5
June 18, 2025 at 3:56 PM
In Study 1, we found that working memory correlated with simple recognition memory when the images were presented slowly (3 seconds per image). The correlation held even when we regressed out source memory differences, suggesting that WM generally predicts LTM abilities. 2/5
June 18, 2025 at 3:56 PM
New paper out at the Journal of Memory and Language! We knew that individual differences in working memory predict source memory, but do they also predict simple item recognition memory (which relies on fewer attentional resources than source memory)? Our answer is YES! (with @edvogel.bsky.social) 1/5
June 18, 2025 at 3:56 PM
Tonight 5-7 PM I’ll be presenting my poster at CNS @cogneuronews.bsky.social C46: “Repetition learning produces stronger and faster recollection during recognition” Don’t miss it if you’re interested in EEG and visual long-term memory! #CNS2025
March 30, 2025 at 6:03 PM
Reposted by Chong Zhao
Remember what your partner said during a heated argument? Or the rush of getting your first job offer? Why do these emotionally arousing moments stick? Across 3 studies and 3 arousal measures, we found that emotional arousal enhances memory encoding by promoting functional integration in the 🧠 1/🧵
Emotional arousal enhances narrative memories through functional integration of large-scale brain networks https://www.biorxiv.org/content/10.1101/2025.03.13.643125v1
March 14, 2025 at 5:05 PM
Attention here! We found individual-differences & fMRI evidence that sustained attention is more closely related to long-term memory than to attentional control. With the best team @monicarosenb.bsky.social @edvogel.bsky.social @annacorriveau.bsky.social @jinke.bsky.social
Sustained attention is more closely related to long-term memory than to attentional control https://www.biorxiv.org/content/10.1101/2025.03.13.643171v1
March 14, 2025 at 4:12 PM
Thank uuuuu Kirsten!!!
January 29, 2025 at 10:47 PM
Across 5 experiments, we found that working memory and attentional control (WMAC) ability continued to predict long-term memory (LTM) performance even after participants showed significant learning. Huge thanks to my advisor Dr. Ed Vogel!
January 29, 2025 at 6:36 PM
New paper out now in JEP:G.

"Individual differences in working memory and attentional control continue to predict memory performance despite extensive learning."

psycnet.apa.org/doi/10.1037/xg…
January 29, 2025 at 6:36 PM
Reposted by Chong Zhao
The Awh Vogel lab is heading to Psychonomics!
We made a page on our lab website to cover everything, including Ed Awh's OPAM keynote address!

awhvogellab.com/conferences#...

Find the great work on display from Chong, Igor, Matthieu, me(?), and Park!
Conferences
The Awh Vogel Lab at the University of Chicago uses behavioral and neural methods to study attention and visual working memory
awhvogellab.com
November 19, 2024 at 3:30 PM
Reposted by Chong Zhao
I'm so excited to read this special issue on interactions of working & long-term memory!

My article with @zhaochong.bsky.social and Ed Vogel is linked here, and I'll also be giving a talk on this paper at Psychonomics! #psynom24

link.springer.com/article/10.3...
November 19, 2024 at 3:14 PM