Chong Zhao
@zhaochong.bsky.social
130 followers 200 following 13 posts
PhD student at UChicago Psych. Interested in visual memory and its individual differences.
zhaochong.bsky.social
Interestingly, midfrontal theta power could tell errors from correct responses (Hit < Miss, FA < CR) regardless of whether the images had been learned before. We therefore proposed that our brain detects memory-based errors with the ERN, and general recognition errors with midfrontal theta power. 3/3
zhaochong.bsky.social
First, error-related negativity (ERN) could tell whether our brain made a recognition error or not, but only for images that had been learned (i.e., hits versus misses). It couldn't tell false alarms and correct rejections apart, even though FAs were also errors. 2/3
zhaochong.bsky.social
🚨NEW paper alert at JEP:G: when we're making recognition memory decisions, are we aware of our recognition errors? The answer is YES, with two error detection systems for different purposes! (w/ Geoff Woodman & @keisukefukuda.bsky.social ) psycnet.apa.org/record/2026-... 1/3
zhaochong.bsky.social
In Study 3, we kept the presentation time at 250ms but gave participants a longer ISI, and we observed the same WM-LTM relationship as in Study 1. This suggests that WM matters for LTM when people get more time to form spatio-temporal bindings (during the ISI). 4/5
zhaochong.bsky.social
However, in Study 2, when we sped up the presentation rate of the recognition memory task (250ms/image), WM stopped predicting recognition memory differences. This suggests that WM supports the spatio-temporal binding of items, which forms only at slower presentation rates. 3/5
zhaochong.bsky.social
In Study 1, we found that working memory correlated with simple recognition memory when the images were presented slowly (3 seconds per image). The correlation held even when we regressed out source memory differences, suggesting that WM generally predicts LTM abilities. 2/5
zhaochong.bsky.social
New paper out at Journal of Memory and Language! We knew that individual differences in working memory predict source memory, but do they also predict simple item recognition memory (which relies on fewer attentional resources than source memory)? Our answer is: YES! (with @edvogel.bsky.social ) 1/5
zhaochong.bsky.social
Tonight 5-7 PM I’ll be presenting my poster at CNS @cogneuronews.bsky.social C46: “Repetition learning produces stronger and faster recollection during recognition” Don’t miss it if you’re interested in EEG and visual long-term memory! #CNS2025
Reposted by Chong Zhao
jadynpark.bsky.social
Remember what your partner said during a heated argument? Or the rush of getting your first job offer? Why do these emotionally arousing moments stick? Across 3 studies and 3 arousal measures, we found that emotional arousal enhances memory encoding by promoting functional integration in the 🧠 1/🧵
biorxiv-neursci.bsky.social
Emotional arousal enhances narrative memories through functional integration of large-scale brain networks https://www.biorxiv.org/content/10.1101/2025.03.13.643125v1
zhaochong.bsky.social
Attention here! We found individual-differences & fMRI evidence that sustained attention is more closely related to long-term memory than to attentional control. With the best team @monicarosenb.bsky.social @edvogel.bsky.social @annacorriveau.bsky.social @jinke.bsky.social
biorxiv-neursci.bsky.social
Sustained attention is more closely related to long-term memory than to attentional control https://www.biorxiv.org/content/10.1101/2025.03.13.643171v1
zhaochong.bsky.social
Thank uuuuu Kirsten!!!
zhaochong.bsky.social
Across 5 experiments, we found that working memory and attentional control (WMAC) ability continued to predict long-term memory (LTM) performance even after participants showed significant learning. Huge thanks to my advisor Dr. Ed Vogel!
zhaochong.bsky.social
New paper out now in JEP:G.

"Individual differences in working memory and attentional control continue to predict memory performance despite extensive learning."

psycnet.apa.org/doi/10.1037/xg…
Reposted by Chong Zhao
The Awh Vogel lab is heading to Psychonomics!
We made a page on our lab website to cover everything, including Ed Awh's OPAM keynote address!

awhvogellab.com/conferences#...

Find the great work on display from Chong, Igor, Matthieu, me(?), and Park!
Reposted by Chong Zhao
kirsten-adam.bsky.social
I'm so excited to read this special issue on interactions of working & long-term memory!

My article with @zhaochong.bsky.social and Ed Vogel is linked here, and I'll also be giving a talk on this paper at Psychonomics! #psynom24

link.springer.com/article/10.3...