Michelle Ramey
@michelleramey.bsky.social
85 followers 100 following 10 posts
Assistant professor studying episodic memory and how it interacts with visual attention, schemas, and aging (using eyetracking and computational modeling) | https://michellemramey.com/
Pinned
michelleramey.bsky.social
Our new paper on how episodic memory and semantic knowledge interact to influence eye movements during search is out now in Psychonomic Bulletin & Review, with @jmhenderson.bsky.social and Andy Yonelinas! (summary below) link.springer.com/article/10.3...
#psynomPBR @psychonomicsociety.bsky.social
Episodic memory and semantic knowledge interact to guide eye movements during visual search in scenes: Distinct effects of conscious and unconscious memory - Psychonomic Bulletin & Review
Episodic memory and semantic knowledge can each exert strong influences on visual attention when we search through real-world scenes. However, there is debate surrounding how they interact when both are present; specifically, results conflict as to whether memory consistently improves visual search when semantic knowledge is available to guide search. These conflicting results could be driven by distinct effects of different types of episodic memory, but this possibility has not been examined. To test this, we tracked participants’ eyes while they searched for objects in semantically congruent and incongruent locations within scenes during a study and test phase. In the test phase containing studied and new scenes, participants gave confidence-based recognition memory judgments that indexed different types of episodic memory (i.e., recollection, familiarity, unconscious memory) for the background scenes, then they searched for the target. We found that semantic knowledge consistently influenced both early and late eye movements, but the influence of memory depended on the type of memory involved. Recollection improved first saccade accuracy in terms of heading towards the target in both congruent and incongruent scenes. In contrast, unconscious memory gradually improved scanpath efficiency over the course of search, but only when semantic knowledge was relatively ineffective (i.e., incongruent scenes). Together, these findings indicate that episodic memory and semantic knowledge are rationally integrated to optimize attentional guidance, such that the most precise or effective forms of information available – which depends on the type of episodic memory available – are prioritized.
link.springer.com
Reposted by Michelle Ramey
drjenryan.bsky.social
Memory problems will change how you see the world...literally 👀

Across two new papers, we examined the eye movement patterns of younger adults, older adults, individuals with mild cognitive impairment, and amnesic cases.

1/5
michelleramey.bsky.social
3) This indicates that imagery can modify memory to accommodate anticipated changes, improving the ability to detect that a familiar face is present—but not the ability to pick that face out of a lineup. These findings thus identify a novel dissociation between old/new and forced-choice recognition.
michelleramey.bsky.social
2) At study, participants saw neutral faces and were cued to imagine them in happy or angry expressions. At test, old and new faces were shown as happy or angry. When old faces' test expression matched the imagined expression, old/new recognition was better—but forced-choice accuracy was unaffected.
michelleramey.bsky.social
1) Given that items don't look exactly the same at encoding and retrieval in real-world recognition—including consequential uses of memory, like eyewitness memory—Darya Zabelina and I examined whether visual imagery could be used to improve our ability to recognize people across appearance changes.
michelleramey.bsky.social
New paper out! Imagery can directionally modify memory encoding to manipulate later recognition of changed faces. Essentially, imagery can be used to simulate the effects of higher (or lower) study-test similarity for an item itself. @psychonomicsociety.bsky.social link.springer.com/article/10.1...
Using visual imagery to manipulate recognition memory for faces whose appearance has changed - Cognitive Research: Principles and Implications
Real-world recognition requires our memory system to accommodate perceptual changes that occur after encoding; for example, eyewitnesses must recognize perpetrators across changes in appearance. However, it is not clear how this flexible recognition ability can be improved: Standard encoding strategies not only tend to be ineffective, but can in fact be detrimental for recognizing people across appearance changes. Given the effectiveness of visual imagery in creating and modifying memory representations, we examined whether counterfactual visual imagery could be used to manipulate flexible recognition by simulating an increase in encoding–retrieval similarity. Across two experiments, participants (n = 317) encoded faces with neutral expressions and were cued to imagine the faces with either happy or angry expressions. During later retrieval, participants saw lineups of old and new faces with either happy or angry expressions, and selected the old face and provided recognition confidence. Old/new recognition discriminability and confidence were higher when a face’s expression at retrieval matched the expression that it was imagined in during encoding (i.e., congruent imagery); interestingly, however, there was Bayesian evidence for no benefit of imagery congruence for face-choice accuracy. Moreover, congruent imagery improved recognition for old arrays irrespective of whether participants correctly selected the old face, suggesting that the imagery manipulation influenced a diffuse sense of recognition without influencing the ability to attribute that sense of recognition to a specific stimulus. Together, these findings indicate that visual imagery can directionally manipulate recognition for changed faces and produces a novel dissociation between old/new recognition and forced-choice accuracy.
link.springer.com
Reposted by Michelle Ramey
dkvarga.bsky.social
So happy to share our paper on the role of the hippocampus as a mismatch detector:
doi.org/10.1073/pnas...

We show that the hippocampus detects mismatches between ongoing experiences and episodic memories but not generalised schematic knowledge.

See 🧵 for how we got here:
#neuroskyence #PsychSciSky
Reposted by Michelle Ramey
matthiasnau.bsky.social
Very cool, and a strong effect for me!
It reminded me of this amazing fovea visualizer that I saw on the other platform a few years ago. Open it, make it full screen, and see the extent of your fovea! 👀 www.shadertoy.com/view/4dsXzM
Reposted by Michelle Ramey
mariamaly.bsky.social
We make predictions based on general knowledge and/or specific memories. Different brain areas are active when these distinct predictions are violated – and hippocampus selectively responds to prediction errors based on episodic memory.

Cool work by @chrismbird.bsky.social @ayab.bsky.social et al!
Hippocampal mismatch signals are based on episodic memories and not schematic knowledge | PNAS
Prediction errors drive learning by signaling mismatches between expectations and reality, but the neural systems supporting these computations rem...
www.pnas.org
Reposted by Michelle Ramey
woodforbrains.bsky.social
Cortico-hippocampal interactions underlie schema-supported memory encoding in older adults

New paper led by @shenyanghuang.bsky.social!
academic.oup.com/cercor/artic...

Older adults' memory benefits from richer semantic contexts. We found connectivity patterns supporting this semantic scaffolding.
Reposted by Michelle Ramey
natrevpsychol.nature.com
Adaptive compression as a unifying framework for episodic and semantic memory

Perspective by David G. Nagy (@davidnagy.bsky.social), Gergő Orbán & Charley M. Wu (@thecharleywu.bsky.social)

Web: go.nature.com/3ZkmRLb
PDF: rdcu.be/epAQ0
Reposted by Michelle Ramey
dorsaamir.bsky.social
Does the culture you grow up in shape the way you see the world? In a new Psych Review paper, @chazfirestone.bsky.social & I tackle this centuries-old question using the Müller-Lyer illusion as a case study. Come think through one of history's mysteries with us 🧵 (1/13):
michelleramey.bsky.social
4) These results led us to propose a new theory of attentional guidance, which we term rational integration: different sources of information, in this case episodic memory and semantic knowledge, are rationally combined and prioritized based on their relative strength/precision to guide attention.
michelleramey.bsky.social
3) When only unconscious memory was available—i.e., cases in which participants exhibited memory-driven performance improvements despite a confident lack of awareness of that memory—memory guided search only when semantic knowledge had failed to get the eyes to the target (i.e., incongruent scenes).
michelleramey.bsky.social
2) We manipulated semantic knowledge via schema congruency (objects in congruent vs incongruent scene locations), and measured recognition memory for the scenes. When detailed recollection was available, memory was integrated with semantic knowledge to guide early eye movements during search.
michelleramey.bsky.social
1) The extent to which episodic memory guides visual search when semantic knowledge is available is debated. We found that whether memory influences search depends on what type of memory is available.
michelleramey.bsky.social
Our new paper on how episodic memory and semantic knowledge interact to influence eye movements during search is out now in Psychonomic Bulletin & Review, with @jmhenderson.bsky.social and Andy Yonelinas! (summary below) link.springer.com/article/10.3...
#psynomPBR @psychonomicsociety.bsky.social
Reposted by Michelle Ramey
drjenryan.bsky.social
Can you predict the future? Your brain and your eyes can.
🧵

I had the honour of writing a @currentbiology.bsky.social Dispatch in which I discuss exciting new findings from @philippbuchel.bsky.social, Klingspohr, Kehl & Staresina (2024).

Read the Dispatch here:
doi.org/10.1016/j.cu...

1/3
michelleramey.bsky.social
For my first post, I thought I'd share the first-ever photos of my lab as a PI, which we took last week :) (+ the fact that I'm recruiting more PhD students for Fall 2025!) memlab.uark.edu