Graham Flick
@grahamflick.bsky.social
350 followers 390 following 14 posts
NSERC Postdoc @ the Rotman Research Institute, Toronto. Studying language, memory, and neural oscillations. https://sites.google.com/view/grahamflick
Reposted by Graham Flick
scone-neuro.bsky.social
/1 We took our sweet time (~3yrs) to put this into its final shape - but happy to say that the pre-print of an extensive review of brain rhythms in cognition - from a cogneuro perspective - is now available. Please let us know what you think. #neuroskyence doi.org/10.48550/arX...
Brain rhythms in cognition -- controversies and future directions
Brain rhythms seem central to understanding the neurophysiological basis of human cognition. Yet, despite significant advances, key questions remain unresolved. In this comprehensive position paper, w...
Reposted by Graham Flick
laurenhomann.bsky.social
Proud to share the first preprint of my PhD w/ @barense.bsky.social & Mursal Jahed:

“Putting the testing effect to the test in the wild: Retrieval enhances real-world memories and promotes their semantic integration while preserving episodic integrity”

See thread! 🧵 osf.io/preprints/ps...
Reposted by Graham Flick
liinapy.bsky.social
During natural reading with eye movements, the left posterior fusiform cortex reflects both fixated and upcoming words in parallel—and distinguishes whether an upcoming word is skipped or fixated. A new preprint from Graham Flick @grahamflick.bsky.social!
grahamflick.bsky.social
These results suggest that during visual reading, word recognition and integration begin parafoveally, underpinned by the left occipito-temporal system.

AND this system appears to be the first port of call where word processing may rapidly exert downstream influences on eye movement behaviours.
grahamflick.bsky.social
What about word skipping? Here, we examined parafoveal processing of words that were skipped vs. fixated.

This revealed widespread parafoveal effects of frequency and surprisal before skipping, with significantly larger effects preceding skipped words in the left fusiform and middle temporal gyri
grahamflick.bsky.social
We identified putative generators of these influences in left occipito-temporal and ventral temporal areas.

These effects began in posterior areas during parafoveal word processing and shifted anteriorly when the word was fixated.
grahamflick.bsky.social
Matching past work, 2 properties significantly influenced fixation durations: a given word's frequency and its surprisal in the current context.

Next, we asked where and when these properties influenced neural activity, time-locked to fixation onsets.
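For readers curious what this kind of analysis can look like in practice, here is a small illustrative sketch (not the authors' code) of regressing fixation durations on word frequency and surprisal. The file name fixation_report.csv and its columns (duration_ms, log_freq, surprisal) are hypothetical, and a real analysis would typically use mixed-effects models over subjects and items rather than a single pooled regression.

# Illustrative only: do a word's frequency and surprisal predict how long it is fixated?
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-fixation table with one row per fixated word
df = pd.read_csv("fixation_report.csv")  # columns assumed: duration_ms, log_freq, surprisal

# Simple pooled linear model; prior reading research would predict shorter fixations
# for higher-frequency words and longer fixations for higher-surprisal words.
model = smf.ols("duration_ms ~ log_freq + surprisal", data=df).fit()
print(model.summary())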
grahamflick.bsky.social
To try to answer this, we recorded brain activity and eye movements with simultaneous MEG and eye-tracking, while participants naturally read short stories.

This allowed us to analyze neural activity time-locked to more than 22k gaze fixations
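The thread doesn't share analysis code, but as a rough illustration of what "time-locked to gaze fixations" can mean in practice, here is a minimal sketch of fixation-locked MEG epoching. It assumes an MNE-Python workflow, made-up file names (sub-01_task-reading_meg.fif, sub-01_fixation_onsets.txt), and fixation onsets already co-registered to the MEG time base; it is not the authors' pipeline.

# Hedged sketch, not the authors' code: build fixation-locked MEG epochs with MNE-Python.
import numpy as np
import mne

# Hypothetical inputs: a raw MEG recording and fixation onset times (in seconds,
# already aligned to the MEG clock via the simultaneous eye-tracking recording).
raw = mne.io.read_raw_fif("sub-01_task-reading_meg.fif", preload=True)
fixation_onsets_s = np.loadtxt("sub-01_fixation_onsets.txt")

# Convert onsets to an MNE events array: [sample index, 0, event id]
sfreq = raw.info["sfreq"]
samples = (fixation_onsets_s * sfreq).astype(int) + raw.first_samp
events = np.column_stack([samples, np.zeros_like(samples), np.ones_like(samples)])

# Epoch the continuous MEG around each fixation onset (one epoch per fixation)
epochs = mne.Epochs(raw, events, event_id={"fixation": 1},
                    tmin=-0.2, tmax=0.6, baseline=(-0.2, 0.0), preload=True)
print(epochs)

The same events array scales to tens of thousands of fixations; the hard part is the temporal co-registration and deciding how to handle overlapping responses from successive fixations.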
grahamflick.bsky.social
As a visual reader moves their eyes across this sentence, they will linger on certain words longer than others, and skip some entirely.

How does the brain make these decisions, based on the incoming linguistic input, in tens or hundreds of milliseconds?
Reposted by Graham Flick
grahamflick.bsky.social
I'm at CNS presenting new work from my postdoc! Come check out our poster C32 from 5:00-7:00 tonight:

Consistent alignment of saccades and alpha oscillations supports the neural representation and memory encoding of visual objects

w. @rosannaolsen.bsky.social, Jen Ryan, & Jed Meltzer
Reposted by Graham Flick
jerrytang.bsky.social
I'm excited to share our new paper (with @alexanderhuth.bsky.social) on transferring language decoders across participants and modalities!

authors.elsevier.com/a/1kZRD3QW8S...

1/5
Reposted by Graham Flick
mariamaly.bsky.social
How is the memorability of an image influenced by how it makes us feel?

@hartwakeland.bsky.social created an image set (VAMOS) of over 900 scene images, along with their valence, arousal, and memorability ratings. They then showed that *moderately* negative images are more memorable!

(1/2)
Sample images sorted by evoked valence/arousal and memorability. Scene images were selected to span a wide range of average valence and arousal scores. The figure above shows examples of memorable and forgettable images from across the spectra of valence and arousal. Examples include a tornado (negative and memorable), pollution from a factory (negative and forgettable), a field of sunflowers (positive and memorable), a frosty field in early morning (positive and forgettable), a car crash (high arousal and memorable), and a mountain scene (high arousal and forgettable).
grahamflick.bsky.social
Nice study showing that memories for new speech + hand movement associations are linked to the specific words they were learned with, suggesting that "...word or gesture production might reactivate an entire co-speech gesture memory engram"
Reposted by Graham Flick
olejensen.bsky.social
Our review out in TiCS spearheaded by Mathilde Bonnefond on the latest ideas on the functional role of alpha oscillations and distractor inhibition - e.g. we highlight that alpha increases might reflect perceptual target load rather than distractor anticipation: authors.elsevier.com/a/1kFXN_V1r-...
Reposted by Graham Flick
olafhauk.bsky.social
@mrccbu.bsky.social are happy to announce Cognitive Neuroimaging Skills Training In Cambridge (#COGNESTIC) on 15-26 Sep 2025. We will provide training in state-of-the-art methods for open neuroimaging analysis and related methods. Look here for more information: www.mrc-cbu.cam.ac.uk/events/cogne...
Reposted by Graham Flick
katenuss.bsky.social
I'm hiring a full-time lab manager / research tech for my new psychology lab at Boston University, to start this summer (July 2025)!

The lab's research focuses on understanding developmental changes in learning, memory, and exploration.

More details here: cldlab.org/join/

🧠💻 #psychscisky
Detailed job description, which can also be found here: https://cldlab.org/join/
grahamflick.bsky.social
Opportunity to join a really great group!
pripolles.bsky.social
We have an open position for a full-time junior research scientist to work at the Ripollés and Fuentes labs at MARL in the context of an NIH project. The position starts in January/February 2025 and is based in NYC.
Reposted by Graham Flick
nadinedijkstra.bsky.social
JOB ALERT: We are recruiting a research assistant in the Imagine Reality Lab to work on a project using MEG decoding to distinguish between different theories of consciousness 🧠 Get in touch if you have any questions about the role's scientific details. Please share!

www.ucl.ac.uk/work-at-ucl/...
grahamflick.bsky.social
Had a lot of fun teaching this workshop on simultaneous MEG and eye-tracking yesterday!
rosannaolsen.bsky.social
Earlier today, postdoc @grahamflick.bsky.social showed off how he has mastered the collection, temporal coregistration, and analysis of MEG and eye-tracking data, no easy feat! 🧠〰️👀🔗 Very excited to address new questions about how eye movements relate to memory formation!
grahamflick.bsky.social
Not so "new" anymore but still here at the Rotman and the University of Toronto Data Sciences Institute, interested in all things memory, language, aging, and MEG/eye-tracking! 👋
grahamflick.bsky.social
Hi folks, I'm a new PhD from NYU and now a postdoc at the Rotman Research Institute in Toronto 🇨🇦

My research examines memory, language, and aging using MEG & eye-tracking, MRI, and behavioral methods 🧠