O'Connell Lab
@connelllab.bsky.social
330 followers 43 following 14 posts
Prof Redmond O'Connell's lab, Trinity College Institute of Neuroscience. Seeking to understand the neural mechanisms underpinning high-level cognition. https://oconnell-lab.com/home/opportunities/
connelllab.bsky.social
3/3 Here we show that their deconvolution approach eliminates these same signatures when applied to a ground-truth EA signal. We also recap the many other signatures of sensory EA that the CPP has been shown to exhibit.
connelllab.bsky.social
2/3 Frömer et al. (2024) used a signal deconvolution method to show that one signature of evidence accumulation (EA) observed in the centro-parietal positivity (CPP) - trial-averaged response-locked buildup effects - could arise artifactually from overlapping stimulus- and response-locked components.
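For intuition on the overlap argument, here is a minimal, purely illustrative simulation sketch in Python. It is a toy example, not the deconvolution analysis from either paper, and the component shapes, timings, and RT distribution are assumptions. It shows how two non-ramping transients, one stimulus-locked and one response-locked, can produce an apparent buildup in the response-locked trial average once RTs vary across trials.

```python
# Toy simulation (assumed component shapes, timings, and RT distribution;
# not the analysis from Frömer et al. or the commentary).
# No single trial contains a ramping (accumulation) signal, yet the
# trial-averaged response-locked waveform still rises toward the response.
import numpy as np

rng = np.random.default_rng(0)
fs = 250                              # sampling rate (Hz)
n_trials = 500
t = np.arange(int(2.5 * fs)) / fs     # 2.5 s of simulated signal per trial

def bump(center, width=0.08):
    """A transient Gaussian component centred at `center` seconds."""
    return np.exp(-0.5 * ((t - center) / width) ** 2)

rts = rng.uniform(0.7, 1.6, n_trials)   # variable reaction times (s)
win = int(0.6 * fs)                     # response-locked window: -600 ms to the response

resp_locked = np.zeros((n_trials, win))
for i, rt in enumerate(rts):
    # Stimulus-locked transient at 200 ms plus a response-locked transient: no ramp.
    trial = bump(0.2) + 0.8 * bump(rt)
    r = int(rt * fs)
    resp_locked[i] = trial[r - win:r]

avg = resp_locked.mean(axis=0)
# Positive slope over the final 400 ms: apparent buildup in the average
# despite the absence of accumulation on any individual trial.
slope = np.polyfit(np.arange(win)[-100:] / fs, avg[-100:], 1)[0]
print(f"apparent buildup slope: {slope:.2f} a.u./s")
```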
connelllab.bsky.social
1/3 Check out our new commentary bsky.app/profile/imag....
imagingneurosci.bsky.social
New paper in Imaging Neuroscience by Redmond G. O’Connell, Simon P. Kelly, et al:

Regressing away common neural choice signals does not make them artifacts: Comment on Frömer et al. 2024

doi.org/10.1162/IMAG...
connelllab.bsky.social
Check out our new paper, in which we identify a model that can jointly account for the timing and accuracy of perceptual choices, the timing and level of subsequent confidence judgments, and the pre- and post-choice dynamics of neural decision signals.
Reposted by O'Connell Lab
rldmdublin2025.bsky.social
Exciting news - early bird registration is now open for #RLDM2025!

🔗 Register now: forms.gle/QZS1GkZhYGRF...

Register now to save €100 on your ticket. Early bird prices are only available until 1st April.
Reposted by O'Connell Lab
rldmdublin2025.bsky.social
📢 Call for Abstracts 📢

Submit your extended abstracts on "learning and decision-making over time to achieve a goal" to #RLDM2025. Successful submissions will be selected for poster or oral presentation to an interdisciplinary audience.

🗓️ Deadline: Jan 15th
🔗 Learn more: rldm.org/call-for-abs...
[Image: Call to action requesting poster abstracts be submitted to the RLDM 2025 conference by the January 15th deadline. The poster is stylised in the pink and blue RLDM colours and features the RLDM brain/robot mascot standing on a "submit now" sticker.]
connelllab.bsky.social
Our results have interesting implications for modelling two-choice and continuous-outcome dot motion tasks, and open the door to future research that further develops our understanding of ODMR.
5/5
connelllab.bsky.social
We found evidence supporting the involvement of temporal filters in ODMR, with the higher frame rate display appearing to induce more ODMR than the lower frame rate display. Interestingly, the confidence data distinguished ODMR from both correct and error responses.
4/5
connelllab.bsky.social
Inspired by Bae and Luck's 2022 Visual Cognition paper, we investigated whether changing the display frame rate would affect ODMR rates, hypothesising that temporal filtering may be involved. We also gathered confidence data to compare ODMR with correct and error responses.
3/5
connelllab.bsky.social
While piloting a related study, Pat was frustrated to find himself consistently responding in the direction opposite to the true dot motion. A dive into the literature revealed he was not the only one responding this way.
2/5
connelllab.bsky.social
New paper from our very own @patmckeown.bsky.social in Visual Cognition, investigating opposite direction motion reports (ODMR) in random dot kinematograms (RDKs). We looked at the role of display frame rate and confidence in these peculiar reports.
1/5 w/ Elaine Corbett and @redmondoconnell.bsky.social
connelllab.bsky.social
We're hiring for a 2-year postdoctoral researcher position! Come join our ERC-funded project developing neurally-informed models of perceptual decision making and metacognition at Trinity College Dublin.

oconnell-lab.com/home/opportu...
Reposted by O'Connell Lab
redmondoconnell.bsky.social
We're hiring! The O'Connell Lab is offering a 2-year postdoctoral position to join our team and work on our ERC-funded project developing neurally-informed models of perceptual decision making and metacognition. Full info here: oconnell-lab.com/home/opportu... Feel free to DM us with any questions.