Antonin Fourcade
@toninfrc.bsky.social
500 followers · 100 following · 19 posts
PhD student at Max Planck School of Cognition and MPI CBS/MPDCC (Berlin). Background in Biomedical Engineering and Neuroscience. Interested in brain-heart interactions, emotions and VR
Pinned
toninfrc.bsky.social
📢Our peer-reviewed article about the AffectTracker is finally out! 😲🕹️📈
Traditional methods for rating emotion often miss the dynamic, moment-to-moment nature of feelings. We designed a tool to capture this continuous affective experience in real-time during dynamic emotional stimulation.
Frontiers | AffectTracker: real-time continuous rating of affective experience in immersive virtual reality
Subjective experience is key to understanding affective states, characterized by valence and arousal. Traditional experiments using post-stimulus summary rat...
doi.org
Reposted by Antonin Fourcade
cp-trendsneuro.bsky.social
'Brain–body states as a link between cardiovascular and mental health'

by Arno Villringer, Vadim Nikulin & Michael Gaebler @mbe-lab.bsky.social @michaelgaebler.com @mpicbs.bsky.social

www.cell.com/trends/neuro...
Villringer et al., Figure 1: Conceptual framework for brain–body states. Figure 2: Brain–body micro-, meso-, and macro-states can be distinguished on the basis of their duration and reversibility.
Reposted by Antonin Fourcade
martager.bsky.social
Check out our new article for young readers (ages 8-15) on heart-brain interactions and interoception! 🧠🫀

I had so much fun co-writing this with @agatapatyczek.bsky.social @el-rei.bsky.social with the support of @michaelgaebler.com ✍️

👉 Share it widely with curious young minds
Yay for #scicomm
toninfrc.bsky.social
Our studies confirmed AffectTracker is reliable, with high user experience and low interference. It opens new avenues for linking subjective experience to physiological dynamics. The tool is open-source and available on GitHub!
#OpenScience
GitHub - afourcade/AffectTracker: AffectTracker: real-time continuous rating of affective experience in immersive virtual reality
AffectTracker: real-time continuous rating of affective experience in immersive virtual reality - afourcade/AffectTracker
github.com
toninfrc.bsky.social
AffectTracker allows users to continuously rate their valence and arousal during VR experiences. It features customizable feedback options, including a simplified affect grid and a novel abstract shape ("Flubber"), designed to be intuitive and minimally interfering.
toninfrc.bsky.social
👥An amazing team effort by:
@fra-malandrone.bsky.social
@lucyroe.bsky.social
A. Ciston
@thefirstfloor.bsky.social
A. Villringer
S. Carletto
@michaelgaebler.com

#neuroskyence #vr #emotion #affect #selfreports
Reposted by Antonin Fourcade
mbe-lab.bsky.social
📣 We're at the #MindBrainBody Symposium in Berlin, starting today! Looking forward to connecting with everyone and sharing our latest research 🧠

Our group has an exciting lineup of posters - come chat with us! 💬 Check out the previews below to see where and when to meet us 📌

#MBBS24 #neuroskyence
Reposted by Antonin Fourcade
mbe-lab.bsky.social
We centralized our open-science contributions in a new "Tools & Software" section on our website; check out:

- open stimuli (e.g., 3D objects)
- open data (e.g., MindBrainBody)
- tools (e.g., Excite-O-Meter, AffectTracker)
- analysis scripts
- & more

www.cbs.mpg.de/departments/...

#researchtransparency
Tools and Software
www.cbs.mpg.de
toninfrc.bsky.social
The 1-min videos in Study 1 are monoscopic, chosen as intermediate stimuli between static images and long videos to extend the classical short, event-related stimulus approach. Finding suitable free videos was also challenging. Study 2's 23-min video is stereoscopic, a step further in stimulus type
toninfrc.bsky.social
3️⃣The tool offers a novel way to study affective dynamics with minimal interference, effectively capturing the nuances of subjective experiences. It opens new research opportunities to link affective states with physiological dynamics
🌟Stay tuned for the full paper & we welcome feedback & discussions! 💭
toninfrc.bsky.social
2️⃣Empirically evaluated in 2 studies at 2 sites (Berlin & Torino; N = 134) with both shorter 1-min 360° videos (low affective variability [AV] 〰️) and a longer, more dynamic 23-min stimulus (high AV 📈)
Both Grid & Flubber ➡️ high user experience 😃 & low interference with the affective experience itself
toninfrc.bsky.social
1️⃣Participants can rate in real-time and continuously, using the touchpad or joystick of a VR controller 🎮(here: HTC Vive Pro). It comprises three customizable feedback options: a simplified affect grid (Grid), an abstract pulsating variant (Flubber), and no visual feedback (Proprioceptive)
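For the curious, here is a minimal, purely illustrative Python sketch of the core idea: 2D controller input clamped to the valence–arousal grid and timestamped each frame. The names and value ranges are assumptions for illustration, not the actual AffectTracker implementation (see the GitHub repo for that):

```python
import time
from dataclasses import dataclass

@dataclass
class AffectSample:
    t: float        # timestamp (s)
    valence: float  # -1 (negative) .. +1 (positive), horizontal touchpad axis
    arousal: float  # -1 (calm) .. +1 (excited), vertical touchpad axis

def sample_rating(touchpad_x: float, touchpad_y: float) -> AffectSample:
    """Clamp the raw 2D controller input to the affect grid and timestamp it."""
    def clamp(v: float) -> float:
        return max(-1.0, min(1.0, v))
    return AffectSample(time.time(), clamp(touchpad_x), clamp(touchpad_y))

# Polled every frame while the stimulus plays, this yields a continuous trace:
trace = [sample_rating(0.4, -0.2), sample_rating(0.5, 0.1)]
```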
toninfrc.bsky.social
🚀 Preprint out! doi.org/10.31234/osf...
We developed, empirically evaluated and openly share **AffectTracker**, a new tool to collect continuous ratings of two-dimensional (valence and arousal) affective experience **during** dynamic emotional stimulation (e.g., 360° videos) in immersive VR! 🥽🧠🟦
Reposted by Antonin Fourcade
martager.bsky.social
I thought it could be nice to connect the community of researchers exploring body-brain interactions on bsky, so here is the Body-Brain Interactions Starter Pack! 🫀🫁👀🧠 #neuroskyence #academicsky

Let me know if you would like to be added or know someone to add. Enjoy and share!

go.bsky.app/Fwqeu32
Reposted by Antonin Fourcade
mbe-lab.bsky.social
Title: Real-time continuous rating of affective experience in immersive Virtual Reality

P.361 (Session 1)
@toninfrc.bsky.social

In collaboration with Torino University, we developed a fun and intuitive new tool to record moment-by-moment feelings!
Reposted by Antonin Fourcade
mbe-lab.bsky.social
We are coming to Psychologie und Gehirn 2024 (PuG) in Hamburg! Come chat with us! See some teasers in the comments 💬 #PuG2024
toninfrc.bsky.social
The picture was made using the AI image generator DALL-E3
toninfrc.bsky.social
We contribute to shedding light on the complex relationship between emotions & the nervous system (or MindBrainBody coupling) under naturalistic stimulation. 🌟 Stay tuned for the full paper & we’re very happy about feedback and discussions! 💭
(Illustration: DALL-E3)
toninfrc.bsky.social
However, whole-brain exploratory analyses revealed a temporo-occipital cluster, where higher EA was linked to decreased 🧠➡️🫀 brain-to-heart (gamma→HF-HRV) and increased 🫀➡️🧠 heart-to-brain (LF-HRV→gamma) information flow.
toninfrc.bsky.social
4️⃣ Physiological modeling (using a method by @diegocandiar and others) did not provide evidence for our hypothesis that higher EA changes the bidirectional information flow between HF-HRV & posterior alpha power. 🧠🔁🫀
toninfrc.bsky.social
3️⃣ Combining🧠EEG & 🫀 ECG, higher EA was also associated with lower heartbeat-evoked potential (HEP) amplitudes in a left fronto-central electrode cluster. This may indicate that stronger emotional states change the importance of signals from the outer world & the inner body.
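For readers who want the gist of the HEP measure: it is the EEG averaged over epochs time-locked to each heartbeat's R-peak. A rough NumPy sketch of that standard computation follows; the function name, epoch window, and details are assumptions, not the study's actual pipeline:

```python
import numpy as np

def hep(eeg: np.ndarray, r_peak_idx: np.ndarray, fs: float,
        tmin: float = -0.1, tmax: float = 0.6) -> np.ndarray:
    """eeg: (n_channels, n_samples). Returns the HEP, (n_channels, n_times)."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    # Cut an epoch around every R-peak that fits fully inside the recording
    epochs = [eeg[:, p - pre:p + post]
              for p in r_peak_idx if p - pre >= 0 and p + post <= eeg.shape[1]]
    return np.mean(epochs, axis=0)  # average EEG time-locked to the heartbeat
```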
toninfrc.bsky.social
2️⃣ Higher EA was linked to 🫀 lower vagal cardioregulation (high-frequency heart rate variability, HF-HRV) and🧠lower posterior (parieto-occipital) alpha power.
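As a rough illustration of the HF-HRV measure (spectral power of RR-interval fluctuations in the 0.15–0.4 Hz band), here is a generic sketch; the resampling rate and window are assumptions, not the study's pipeline:

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hf_hrv_power(r_peak_times: np.ndarray, fs: float = 4.0) -> float:
    """HF-HRV from R-peak times (s): RR power in the 0.15–0.4 Hz band."""
    rr = np.diff(r_peak_times)               # RR intervals (s)
    t = r_peak_times[1:]                     # timestamp of each interval
    grid = np.arange(t[0], t[-1], 1.0 / fs)  # uniform grid for the PSD
    rr_even = interp1d(t, rr, kind="cubic")(grid)
    f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    band = (f >= 0.15) & (f <= 0.4)
    return float(np.sum(psd[band]) * (f[1] - f[0]))  # integrate PSD over HF band
```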
toninfrc.bsky.social
1️⃣ 29 healthy adults experienced a virtual 🎢 while we recorded the electrical activity in their 🧠 (#EEG) and 🫀 (#ECG). They then continuously rated the intensity of their emotional experience (emotional arousal, EA) while viewing a replay.