Dr. Mar Gonzalez-Franco
@twimar.bsky.social
Computer Scientist and Neuroscientist. Lead of the Blended Intelligence Research & Devices (BIRD) @GoogleXR. ex- Extended Perception Interaction & Cognition (EPIC) @MSFTResearch. ex- @Airbus applied maths
Moreover... we put all of this in an interactive dataset so people can explore each of the techniques and geek around 😎 xrtexttrove.github.io. Some are wild!
March 13, 2025 at 9:54 PM
We will also present “Beyond the Phone: Exploring Context-Aware Interaction Between Mobile and Mixed Reality Devices” by Fengyuan Zhu et al., showcasing the many interesting ways in which we can bring our phones into VR.
Paper:...
April 4, 2025 at 6:00 PM
At @IEEEVR we present “EmBARDiment: an Embodied AI Agent for Productivity in XR” by @riccardobovoHCI et al., showing how powerful AI agents can be when they are aware of the content around us, driven by contextual inputs like gaze.
Web: http://embardiment.github.io
Code: soon
April 4, 2025 at 5:54 PM
We will also present “Beyond the Phone: Exploring Context-Aware Interaction Between Mobile and Mixed Reality Devices” by Fengyuan Zhu et al., showcasing the many interesting ways in which we can bring our phones into VR.
Paper: duruofei.com/papers/Zhu_B...
March 6, 2025 at 5:10 AM
Next week at #IEEEVR we present “EmBARDiment: an Embodied AI Agent for Productivity in XR” by @riccardobovo.bsky.social et al., showing how powerful AI agents can be when they are aware of the content around us, driven by contextual inputs like gaze.
Web: embardiment.github.io
Code: coming soon
March 6, 2025 at 5:08 AM
And prototyping full scenarios of future interactions.
Like the work with Lystbaek et al., “Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction” (UIST 2024):
pure.au.dk/ws/portalfil...
December 30, 2024 at 9:39 AM