Bionic Vision Lab
@bionicvisionlab.org
460 followers 170 following 130 posts
👁️🧠🖥️🧪🤖 What would the world look like with a bionic eye? Interdisciplinary research group at UC Santa Barbara. PI: @mbeyeler.bsky.social‬ #BionicVision #Blindness #NeuroTech #VisionScience #CompNeuro #NeuroAI
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
👁️🧠 New preprint: We demonstrate the first data-driven neural control framework for a visual cortical implant in a blind human!

TL;DR Deep learning lets us synthesize efficient stimulation patterns that reliably evoke percepts, outperforming conventional calibration.

www.biorxiv.org/content/10.1...
Diagram showing three ways to control brain activity with a visual prosthesis. The goal is to match a desired pattern of brain responses. One method uses a simple one-to-one mapping, another uses an inverse neural network, and a third uses gradient optimization. Each method produces a stimulation pattern, which is tested in both computer simulations and in the brain of a blind participant with an implant. The figure shows that the neural network and gradient methods reproduce the target brain activity more accurately than the simple mapping.
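For the curious, here is a minimal sketch of the gradient-optimization branch from the figure, assuming only a differentiable forward model that maps stimulation amplitudes to predicted neural responses. The toy linear model, sizes, and optimizer settings are illustrative stand-ins, not the preprint's (which learns its forward model from the participant's recorded data):

import torch

torch.manual_seed(0)
n_electrodes, n_channels = 16, 64

# Stand-in differentiable forward model: stimulation amplitudes ->
# predicted neural responses. The real model is learned from recordings.
W = torch.randn(n_channels, n_electrodes) * 0.3
def forward(stim):
    return torch.tanh(W @ stim)

target = forward(torch.rand(n_electrodes))      # desired activity pattern
raw = torch.zeros(n_electrodes, requires_grad=True)
opt = torch.optim.Adam([raw], lr=0.05)

for _ in range(500):
    opt.zero_grad()
    stim = torch.sigmoid(raw)                   # keep amplitudes in [0, 1]
    loss = torch.mean((forward(stim) - target) ** 2)
    loss.backward()
    opt.step()

print(f"response mismatch after optimization: {loss.item():.4f}")

An inverse neural network amortizes this loop: rather than optimizing each pattern from scratch, a network is trained to map target responses directly to stimulation patterns.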
bionicvisionlab.org
As federal research funding faces steep cuts, UC scientists are pushing brain-computer interfaces forward: restoring speech after ALS, easing Parkinson’s symptoms, and improving bionic vision with AI (that’s us 👋 at @ucsantabarbara.bsky.social).

🧠 www.universityofcalifornia.edu/news/thrilli...
Thrilling progress in brain-computer interfaces from UC labs
UC researchers and the patients they work with are showing the world what's possible when the human mind and advanced computers meet.
www.universityofcalifornia.edu
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
Excited to share that I’ve been promoted to Associate Professor with tenure at UCSB!

Grateful to my mentors, students, and funders who shaped this journey and to @ucsantabarbara.bsky.social for giving the Bionic Vision Lab a home!

Full post: www.linkedin.com/posts/michae...
Epic collage of Bionic Vision Lab activities. From top to bottom, left to right:
A) Up-to-date group picture
B) BVL at Dr. Beyeler's Plous Award celebration (2025)
C) BVL at The Eye & The Chip (2023)
D/F) Dr. Aiwen Xu and Justin Kasowski getting hooded at the UCSB commencement ceremony
E) BVL logo cake created by Tori LeVier
G) Dr. Beyeler with symposium speakers at Optica FVM (2023)
H, I, M, N) Students presenting conference posters/talks
J) Participant scanning a food item (ominous pizza study)
K) Galen Pogoncheff in VR
L) Argus II user drawing a phosphene
O) Prof. Beyeler demoing BionicVisionXR
P) First lab hike (ca. 2021)
Q) Statue for winner of the Mac'n'Cheese competition (ca. 2022)
R) BVL at Club Vision
S) Students drifting off into the sunset on a floating couch after a hard day's work
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
At #EMBC2025? Come check out two talks from my lab in tomorrow’s Sensory Neuroprostheses session!

🗓️ Thurs July 17 · 8-10AM · Room B3 M3-4
🧠 Efficient threshold estimation
🧑‍🔬 Deep human-in-the-loop optimization

🔗 embc.embs.org/2025/program/
#BionicVision #NeuroTech #IEEE #EMBS
Program – EMBC 2025
embc.embs.org
bionicvisionlab.org
🧠 Building on Roksana Sadeghi’s work: Calibrating retinal implants is slow and tedious. Can Gaussian Process Regression (GPR) guide smarter sampling?

✅ GPR + spatial sampling = fewer trials, same accuracy
🔁 Toward faster, personalized calibration

🔗 bionicvisionlab.org/publications...

#EMBC2025
Efficient spatial estimation of perceptual thresholds for retinal implants via Gaussian process regression | Bionic Vision Lab
We propose a Gaussian Process Regression (GPR) framework to predict perceptual thresholds at unsampled locations while leveraging uncertainty estimates to guide adaptive sampling.
bionicvisionlab.org
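A hedged sketch of the idea using scikit-learn; the synthetic threshold surface, kernel choices, and trial budget below are ours for illustration, not the paper's:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
xy = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float)
# Synthetic smooth threshold surface standing in for real perceptual data:
true_thresh = 50 + 10 * np.sin(xy[:, 0] / 3) + 8 * np.cos(xy[:, 1] / 4)

sampled = list(rng.choice(len(xy), size=5, replace=False))  # seed trials
gp = GaussianProcessRegressor(kernel=RBF(3.0) + WhiteKernel(1.0),
                              normalize_y=True)

for _ in range(15):                        # adaptive sampling loop
    gp.fit(xy[sampled], true_thresh[sampled])
    mu, sd = gp.predict(xy, return_std=True)
    sd[sampled] = 0.0                      # don't re-probe measured sites
    sampled.append(int(np.argmax(sd)))     # measure where GP is least sure

gp.fit(xy[sampled], true_thresh[sampled])
mu = gp.predict(xy)
print(f"mean abs error with {len(sampled)}/100 sites measured: "
      f"{np.abs(mu - true_thresh).mean():.2f}")

The uncertainty-guided pick is what saves trials: smooth regions get interpolated instead of exhaustively measured.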
bionicvisionlab.org
🎓 Proud of our undergrad(!) Eirini Schoinas for leading this:
bionicvisionlab.org/publications...

🧠 Human-in-the-loop optimization (HILO) works in silico—but does it hold up with real people?
✅ HILO outperformed naïve and deep encoders
🔁 A step toward personalized #BionicVision

#EMBC2025
Evaluating deep human-in-the-loop optimization for retinal implants using sighted participants | Bionic Vision Lab
We evaluate HILO using sighted participants viewing simulated prosthetic vision to assess its ability to optimize stimulation strategies under realistic conditions.
bionicvisionlab.org
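As a toy of the human-in-the-loop principle only (HILO itself uses preferential Bayesian optimization over a deep stimulus encoder; the preference model and update rule here are made up for brevity):

import numpy as np

rng = np.random.default_rng(1)
true_params = np.array([0.7, 0.3])   # unknown patient-specific optimum

def preference(a, b):
    # Simulated participant: prefers the setting closer to their optimum.
    da, db = np.linalg.norm(a - true_params), np.linalg.norm(b - true_params)
    return a if da < db else b

best = rng.uniform(0, 1, size=2)     # initial encoder setting
for _ in range(40):                  # one forced-choice judgment per trial
    candidate = np.clip(best + rng.normal(0, 0.1, size=2), 0, 1)
    best = preference(best, candidate)

print("recovered setting:", best.round(2), "target:", true_params)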
bionicvisionlab.org
👁️⚡ Headed to #EMBC2025? Catch two of our lab’s talks on optimizing retinal implants!

📍 Sensory Neuroprostheses
🗓️ Thurs July 17 · 8-10AM · Room B3 M3-4
🧠 Efficient threshold estimation
🧑‍🔬 Deep human-in-the-loop optimization

🔗 embc.embs.org/2025/program/
#BionicVision #NeuroTech #IEEE #EMBS #Retina
Program – EMBC 2025
embc.embs.org
bionicvisionlab.org
This matters. Checkerboard rastering:

✔️ works across tasks
✔️ requires no fancy calibration
✔️ is hardware-agnostic

A low-cost, high-impact tweak that could make future visual prostheses more usable and more intuitive.

#BionicVision #BCI #NeuroTech
bionicvisionlab.org
✅ Checkerboard consistently outperformed the other patterns—higher accuracy, lower difficulty, fewer motion artifacts.

💡 Why? More spatial separation between activations = less perceptual interference.

It even matched performance of the ideal “no raster” condition, without breaking safety rules.
Boxplots showing task accuracy for two experimental tasks—Letter Recognition and Motion Discrimination—grouped by five raster patterns: No Raster (blue), Checkerboard (orange), Vertical (green), Horizontal (brown), and Random (pink). Each colored boxplot shows the median, interquartile range, and individual participant data points.

In both tasks, Checkerboard and No Raster yield the highest median accuracy.

Horizontal and Random patterns perform the worst, with more variability and lower scores.

Significant pairwise differences (p < .05) are indicated by horizontal bars above the plots, showing that Checkerboard significantly outperforms Random and Horizontal in both tasks.

A dashed line at 0.125 marks chance-level performance (1 out of 8).

These results suggest Checkerboard rastering improves perceptual performance compared to conventional or unstructured patterns.
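A quick back-of-envelope check on the spatial-separation claim: minimum distance (in units of electrode pitch) between same-group electrodes on a 10×10 grid split into 5 timing groups. The group assignments are our reading of the patterns, not the paper's exact layouts:

import numpy as np

coords = np.array([(r, c) for r in range(10) for c in range(10)])
horizontal = coords[:, 0] // 2                   # paired rows per group
checker = (coords[:, 0] + 2 * coords[:, 1]) % 5  # interleaved assignment

def min_within_group_dist(groups):
    dists = []
    for g in range(5):
        pts = coords[groups == g]
        diff = pts[:, None, :] - pts[None, :, :]
        d = np.sqrt((diff ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)
        dists.append(d.min())
    return min(dists)

print("horizontal:", min_within_group_dist(horizontal))      # 1.0
print("checkerboard-like:", min_within_group_dist(checker))  # ~2.24

Same-group neighbors sit directly adjacent in the row-based pattern but more than two pitches apart in the interleaved one, consistent with less perceptual interference.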
bionicvisionlab.org
We ran a simulated prosthetic vision study in immersive VR using gaze-contingent, psychophysically grounded models of epiretinal implants.

🧪 Powered by BionicVisionXR.
📐 Modeled 100-electrode Argus-like array.
👀 Realistic phosphene appearance, eye/head tracking.
Diagram showing the four-step pipeline for simulating prosthetic vision in VR.
Step 1: A virtual camera captures the user’s view, guided by eye gaze. The image is converted to grayscale and blurred for preprocessing.
Step 2: The preprocessed image is mapped onto a simulated retinal implant with 100 electrodes. Electrodes are activated based on local image intensity and grouped into raster groups. Raster Group 1 is highlighted.
Step 3: Simulated perception is shown with and without rastering. Without rastering (top), all electrodes are active, producing a more complete but unrealistic percept. With rastering (bottom), only 20 electrodes are active per frame, resulting in a temporally fragmented percept. Phosphene shape depends on parameters for spatial spread (ρ) and elongation (λ).
Step 4: The rendered percept is updated with temporal effects and presented through a virtual reality headset.
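A rough numpy sketch of steps 2-3 (BionicVisionXR itself is a Unity tool, and this ignores elongation λ, axon trajectories, and temporal effects; the toy scene, grid geometry, and ρ value are arbitrary). The lab's pulse2percept Python package implements the full psychophysically grounded phosphene models:

import numpy as np

H = W = 200
frame = np.zeros((H, W)); frame[80:120, 60:140] = 1.0   # toy grayscale scene
ys, xs = np.meshgrid(np.linspace(20, 180, 10),
                     np.linspace(20, 180, 10), indexing="ij")
amps = frame[ys.astype(int), xs.astype(int)]  # sample intensity at electrodes

rho = 8.0                                     # phosphene spatial spread (px)
yy, xx = np.mgrid[0:H, 0:W]
percept = np.zeros((H, W))
for ey, ex, a in zip(ys.ravel(), xs.ravel(), amps.ravel()):
    if a > 0:   # render each active electrode as a Gaussian phosphene
        percept += a * np.exp(-((yy - ey) ** 2 + (xx - ex) ** 2)
                              / (2 * rho ** 2))
percept = np.clip(percept, 0, 1)
print(f"{int((amps > 0).sum())} of 100 electrodes active")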
bionicvisionlab.org
Checkerboard rastering has been used in #BCI and #NeuroTech applications, often based on intuition.

But is it actually better, or just tradition?

No one had rigorously tested how these patterns impact perception in visual prostheses.

So we did.
Raster pattern configurations used in the study, shown as 10×10 electrode grids labeled with numbers 1 through 5, representing five sequentially activated timing groups.

1. Horizontal: Each row of electrodes belongs to one group, with activation proceeding top to bottom.

2. Vertical: Each column is a group, activated left to right.

3. Checkerboard: Electrode groups are arranged to maximize spatial separation, forming a checkerboard-like layout.

4. Random: Group assignments are randomly distributed across the grid, with no spatial structure. This pattern was re-randomized every five frames to test unstructured activation.
Each group is represented with different shades of gray and labeled numerically to indicate activation order.
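One plausible way to construct these assignments in code; the exact pairings (e.g., which rows share a group) and the checkerboard interleave are our assumptions from the figure, not the paper's definitive layouts:

import numpy as np

rng = np.random.default_rng(0)
r, c = np.mgrid[0:10, 0:10]

horizontal = r // 2 + 1                # paired rows, activated top to bottom
vertical = c // 2 + 1                  # paired columns, left to right
checkerboard = (r + 2 * c) % 5 + 1     # interleaved to maximize spacing
random_pat = rng.permutation(np.repeat(np.arange(1, 6), 20)).reshape(10, 10)
# (random_pat would be re-drawn every five frames, per the study design)

# Every pattern assigns exactly 20 electrodes to each of the 5 timing groups:
assert all((pat == g).sum() == 20
           for pat in (horizontal, vertical, checkerboard, random_pat)
           for g in range(1, 6))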
bionicvisionlab.org
👁️🧠 New paper alert!

We show that checkerboard-style electrode activation improves perceptual clarity in simulated prosthetic vision—outperforming other patterns in both letter and motion tasks.

Less bias, more function, same safety.

🔗 doi.org/10.1088/1741...

#BionicVision #NeuroTech
Raster patterns in simulated prosthetic vision. On the left, a natural scene of a yellow car is shown, followed by its transformation into a prosthetic vision simulation using a 10×10 grid of electrodes (red dots). Below this, a zoomed-in example shows the resulting phosphene pattern. To comply with safety constraints, electrodes are divided into five spatial groups activated sequentially across ~220 milliseconds. Each row represents a different raster pattern: vertical (columns activated left to right), horizontal (rows top to bottom), checkerboard (spatially maximized separation), and random (reshuffled every five frames). For each pattern, five panels show how the scene is progressively built across the five raster groups. Vertical and horizontal patterns show strong directional streaking. Checkerboard shows more uniform activation and perceptual clarity. Random appears spatially noisy and inconsistent.
Reposted by Bionic Vision Lab
bionic-vision.org
👁️🧠🧪 Next on the Horizon: Frederik Ceyssens from ReVision Implant on scaling bionic vision to the cortex with Occular, a high-res, deep-brain prosthesis.

Why performance might beat invasiveness - and what comes next:
www.bionic-vision.org/research-spo...

#BionicVision #NeuroTech #BCI
bionic-vision.org | Research Spotlights | Frederik Ceyssens, ReVision Implant
Frederik Ceyssens is Co-Founder and CEO of ReVision Implant, the company behind Occular: a next-generation cortical prosthesis designed to restore both central and peripheral vision through ultra-flex...
www.bionic-vision.org
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
Last but not least is Lily Turkstra, whose poster assesses the efficacy of visual augmentations for high-stress navigation:

Tue, 2:45 - 6:45pm, Pavilion: Poster #56.472
www.visionsciences.org/presentation...

👁️🧪 #XR #VirtualReality #Unity3D #VSS2025
VSS Presentation – Vision Sciences Society
www.visionsciences.org
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
Coming up: Jacob Granley on whether V1 maintains working memory via spiking activity. Prior evidence from fMRI and LFPs - now, rare intracortical recordings in a blind human offer a chance to test it directly. 👁️ #VSS2025

🕥 Sun 10:45pm · Talk Room 1
🧠 www.visionsciences.org/presentation...
Schematic illustrating the phosphenes elicited by an intracortical prosthesis. A 96-channel Utah array is shown, stimulated with biphasic pulse trains. An arrow points to drawings of visual percepts elicited by electrical stimulation. Jacob Granley headshot
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
Our @bionicvisionlab.org is at #VSS2025 with 2 talks and a poster!

First up is PhD Candidate Byron A. Johnson:

Fri, 4:30pm, Talk Room 1: Differential Effects of Peripheral and Central Vision Loss on Scene Perception and Eye Movement Patterns

www.visionsciences.org/presentation...
Example stimulus from Byron's image dataset: an unobscured version (left) depicting a man on a bike approaching a woman trying to cross the bike lane; a simulation of peripheral vision loss (center), where the woman is clearly visible but the man on the bike is obscured; and a simulation of central vision loss (right), where the man on the bike is apparent but the woman is obscured. Byron Johnson headshot
Reposted by Bionic Vision Lab
mbeyeler.bsky.social
Not usually one to post personal pics, but let’s take a break from doomscrolling, yeah?

Some joyful moments from the Plous Award Ceremony: Honored to give the lecture, receive the framed award & celebrate with the people who made it all possible!

@bionicvisionlab.org @ucsantabarbara.bsky.social
Michael Beyeler smiles while receiving the framed Plous Award at UC Santa Barbara. College of Letters & Science's Dean Shelly Gable presents the award in front of a slide thanking collaborators and funders, with photos of colleagues and logos from NIH and the Institute for Collaborative Biotechnologies. The audience watches the moment from their seats.

The lecture hall of Mosher Alumni House is packed as Prof. Beyeler gets started with his lecture titled "Learning to See Again: Building a Smarter Bionic Eye".

Michael Beyeler stands with members of the Bionic Vision Lab in front of a congratulatory banner celebrating his 2024–25 UCSB Plous Award. Everyone is smiling, with some holding drinks, and Michael is holding his young son. The group is gathered outdoors under string lights, with tall eucalyptus trees in the background.
Reposted by Bionic Vision Lab
ucsbengineering.bsky.social
Modern visual prosthetics can generate flashes of light but don’t restore natural vision. What might bionic vision look like?

In CS asst. prof Michael Beyeler's @bionicvisionlab.org, the team explores how smarter, more adaptive tech could move toward a bionic eye that's functional & usable. 👁️
bionic eye glasses
Reposted by Bionic Vision Lab
bionic-vision.org
Virtual Human Retina: A simulation platform designed for studying human retinal degeneration and optimizing stimulation strategies for retinal implants 👁️🧠🧪

doi.org/10.1016/j.br...
Examples of discrete neuronal network models of the human retina, including the central (top) and peripheral retina (bottom). Photoreceptors, bipolar cells, and ganglion cells are shown.
Reposted by Bionic Vision Lab
maxhodak.bsky.social
I had an idea way back in college which I've long thought could be, in many ways, the ultimate BCI technology. What if instead of using electrodes, we used biological neurons embedded in electronics to communicate with the brain?

Enter biohybrid neural interfaces: science.xyz/news/biohybr...
Biohybrid neural interfaces: an old idea enabling a completely new space of possibilities | Science Corporation
Science Corporation is a clinical-stage medical technology company.
science.xyz