Show Notes
Today’s episode features one of the NIH Director’s New Innovator Award winners. It’s a unique project that combines artificial intelligence, virtual reality, and visual prostheses – retinal and brain implants referred to as “bionic eyes” – in hopes of one day providing better assistive technology for incurable blindness, which affects about 40 million people worldwide.
On this episode of the NNLM Discovery podcast, we talk to Dr. Michael Beyeler, assistant professor of computer science and psychological and brain sciences at UC Santa Barbara, about how his Bionic Eye Lab is combining A.I. smarts with a visual prosthesis to improve the quality of life for people who are blind.
You can view a short video about the story here on the NLM YouTube Channel.
The NNLM is the outreach arm of the National Library of Medicine with the mission to advance the progress of medicine and improve the public health by providing all U.S. health professionals with equal access to biomedical information and improving the public's access to information to enable them to make informed decisions about their health. The seven Health Sciences Libraries function as the Regional Medical Library (RML) for their respective regions, with Region 7 consisting of: Connecticut, Massachusetts, Maine, New Hampshire, New York, Rhode Island and Vermont.
All of the artwork for this podcast series has been created with a generative AI image-to-image tool! The prompt for this episode was “Silhouette of an abstract painting of a blind person who can now see because of a smart bionic eye powered with artificial intelligence.”
We want your feedback! Please click on this link to offer your opinions about the NNLM Podcast!
Transcript
00:00:03:12 - 00:00:33:16
Yamila El-Khayat
I’m librarian Yamila El-Khayat, and this is NNLM Discovery, a podcast from the Network of the National Library of Medicine. Today’s episode is “Towards a Smart Bionic Eye,” a story featuring one of the NIH Director’s New Innovator Award winners. I'll be your host as we talk about a unique project that features both artificial intelligence and virtual reality in hopes of one day providing better assistive technology for incurable blindness that affects about 40 million people worldwide.
00:00:33:18 - 00:01:04:12
Yamila El-Khayat
We need just a little setup before we jump into this story. A visual prosthesis, often referred to as a bionic eye, already exists. It’s a new technology that we’ll be describing quite a bit in this story, but the focus of this NIH project is to take the current, limited bionic eye to the next level! Today's story takes place near the beach on the campus of UC Santa Barbara, where you can see the waves crashing from the Pacific Ocean as you enter the campus.
00:01:04:14 - 00:01:19:07
Yamila El-Khayat
I talked with Project Lead Dr. Michael Beyeler, assistant professor of computer science and psychological and brain sciences, about this unique and exciting project. So Dr. Beyeler, what is the goal of your project, Towards a Smart Bionic Eye?
00:01:19:09 - 00:01:46:12
Michael Beyeler
The goal of this project is to conduct the fundamental research in computer science, as well as neuroscience and human-computer interaction, that will one day allow us to build what I like to call a smart bionic eye, which is the idea of combining A.I. smarts with a visual prosthesis for people who are blind. And the idea of a visual prosthesis is really to replace lost functionality with an implant.
00:01:46:14 - 00:02:16:23
Michael Beyeler
So these devices are developed for people who have been able to see for most of their lives, but then lost their vision, perhaps due to a degenerative hereditary disease such as retinitis pigmentosa, or perhaps lost an eye in an accident. And so the idea of current implants, which, by the way, have been implanted in 500 people worldwide, is to replace this lost functionality by electrically stimulating the surviving neurons either in the eye or in the visual cortex.
00:02:17:01 - 00:02:27:08
Michael Beyeler
I feel like a lot of the research so far has focused on the technical aspects of these devices, but less so on the usability of them.
00:02:27:10 - 00:02:31:15
Yamila El-Khayat
And what's the problem? What's wrong with the usability of these bionic eyes?
00:02:31:17 - 00:02:55:13
Michael Beyeler
Even though these devices are already out there, the vision they can provide is rather limited. And we know from interacting with users of these devices that what they can see is mostly blurry blobs and shapes, sometimes described as like a firework of different visual stimuli, that are really hard to put together to make sense of the scene around them.
00:02:55:14 - 00:03:10:14
Michael Beyeler
And so that's really where we come in. We were wondering: rather than working towards one day restoring natural vision, would it be better to find ways to provide practical and useful artificial vision today?
00:03:10:16 - 00:03:15:02
Yamila El-Khayat
That sounds amazing, but what is artificial vision, and what is this grant doing?
00:03:15:06 - 00:03:48:08
Michael Beyeler
Yeah, so the main idea behind this grant is to think of these prostheses less as a way to restore natural vision and more as a way to provide useful visual cues that help you do everyday tasks. What I mean by that is maybe it doesn't matter as much that the vision provided doesn't look natural. What is really important is that people get the cues needed to go from A to B, or to have a conversation, or to find a door in real life.
00:03:48:10 - 00:04:17:17
Yamila El-Khayat
Michael explained to me the history of the bionic eye. There are two devices out there, the Argus II and the Orion. The Argus II is a retinal implant, but this implant can only work with patients who have retinal diseases. The Orion is an implant in the visual cortex of your brain. This is perfect for patients who may have lost their vision in an accident and who have no vision to salvage. The Orion brain implant technology is currently only in six patients.
00:04:17:19 - 00:04:22:15
Yamila El-Khayat
Luckily, Dr. Beyeler was able to connect and work with one of these trial patients.
00:04:22:17 - 00:04:49:18
Michael Beyeler
One voice that has been hugely important to this research is the voice of Jason, who is one of the Orion recipients. Jason is very tech savvy, but he's also very honest about the limitations of the current technology. And I think that is hugely important to designing the next generation of these devices to really learn from our previous mistakes and address them in a way that is useful to the end user.
00:04:49:20 - 00:05:13:06
Jason Esterhuizen
My name is Jason Esterhuizen. I'm originally from South Africa and I now live in Los Angeles. I was sighted until I was 23 years old. I had 20/20 vision. I was flying airplanes, driving motorcycles, racing cars, just being a 23 year old reckless person until I was involved in a car accident that left me blind.
00:05:13:08 - 00:05:47:10
Jason Esterhuizen
When I woke up after the car accident, I had been in an induced coma for a couple of weeks. The doctors came to me and just told me that I had lost my right eye, and that the optic nerve of my left eye got torn or damaged, and that there was no way that I could medically fix it. Blindness is a spectrum, and I am one of the 1% of blind people who have zero light perception, zero vision.
00:05:47:12 - 00:06:00:01
Jason Esterhuizen
I only see black. Some people who are legally blind might still be able to drive a car with corrected vision, but for me, I'm totally blind. Zero light perception.
00:06:00:03 - 00:06:19:03
Yamila El-Khayat
So Jason traveled across the world from South Africa to Los Angeles to see if he could be a part of this first clinical trial for the Orion device. Luckily, he was chosen and he relocated his life to California while he was a part of this trial. Here's Jason describing the Orion device and what it's like to use it.
00:06:19:05 - 00:06:45:12
Jason Esterhuizen
So the Orion cortical implant is a medical device that gets implanted onto the surface of your visual cortex in the back of your brain. That's the part of your brain that creates vision. It gets electrically stimulated by an external device, which is a pair of sunglasses with a video camera. So the video camera would pick up whatever is in front of you.
00:06:45:14 - 00:07:10:15
Jason Esterhuizen
It would then send the data to a VPU, a visual processing unit. The processing unit would send it to the coil, the external coil, which then speaks to the internal device implanted in your brain. And it creates light perception, something that they call phosphenes. So it's these tiny little flickers of light that move around and tell you what's in front of you.
00:07:10:16 - 00:07:35:22
Jason Esterhuizen
It's sort of like learning a new language. And the closest I could explain what I see is like looking up at the stars at night. It's black background with these flickering white lights just moving around. For me to have regained the ability to see light or identify light sources is 100% improvement from seeing absolutely nothing.
00:07:36:00 - 00:07:53:03
Yamila El-Khayat
We stopped our interview for a moment with Jason as he put on the Orion device for us, which didn't take much time. He put on the sunglasses which have the small camera in the middle, put the coil against his head with an elastic band and powered on the visual processing unit, which is a small box connected to the glasses.
00:07:53:05 - 00:08:05:19
Yamila El-Khayat
Here's Jason again describing what he sees now that the Orion glasses are turned on and the electric impulses are now being wirelessly sent to the array of electrodes implanted in his brain.
00:08:05:21 - 00:08:40:16
Jason Esterhuizen
Just to... Okay. So at the moment, what I'm seeing now are flickers of light all over. The room is very bright at the moment, so everything bright is lighting up. So I see a bunch of slivers of light everywhere against the wall, and I assume that is the sunshine coming through the sliding door onto the walls. And then all of the dark gaps that I see, those might be objects in the room: might be a door, might be a couch, might be a human.
00:08:40:18 - 00:09:05:00
Jason Esterhuizen
And at night it might be the complete opposite. If it's very dark outside, I would be picking up all of the bright objects. I see a very dark object over there. It might be a human, might be a couch, might be anything. So this is where Dr. Beyeler's work would play a very crucial role in filling these gaps for me.
00:09:05:01 - 00:09:44:22
Jason Esterhuizen
I can see something, but I don't know what it is. But with A.I. or with object recognition technology integrated into this device, I would be able to look at it and it would be able to tell me that's a human, that's a car, that's a trashcan, whatever. So I think the advantage of integrating Dr. Beyeler's work with the current implantable devices is that it would help give you a fuller picture of what's going on around you, using, for instance, GPS, or facial recognition, or object recognition in combination with these implantable devices.
00:09:44:22 - 00:09:48:01
Jason Esterhuizen
It would be such a great help.
00:09:48:03 - 00:10:06:13
Yamila El-Khayat
When we interviewed Dr. Beyeler he was sitting in front of a small, round, very cluttered table, which consisted of a half played chess board, books, a small potted plant, wooden blocks, and even his car keys. Here's Michael describing how his smart bionic eye could help in real world scenarios.
00:10:06:14 - 00:10:25:21
Michael Beyeler
If you think about it, our world is cluttered, usually. It's very rich. For example, if I look at this table, I see many things. There's the books here. There's the chessboard. And if I just had to paint this picture given only a hundred pixels, I would not be able to realize what I'm looking at.
00:10:25:23 - 00:10:49:00
Michael Beyeler
And so the smart bionic eye instead could realize that what is important right now are the objects on this table. It could highlight the books, or perhaps I have misplaced my keys, and so I could ask my bionic eye, “Hey, where are my keys?” And while I look around the room, I scan the room, the implant would highlight the keys visually, allowing me to find them quickly.
00:10:49:02 - 00:10:57:15
Yamila El-Khayat
And how do you know what you're testing is similar to what these bionic eye users are actually seeing, especially when you're talking about implants in the brain?
00:10:57:17 - 00:11:30:13
Michael Beyeler
A big advantage of our approach is that we are using simulations of prosthetic vision, which we have spent many years developing using both psychophysical data and neurophysiological data. So these simulations are fairly sophisticated and they're well tested. What we can use them for is not just to get a better understanding of the kind of vision we can produce, but we have also embedded them in virtual reality, which allows anyone to put on the VR goggles and see through the eyes of the patient.
00:11:30:15 - 00:11:57:20
Michael Beyeler
This allows us to recreate scenes in virtual environments that might otherwise be too dangerous. Let's say you're trying to cross the street. You don't want to do that in real life, but you want to practice in virtual reality. So this gives us a testbed for these different ideas and augmentation strategies. And then once we find something that works in VR, we can spend the time and effort to go out and test it on real bionic eye patients.
00:11:57:22 - 00:12:28:12
Michael Beyeler
Putting on the VR goggles myself really changed my view about how these devices work, because it's easy to simulate some pixels on a screen, but that doesn't capture the real experience of these patients. What is really happening is the current implants produce a very limited field of view. So it's like watching TV from across the room. And that fundamentally changes how you use the device, because now you have to use your head to scan the scene and you will only get one piece at a time of the scene.
00:12:28:12 - 00:12:37:17
Michael Beyeler
And you kind of have to put that together. And putting the goggles on myself, I could really experience that and realize how hard this really is.
00:12:37:19 - 00:13:01:15
Yamila El-Khayat
Michael took me to the simulation room and had me try the VR, and it was very difficult to do anything. The task was to cross a sidewalk without hitting any obstacles. I think I hit about every possible obstacle out there. It was very frustrating. Jason described it well. It's definitely a new way of interpreting vision. It isn't anything like seeing as I know it as a sighted person.
00:13:01:16 - 00:13:23:12
Yamila El-Khayat
It's more like decoding patterns, and you really have to scan your head around to process all the details. The thing that also shocked me is that the VR is only in one eye, so my whole sense of depth perception was totally off. After 5 minutes, I couldn't handle it anymore. My favorite part of Dr. Beyeler’s research was meeting his team.
00:13:23:14 - 00:13:28:20
Yamila El-Khayat
Here's Michael again, explaining why this team is so unique.
00:13:28:22 - 00:13:46:20
Michael Beyeler
The important part of our approach is that it shouldn't just be me, a sighted researcher, sitting in my office thinking about what would be helpful, but to actually incorporate the help of these bionic eye users and blind researchers at all stages of development.
00:13:46:22 - 00:13:55:13
Yamila El-Khayat
We interviewed one of his blind researchers, Lucas Gil Nadolskis, a UC Santa Barbara Ph.D. student. Here's Lucas.
00:13:55:15 - 00:14:24:12
Lucas Gil Nadolskis
A lot of research that has been done for blind people has been done by sighted people. And the problems with that vary, the main one being that sighted people don't know the challenges that we have. And that was when I decided that I wanted to go into research and kind of be on the other side of the table and help with this connection between blind people and the research.
00:14:24:14 - 00:14:39:18
Yamila El-Khayat
We took Michael and Lucas outside the lab to the middle of a wide open, large concrete sidewalk where bikes, students, and skateboards whizzed by to talk about some of the challenges faced by the visually impaired navigating the college campus.
00:14:39:19 - 00:14:51:15
Michael Beyeler
Now, see, I think this is a great example of how we can help people get around. Right? We dropped you off right next to the parking lot and we're in this wide open space. Now what? How would you get around?
00:14:51:17 - 00:15:11:18
Lucas Gil Nadolskis
The challenge here is that this is an open area. Yeah. If I was, like, trailing a wall or grass or something, yeah, that would be easy, because I would know where I was going, and I would know if I was going straight or right or whatever. But here it’s just open. So knowing whether you are going straight or not.
00:15:11:19 - 00:15:38:12
Lucas Gil Nadolskis
It's really hard. And especially because the bikes don't only use a straight path, it's kind of angled, it makes it really confusing. Because one of the techniques for orientation and mobility is that you use the parallel sounds if you don't have anything to trail. But here there are a bunch of sounds coming from different directions, and this is why it's so hard to use any of them as a parallel sound.
00:15:38:13 - 00:15:46:13
Michael Beyeler
Yeah, especially on a busy campus where everyone is running late and no one is paying attention and the environment is different every day.
00:15:46:15 - 00:16:11:05
Lucas Gil Nadolskis
Yeah, you know, depending on where you are, there are different challenges. For example, I used to live in Minneapolis, and the snow was a huge challenge, because if you are on the snow you don't hear a lot of sounds. They're mostly muffled. So I actually walked on the train tracks a couple of times when I was living in Minneapolis, because I didn't know where the street was or where the sidewalk was or anything.
00:16:12:05 - 00:16:14:02
Michael Beyeler
And there's currently no technology that could help you with that?
00:16:14:06 - 00:16:16:02
Lucas Gil Nadolskis
No.
00:16:16:04 - 00:16:24:17
Yamila El-Khayat
I asked Lucas how he's helping with Michael's research and how this smart bionic eye might fit in with his other mobility assistance devices.
00:16:24:19 - 00:16:55:20
Lucas Gil Nadolskis
There is a reason why we've been using the cane for more than half a century. Alright, because it works. The cane works, the dog works, we have iPhone apps that do amazing stuff, you know. So when we are thinking about an implant, it's really important for us to understand exactly in which niche the implant would fit. You know, if I have an implant that would only tell me when I am getting close to a wall, I'm not going to use that, because the cane does that.
00:16:55:22 - 00:17:26:00
Lucas Gil Nadolskis
But if I have an implant that can tell me what's on my right while the cane is on the left, that's a complement to the cane, which can be life saving depending on the situation, depending on the side of the hole that I might step into. So having these very specific uses of an implant is really important when you’re developing it, because, you know, the target population is blind people.
00:17:26:01 - 00:17:30:08
Lucas Gil Nadolskis
It doesn't matter if it looks cool on paper, it needs to be useful.
00:17:30:10 - 00:17:39:00
Yamila El-Khayat
We'll finish our story with Michael and Lucas one last time, explaining the importance of having the NIH and NLM fund this research.
00:17:39:02 - 00:18:10:07
Michael Beyeler
I think this research would not have been possible without the support of the National Library of Medicine and the National Institutes of Health, because it allows us to go across disciplines, which typically is not doable with traditional funding sources. So it really allows us to integrate our knowledge, ranging from the computational neuroscience of how individual neurons react to electrical stimulation, all the way to the visual perception of what it looks like and really feels like to use these devices.
00:18:10:09 - 00:18:31:07
Michael Beyeler
I personally really believe in the value of open science. So the moment I started my lab, I knew it was important to make all our code, our papers, our simulator publicly available, because to me it is most about driving the field as a whole forward rather than just one particular technology.
00:18:31:09 - 00:19:16:00
Lucas Gil Nadolskis
It's my life's work, right? It's more than research. For a lot of people working on this, it's a cool little project, but for me it's deeply personal. I was fortunate enough to be in a very unique position of being a blind person who graduated in neuroscience and is interested in the field. Also, the aspects of my blindness are quite unique, because I was sighted until five years old, which means I have really good visual memory. And being able to put all of these skills to use on something that could potentially one day help all of the other blind people.
00:19:16:00 - 00:19:21:16
Lucas Gil Nadolskis
It's more than research, it’s more than anything. It's my, you know, the goal of my life.
00:19:21:18 - 00:19:43:04
Michael Beyeler
This is really just the beginning of the smart bionic eye. But even in a year, we have made a great amount of progress and have developed the theoretical foundations, that hopefully in the next two or three years will allow us to go out and validate these ideas with real patients.
00:19:43:06 - 00:20:06:23
Yamila El-Khayat
We've made a short video of this story and we've also made an audio described version of this video for the visually impaired. Check the show notes for these links to YouTube. This video is a great way to see the bionic eye technology, the virtual reality, and get a better sense of the challenges the bionic eye users face. The NLM, NIH and NNLM offer many funding opportunities like this one.
00:20:07:01 - 00:20:18:18
Yamila El-Khayat
Contact your local regional rep or search for grants that are available now at: nnlm.gov/funding to learn more. This is NNLM Discovery. Thank you for listening.