On Tech & Vision Podcast

Leveling Up Accessible Video Game Features: How New Technology is Making Gaming More Immersive and Inclusive for People with Vision Loss

For decades, people with vision loss had limited options when it came to accessing video games. Aside from screen magnification and text-to-voice tools, gamers who are blind or visually impaired didn’t have many ways to play their favorite titles. But in recent years, the same cutting-edge technology used to create games has also been used to make them more accessible to people with vision impairment. These advances include expanded visibility options, 3D audio, haptic feedback, and customizable controllers. Furthermore, 3D audio technologies being developed for live sports may soon make their way to online multiplayer video games. The implementation and improvement of these technologies mean that everyone will be able to play together, regardless of their visual acuity.

Podcast Transcription

Abdallah: Video games have been a part of my life since I was around the age of five, honestly. So yeah, before I got diagnosed with Stargardt, I played, I think on my sister’s Super Nintendo first.

Roberts: Elaine Abdallah is a lifelong gamer who began losing her vision when she was a teenager from Stargardt’s disease, a rare inherited condition that causes loss of central vision.

Abdallah: I started losing my vision around the age of 10. Before that, I had perfect vision. And then I started having issues seeing the projector at school or the blackboard, and it took about three years to get diagnosed. Stargardt’s is similar to macular degeneration, which a lot more people have heard of: you lose your central vision and you get these kind of blind spots. So, for me, it progressed up until about the time I was 18, 19, 20, around there, and it’s stayed the same for a while now. I’m 31, so thankfully it’s stayed pretty much the same.

I have about 20/200 to 20/400 visual acuity, so back then, gosh, when I was a teenager and I got diagnosed with Stargardt’s, if I was on a handheld device, I had to hold it really close to my face to see it. But honestly, the handheld devices like the PSP or Game Boy, I just couldn’t play. I think I might have been able to put some of those handheld devices under my CCTV magnifier. That was about it. And then as far as the consoles that I could play on the TV, I just sat really, really close. That was about all you were gonna get back then.

Roberts: Luckily, as gaming technology leveled up, more accessibility options could be developed.

Jones: I’m Kaitlyn Jones. I’m a program manager on the gaming accessibility team over at Xbox, and I get the pleasure of working with our hardware teams as well as our customer and developer documentation and training teams to be able to bring more accessibility knowledge and visibility out into the gaming industry.

It all started back when I was actually in high school. My dad, he’s a mechanical engineer by trade, but he had a lot of friends who were coming home from being deployed overseas in the military, and a lot of them were huge gamers. And they would come home a lot of times missing one or all four limbs, in some cases with traumatic brain injuries and things like that, and not being able to play video games was pretty devastating, especially with everything else going on.

So, he basically started hacking various controllers to create these really customized setups. So anyone who came to him would be able to play. So, whether it was using mouth-based joysticks or bite switches or mounting buttons by their heads or knees – wherever they had mobility.

Back then, especially from the hardware side of things, we were basically having to crack open three or four controllers and solder out wiring from them just so we could get that wireless connection from whatever larger buttons we were connecting to controllers to give input to the console itself. And then, when it came to game titles as well, there just definitely weren’t as many accessibility features and settings in the titles that we see today.

Accessibility was starting to get some pretty good momentum back then, but really in the past, I want to say 6 to 8 or 10 years, things have really taken off in a great way.

Roberts: I’m Doctor Cal Roberts and this is On Tech & Vision. Today’s big idea is inclusive design in gaming. The video game industry is a leader in innovation, incorporating some of the world’s most cutting-edge technology. As with smartphones, we should expect video games to be accessible to everyone who wants to play, no matter their ability. Fortunately, designers are opening more doors for gamers who are blind or visually impaired, and those doors lead to infinite possibilities. Let’s dive in.

Spinks: I guess if we take it from the first principle, gaming is predominantly a visual medium, so we’ve been aware of that for some time.

Roberts: This is Robin Spinks. He’s the head of inclusive design at the UK’s Royal National Institute of Blind People, and has a personal passion for gaming. Robin has vision loss from albinism, a genetic condition which often causes vision impairment among other symptoms.

Spinks: So we carried out some research in 2022 really to try and find out what blind and partially sighted people’s experience of gaming was, and also to identify the gaps where people didn’t have access where they wanted to have access. And we’ve long held the view that gaming is something that people have a right to have access to. This shouldn’t be a privilege. You should be able to access gaming in the same way that anyone else can if you’re blind or partially sighted. The motivation here is to remove barriers, to form partnerships, and to collaborate so that gaming becomes a much more accessible area of life for people going forward. Cause gaming’s fantastic, right? And why should you be stopped from enjoying it just because you’re blind or partially sighted?

Roberts: So what I hear you saying is that it’s not that you’re looking for games for people who are visually impaired, but that what you’re looking for is that all games are accessible to people who are visually impaired.

Spinks: Yeah, blind and partially sighted people are like anyone else, we want to play the same type of games that the sighted population are playing. We want to be able to play in a team or against an opponent who’s a family member or a friend or a neighbor. That’s what we’re really shooting for, and it’s ambitious, but we believe passionately that this is something that people care deeply about.

Abdallah: I just have always really loved video games and their stories and the escapism that comes with that. Role-playing games are my go-to. My favorite series is Final Fantasy. That’s been one of my favorites for a long, long time, as well as Kingdom Hearts, Grand Theft Auto, and The Last of Us, which is a TV show now too, which is really cool.

Roberts: The games Elaine mentions were all developed for current platforms, meaning they have incredible graphics and complex control schemes. They’ve come a long way since simple games like Pong or Tetris.

Spinks: Yeah, I mean we’ve seen games evolve as increasingly visual mediums, and the richness of that visual information has increased enormously with the power that’s available in consoles, but also on PC and on mobile. Alongside that, we’ve also seen significant advances in audio. And then of course haptics. Think about the importance of touch and vibration and really very fine, small movements. When you begin to think about haptics and audio, and then about different approaches to the design of games, you’ve got a lot more possibility.

Roberts: Kaitlyn Jones agrees.

Jones: Both of those things are really fantastic on a lot of different fronts, but especially when we’ve worked with our communities, folks in the blind and low vision community find those really helpful. Because if you’re in a situation where you’re playing a game, you have this 3D space around you that you’re navigating, but you can’t see who’s on the screen and you can’t see where those things are, it’s really difficult to navigate through a game. If I can’t see where the door is, how can I walk through it? So, that’s where a lot of times things like spatial audio will come in really handy.

Because, if players are able to spatially place where the sound of an enemy groaning is in space for them, they can then turn their character towards that area through the spatial audio cue and fire at that enemy. In Gears 5, one of our Xbox Game Studios titles, they actually have a feature called a fabricator ping and a navigation ping. It’s a spatial-audio-based feature where the entry and exit points and other key aspects of the game have this spatial audio ping.

So, if you’re looking for the box of loot, that is the goal for that mission in the game, the box of loot is constantly pinging, and as your character is moving around the space, you can use the spatial audio of where that ping is coming from to get closer and closer to the box until you eventually find it.

And the same with doors in the game and getting out of rooms and things like that. Just having these rich spatial audio cues can give you that understanding of where you or your character is in space. Even though you can’t physically see, you don’t have to.
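The navigation-ping behavior Jones describes, a repeating sound whose stereo position and loudness steer you toward a target, comes down to a small geometry calculation each frame. Here is an illustrative sketch, not Gears 5's actual implementation; the function name and the distance-to-gain mapping are my own assumptions.

```python
import math

def ping_pan_and_gain(player_xy, player_facing_rad, target_xy):
    """Illustrative stereo placement for a navigation ping.

    Returns (pan, gain): pan in [-1, 1] from the angle of the target
    relative to where the player faces, and gain in (0, 1] that rises
    as the player closes in on the target.
    """
    dx = target_xy[0] - player_xy[0]
    dy = target_xy[1] - player_xy[1]
    # Angle to target relative to facing direction, wrapped to [-pi, pi].
    rel = math.atan2(dy, dx) - player_facing_rad
    rel = math.atan2(math.sin(rel), math.cos(rel))
    # sin() maps ahead (and directly behind) to 0 and either side to +/-1;
    # real engines disambiguate front/back with filtering, not just panning.
    pan = math.sin(rel)
    dist = math.hypot(dx, dy)
    gain = 1.0 / (1.0 + dist / 10.0)  # louder as you approach
    return pan, gain
```

A game loop would re-evaluate this every frame as the player moves, so the ping appears to "pull" toward the loot box or door.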

Roberts: Robin thinks spatial audio technology will unlock incredible new experiences for gamers who are blind or visually impaired.

Spinks: One of the areas we’re really interested in is head tracking and audio tracking. So, imagine being able to track exactly where audio’s coming from in a game, and being able to convey that to the user. That’s an opportunity, I think, because if we can work on ways to make that more accurate, more precise, particularly if we can combine it with haptic feedback.

So, you might imagine, for example, using a wearable that’s on your wrist, or maybe even using a trackpad, and being able to communicate some visual information through haptics that move across the surface but which also track the audio. Then I think we get to a really interesting place.

Roberts: And that technology on the audio front is being developed today by AKQA, a design and innovation agency. AKQA Executive Director of Innovation Tim Devine and his team created Action Audio, which turns spatial data into information-rich sound for live sports, allowing people who are blind or visually impaired to follow the action in real time.

Devine: Yeah, so we saw an opportunity to fill an information gap in how people who are blind or have low vision experience broadcast sport at the moment. We use Hawk-Eye in the case of tennis. The Hawk-Eye content is very high resolution, down to millimeters. Hawk-Eye is basically multiple cameras around the stadium that capture a 3D model of things as they’re happening, and it’s being used to see if anything is in or out. And so we can use that data to know where the ball is in 3D space, and from that we can also track the player and where the ball is in relation to the player.

So, when something happens on the court, it’s captured quite fast with Hawk-Eye, within hundreds of milliseconds, and we’re able to take that data relatively quickly, generate the audio, and then send it to the broadcast before it’s received in the real world.

Roberts: One of the key features of Action Audio is the fact that it’s 3D audio, as opposed to words, which tend to be 2D audio. Help me with that.

Devine: Like our eyes, we have two ears, so we can perceive things in space. So the idea of 3D audio is that there are some headphones now that you can wear that have a couple of key features. One of them is passthrough mode: being able to hear what’s outside and around you by using the noise-cancellation microphones in reverse, so you can still hear what’s happening around you even though you’ve got audio coming into your ears.

The second feature is what’s known as head-related transfer functions. So when you move your head left or right and there’s a sound in your environment, that sound stays where it is even though your head moves. So, what spatialized audio is starting to do is keep things persistent in space, and we can envision a time where you’re able to sit anywhere in the stadium and hear sound as if it’s happening right there in that space. Even when you move your head, it will still be in the place it was. So 3D audio is about creating a sense of presence in physical space.
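The head-tracking behavior Devine describes, where a spatialized sound stays anchored in the world as your head turns, reduces at its simplest to subtracting the head's rotation from the sound's fixed world bearing on every frame. A minimal sketch, assuming rotation only in the horizontal plane; a real HRTF renderer would then filter the audio for this relative angle rather than just return it.

```python
import math

def apparent_bearing(sound_bearing_rad, head_yaw_rad):
    """World-anchored audio: the direction a sound appears to come from,
    relative to the listener's ears, is its fixed world bearing minus the
    current head yaw. Returns the angle wrapped to [-pi, pi]."""
    rel = sound_bearing_rad - head_yaw_rad
    return math.atan2(math.sin(rel), math.cos(rel))
```

For example, a sound straight ahead (bearing 0) heard while the head is turned 90 degrees appears 90 degrees off to the side, and it keeps shifting as the head keeps turning, so the source seems fixed in the room rather than glued to the headphones.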

Roberts: The concept of action audio was born from the idea of sensory substitution, which we also discussed in the 7th episode of this podcast, Training the Brain: Sensory Substitution.

Devine: A long time ago, before Action Audio really became a thing, we were talking to a professor at Northwestern University called Moran Cerf, and he was friends with David Eagleman. David Eagleman is known for his company, Neosensory. And he’s talked a lot about the idea of sensory substitution. And that got us really excited.

Sensory substitution is about saying the brain is just receiving signal. It’s not receiving vision or audio the way your computer receives audio and vision; it’s just receiving signal. And the reason the brain is so amazing is that it translates that into experience. It’s a beautiful sentiment, and I think that’s the idea behind 3D audio making things more immersive.

Action Audio debuted to the public at the 2022 Australian Open, allowing tennis fans who are blind or visually impaired to follow the action like never before. Here’s a breakdown of the different sounds that accompany the live feed. When you hear the sound of a sleigh bell, that’s someone hitting the ball or the ball bouncing. A high or low pitch beep represents a forehand or backhand, respectively. The repeated tapping sounds tell you how close the ball is to hitting the line; the more taps, the closer it is. Now let yourself be transported to Melbourne, Australia, to the women’s singles final between Ashleigh Barty and Danielle Collins at the 2022 Australian Open.

Imagine you’re sitting in the crowd, taking the match in with them. Collins is about to serve.

It’s an absolute game changer for tennis fans who are blind or visually impaired, but you don’t have to take it from me. Here are some first-hand reactions to their experience with Action Audio.

Fantastic. Forehand. Backhand. Backhand. Backhand. Backhand. Forehand. What a rally.

I don’t know why, but I can suddenly see the ball. I can actually hear everything, and I can actually see them all for some reason. I can see the ball now better than I could see the ball before.

I wouldn’t watch it any other way. You can just enjoy the game so much better.

I don’t have to ask my dad, like, “Can you commentate? I can’t see where the ball’s going.”
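The cue scheme described above (a sleigh bell on contact and bounce, a high or low beep for forehand or backhand, and taps that multiply as the ball lands nearer the line) amounts to a simple event-to-sound mapping. The sketch below is illustrative only; the cue names and distance thresholds are my own guesses, not AKQA's actual values.

```python
def cues_for_shot(stroke, distance_to_line_m):
    """Map one tracked tennis event to Action Audio-style cues (illustrative).

    stroke: "forehand" or "backhand"
    distance_to_line_m: how far the bounce landed from the nearest line
    Returns an ordered list of sound-cue names to play.
    """
    cues = ["sleigh_bell"]  # every hit or bounce rings the bell
    cues.append("high_beep" if stroke == "forehand" else "low_beep")
    # The closer to the line, the more taps (thresholds are assumptions):
    if distance_to_line_m < 0.10:
        taps = 3
    elif distance_to_line_m < 0.30:
        taps = 2
    elif distance_to_line_m < 1.0:
        taps = 1
    else:
        taps = 0
    cues.extend(["tap"] * taps)
    return cues
```

Under this mapping, a forehand landing 5 cm from the line would yield a bell, a high beep, and three taps, while a backhand landing mid-court yields just a bell and a low beep.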

Roberts: For Tim Devine and his team, a lot of thought went into every aspect of this immersive technology.

Devine: The idea is to effectively augment the existing audio experience, because there’s lots of information in there.

Roberts: Fascinating. How did you go about determining which sounds are most effective?

Devine: Action Audio is a process of iteration. We went through a long process of just exploring sound as information, in lots of different forms and lots of different, I guess, densities. So language is obviously high-bandwidth information, and then we have music, which is an emotive kind of emotional abstraction of sound which plays on memory, recollection, and experience over time. And then there are elevator sounds, like a pitch that goes “ding ding” to say the elevator is going up or “ding ding” to say the elevator is going down.

So those simple cues are very informational. Or at a crosswalk or traffic lights, when you’re walking across the road, you hear the click click. They’re super simple tones. Or a horn. Sound is a great indicator of activity and action. So we went through a whole bunch of thinking and ideas around what we wanted it to be, and we came up with four principles that we felt could be universally applied to all sports.

First is social, and social is a massive part of sports, so we wanted to make sure it was listenable by people who are not blind or low vision. And second, we wanted to keep it familiar. We realized that there are lots of blind sports; blind tennis, for example, uses a ball with bells in it.

So we thought that’s a great sound cue. There’s a high correlation between people who play blind tennis and people experiencing tennis broadcasts or tennis matches. So let’s use existing sound cues, let’s not try and reinvent the wheel, so to speak.

The third principle was around tension. We had to break down what the essence of tennis was. We needed to think about what makes tennis interesting, what makes sport interesting, and then what we could pull out and highlight using the sound cues. And we realized that high-performance athletes and sports teams are always on the edge of things. And in tennis, there’s often an ability for one player to move the other player around by putting the ball around the edges of the court. That was an opportunity for us to explore the tension that is created through this high performance.

So, in the case of tennis, we give a sound cue as to how close the ball is to the edges of the play space, the edge of the court. That’s what gives you that excitement and that experience beyond the score and beyond the macro game aspects. And so that was the third point. The last design criterion was acknowledging that our brain is very capable of hearing lots of things, receiving lots of information, and filtering it out.

So, when we’re in a cafe, we might be able to listen to the friend that we’re talking to, but we could also tune into the barista and hear what the barista is talking about and tune our friend out. And this isn’t the same for everyone; there are people who suffer from an inability to filter out sounds. But for the most part, most people can filter out sounds so that they can focus on what they choose. So, that allows us to put a little bit more information there for people to move around in.

So, when you’re watching a game of tennis, when a sighted person is watching a game of tennis, you’re not just watching the ball all the time. You might watch the crowd, you might watch a specific player, you might move around. So, we wanted that experience as well.

Roberts: And this incredible technology could soon be implemented in gaming and beyond.

Devine: We’re seeking to have a universal experience, in the sense that video games, computer games, especially in the case of NBA 2K or other basketball games, are a phenomenal platform for experience, and they’re a great testing ground for us as well. The fidelity of those games and the longevity of those games mean that they’re very immersive.

Our ambition is that Action Audio contributes to those experiences and is effectively a universal experience. So if you were to play NBA 2K with Action Audio turned on, you would go to the arena with Action Audio, or you could experience a broadcast with it. You’d have Action Audio across all those experiences. There’s an opportunity there to make a universal language, because the reality with Action Audio is that there is an onboarding and a learning process. It’s new, and so that’s probably something that we want to evolve and normalize across all the experiences of a sport.

Roberts: I asked Robin Spinks about this idea of building an immersive, universal experience in gaming. To me, it sounds similar to the history of motion pictures, which were initially just visual; then they added talkies, and then they added more three-dimensional elements, where you sit in the theater and you can feel your seat shaking and you get more of a surround sound feeling, which gets you much more involved in the actual movie. So that could similarly be done, say, with gaming.

Spinks: Absolutely. Think about the experience you can have in VR. For example, a roller coaster, I think that was probably one of the first experiences I had in VR, where I’m actually on a seat that conveys that moment where you’re about to go down this huge descent. We’ve tried to communicate that through haptics and through the movement of a seat, for example, and combine it with the visuals that would actually accompany that experience.

So, thinking about how you can leverage that kind of capability, what we want to know is: how can we do that for people who are blind or partially sighted? How can we communicate some of the things that are intensely visual in ways that are haptic, or audio, or an immersive experience that actually allows people to take part and enjoy it, whatever their level of vision?

Roberts: But of course, what you’re saying is that these innovations won’t just benefit people who are visually impaired. Everyone will enjoy this greater involvement in the game.

Spinks: Absolutely. That’s one of the reasons why I’m so passionate about inclusive design, because I think what you’ve just said applies to so many walks of life. If you make an improvement for people with sight loss, you actually make the experience better for everybody.

You know, I can remember when some of the smartphone platforms started to introduce accessibility, and we were involved right at the outset of that. And incrementally we saw that functionality being baked into the platform. And of course today, quite rightly, that’s what consumers expect. When a new iPhone is launched, people expect it to be accessible, because it’s been accessible for over a decade now.

We want the same thing to be true of gaming and game consoles. Look at the power and the capability that’s packed into a games console, and think about that in relation to a smartphone: if smartphones can become accessible devices and have toolkits and frameworks that enable developers to make accessible titles, let’s do the same in gaming. So yes, describing what’s happening in the game is fantastic. Making sure that there’s the opportunity for you to turn on bold text, to make the font bigger in menus, brilliant. All of that stuff helps, but that’s not all that needs to happen.

Hopefully you set out on a path of seeking an ever better experience, just like that Japanese principle of continuous improvement, kaizen, where you’re constantly looking to do things better. That’s what we want to see in the gaming world when it comes to accessibility. It’s early days, but there are many positive signs, and we’re really keen to lend our voice to the community. Every now and again people quote the statistic that the gaming industry globally is bigger than Hollywood and Bollywood and the music industry combined. That is one big marketplace. Let’s make sure those people are included.

Roberts: Kaitlyn Jones and her team at Xbox are working hard to do just that.

Jones: We’ve always really prioritized accessibility along the way. But in terms of the journey, even from where we started when I first joined the team a few years ago versus now, the bar honestly just keeps getting higher and higher, so how we prioritize accessibility has kind of changed in that way. Years ago we might have been really, really proud and excited about a basic feature like having screen narration in one of our games, whereas now that’s standard low-hanging fruit, as the bar has been raised and raised.

So, now we know we’re going to have that in, but we need to seek more. So, I think the priority and understanding and emphasis has always been there, but I think it’s really cool how, as time goes on, our idea of what is accessible just gets bigger and more comprehensive than it was back then.

I would just love to see a world and an industry where all games readily have all of those key accessibility support features or settings, whatever you want to call them, going forward. There isn’t a case where a game ships without having basic menu narration. All games will have integrated haptic feedback and audio tones and spatial audio and things like that, because not only is it an accessibility thing that helps millions of our players, it’s also just great from an immersion standpoint when you think about the entire larger gaming population. We love when we can really feel like we’re in the game, and things are rumbling, and our controller is being really interactive and everything like that. It just makes for a better experience for everyone.

Roberts: For Elaine Abdallah, accessible gaming has come a long way since those early days of nothing more than screen magnification.

Abdallah: To me, it’s just crazy how much things have changed. I mean, I’ve been gaming for 20 years or more, so it’s taken a while, but it’s really going in a great direction and it’s getting better. The Last of Us Part II especially has, I think, over 60 accessibility options, which is crazy. I mean, the developers worked with disabled gamers to create this, which is so awesome, and for me it has aim assist, so instead of having to use the controller to aim and just hope that you’ve hit a zombie, it really helps the user aim and actually get the shot.

And then there’s also audio cues and navigation assistance, so it’ll let you know with an audio cue if you’re going the right direction or if there’s a certain way that you need to go. And then it also has menu narration, so it’ll read all of the menu options and everything for you. As well, on the PS5 there is – I forget what it’s called, but there’s this little pad in the middle of the controller, and if you just swipe up with your finger, it’ll tell you the position you’re in. So if you’re crouched, if you’re standing, if you’re on a horse. And then it will also tell you your health, like if it’s 100% or 50%.

Those are the features that I find most helpful in that game since I’m just partially sighted, but I know a few people who are totally blind who have also been able to play it pretty well. There are some games now that will even let you dim the background lighting. They’ll create outlines of characters and obstacles so that you can see them more easily. You can disable lighting, you can disable screen shake if that’s something that bothers you. There are just so many things, as well as difficulty settings and adjusting the game speed. So, I’m just happy to see that the disabled community is finally getting the assistance it needs. It’s taken a while and it’s still going to take a while, but we’re going in the right direction.

Roberts: The developments in accessible gaming that have taken place in the last decade have been crucial in removing barriers for gamers who are blind or visually impaired. Like the concept of kaizen, there have been continual improvements, and the surface has only been scratched. 3D audio and haptics, paired with something like VR or the metaverse, will make for immersive experiences beyond our wildest dreams, and we won’t have to rely on just our vision to enjoy them. There will be countless worlds for us to explore, and we can all do it together.

Did this episode spark ideas for you? Let us know at podcasts@lighthouseguild.org. And if you liked this episode please subscribe, rate and review us on Apple Podcasts or wherever you get your podcasts.

I’m Dr. Cal Roberts. On Tech & Vision is produced by Lighthouse Guild. For more information visit www.lighthouseguild.org. On Tech & Vision with Dr. Cal Roberts is produced at Lighthouse Guild by my colleagues Jaine Schmidt and Annemarie O’Hearn. My thanks to Podfly for their production support.

Join our Mission

Lighthouse Guild is dedicated to providing exceptional services that inspire people who are visually impaired to attain their goals.