On making tech accessible for everyone
Sarah Herrlinger is the Senior Director of Global Accessibility Policy & Initiatives at Apple. She has been with the company for 21 years, focusing on accessibility for 18 of those. Here, she shares her insights on the past and future of accessibility features designed for individuals with disabilities, and on making technology available for everyone.
— It is a basic human right to treat all of our customers with dignity and respect, Herrlinger explains. She continues:
— We consider accessibility to be one of our six core corporate values and something that we have been building into our products for decades. Technology should be made for everyone, so for years we have been building accessibility features to support vision, hearing, physical motor, speech, and cognitive needs.
— The key element for us is adhering to the disability mantra of ’nothing about us without us.’ That starts with employing individuals with disabilities, not just on our accessibility team but throughout the company. Working with people who actually use our assistive technology lets us think about unique ways that the devices we make can be powerful and instrumental in people’s lives.
— Beyond that, we work with communities all over the globe to look at what the next thing is for our products, and where we go from here. Is there a community that, for whatever reason, may not be able to use our technology? And how do we solve that problem?
In terms of history, many technologies developed for health reasons have not then been widely adopted, and vice versa. Have there been instances where something you developed as an accessibility feature was adopted into wider usage, or the other way around?
— There are certainly cases where, when our team starts working on something, we find use cases that expand out to the larger population. One of the things that we often say is that when you design for the margins, you make better products for the masses.
— The feature Voice Control, for example, is something that we built in to support individuals for whom voice might be the only way they can use the technology, perhaps because of a significant physical motor limitation. The idea behind it was to enable navigation on the device and then do things like dictate into a text field, or do everything that you would do through any other modality, using just your voice.
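Voice Control itself is a system feature, but third-party apps can cooperate with it through Apple's public accessibility APIs. As a hedged illustration (the screen and control names below are invented for the example), a UIKit control can advertise shorter spoken names via accessibilityUserInputLabels so a Voice Control user can activate it by voice:

```swift
import UIKit

// Hypothetical checkout screen used only to illustrate the API.
final class CheckoutViewController: UIViewController {
    private let buyButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        buyButton.setTitle("Buy now with one tap", for: .normal)

        // Alternative names a Voice Control user can say, e.g. "Tap Buy".
        buyButton.accessibilityUserInputLabels = ["Buy", "Buy now", "Purchase"]

        buyButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(buyButton)
        NSLayoutConstraint.activate([
            buyButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            buyButton.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }
}
```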
— With Apple Vision Pro, voice becomes a means by which you can drive the device. It’s a whole new computing platform where voice can be a powerful tool for anyone. It’s something that we started with on other devices and, as with many of the accessibility features, we’ve ported it over to Apple Vision Pro, where it will be used much more ubiquitously by others.
What can you say about the developments in AI and large language models that play into this?
— Machine learning and other AI is something we’ve been using for a long time within accessibility; it’s not new for our team to be reliant upon AI. What we try to do is use AI and machine learning to solve unique problems within the communities. Think about a feature like People Detection or Door Detection for the blind community: being able to use the LiDAR scanner on the Pro models of iPhone and iPad to social distance during COVID and understand where someone is in proximity to you.
— Or being dropped off by a cab on a street you have never been on before to go have brunch with friends, and using the device to read out all of the establishment names, or to understand how far away a door is from you and whether it’s open or closed. Those are all complex algorithms, which is AI, helping the blind community navigate more independently and with more confidence in where they are in the world.
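Apple has not published the implementation behind People Detection, but the public Vision framework gives a rough sense of the kind of on-device inference involved. A minimal sketch (the function name is ours) that locates people in a single camera frame; the shipping feature additionally fuses results like these with LiDAR depth to report distance:

```swift
import Foundation
import Vision
import CoreGraphics

// Not Apple's People Detection code: a sketch of on-device person
// localisation with the public Vision framework.
func detectPeople(in image: CGImage, completion: @escaping ([CGRect]) -> Void) {
    let request = VNDetectHumanRectanglesRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNDetectedObjectObservation] else {
            completion([])
            return
        }
        // Normalised (0...1) bounding boxes for each person found in the frame.
        completion(observations.map { $0.boundingBox })
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```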
— The same goes for Sound Recognition on the iPhone and the Apple Watch, which supports members of the deaf community by picking up audible alerts in the background, like a fire alarm or a baby crying, and delivering them as a visual alert on the watch or the phone. Again, these are very complex algorithms that have to understand what a baby crying sounds like versus a dog barking.
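The models behind the Sound Recognition setting are not public, but Apple's SoundAnalysis framework ships a built-in classifier that illustrates the same idea of labelling ambient audio entirely on the device. A hedged sketch (the observer class and function names are our own, and microphone permission is assumed to be granted):

```swift
import AVFoundation
import SoundAnalysis

// Receives classification results; in a real feature this might trigger a
// visual or haptic alert instead of printing.
final class SoundAlertObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) with confidence \(top.confidence)")
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Sound analysis failed: \(error)")
    }
}

// Streams microphone audio into Apple's built-in sound classifier.
// The caller keeps the returned analyzer, the engine, and the observer alive
// for as long as classification should run.
func startListening(engine: AVAudioEngine,
                    observer: SoundAlertObserver) throws -> SNAudioStreamAnalyzer {
    let format = engine.inputNode.outputFormat(forBus: 0)
    let analyzer = SNAudioStreamAnalyzer(format: format)

    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    try analyzer.add(request, withObserver: observer)

    engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    try engine.start()
    return analyzer
}
```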
— Machine learning also powers a newer feature, Personal Voice, where one can create a digital version of one’s own voice. It enables users to create a synthesized voice that sounds like them simply by reading aloud a selection of phrases presented in a random order. That data is collected and processed solely on the device, so it is all done very safely and securely.
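Herrlinger is describing the system feature, but Apple also exposes a public iOS 17 API so that, with the user's explicit permission, a third-party app can speak with a Personal Voice the user has already created in Settings. A minimal sketch (the function name is ours) under that assumption:

```swift
import AVFoundation

// Keep the synthesizer alive for as long as speech may be playing.
let synthesizer = AVSpeechSynthesizer()

func speakWithPersonalVoice(_ text: String) {
    // The user must explicitly authorise access to their Personal Voice.
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Pick the first installed voice carrying the Personal Voice trait, if any.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}
```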
There are different target audiences and different communities that you communicate with. Do you have a sense of how many there are and how many you interact with?
— Some statistics say that up to 20% of the world’s population has a disability of some type, and over the age of 65 that number increases dramatically, to almost 50% of the population. That can cover anything from sensory disabilities, vision, hearing, physical motor, and speech, to more hidden disabilities, like autism, ADHD, and dyslexia, and a range of cognitive disabilities, Herrlinger says.
— Everyone is unique, and a person may have more than one of these at the same time. We don’t tend to look at it from the perspective that there are more people in one community, so we need to pay more attention to that. For us, it’s about supporting everybody that we possibly can. I would say that even on a year-over-year basis, one of the goals for our team is to expand the audience of individuals who can use our technology.
— If we become aware of a community that may, for whatever reason, not be able to use one of our products, our goal is to figure out how to solve that problem. Along with that, we also try to work on new features for the communities we’re already supporting, so that we continue to surprise them.
How do you interact with them and how do you identify the problems to solve? Do you interact with the medical community or the healthcare community or have research projects that you do together?
— I would say it’s less about the medical community and more about the communities themselves. That comes to us through many different ways, and we want as many different ways for people to connect with us as possible. It starts with the hiring of individuals from the communities themselves throughout Apple, whether that be on the accessibility team or anywhere within our ranks. We want to hire the best people for the job of building these accessibility features into our products, so we get a lot of feedback and a lot of ideas from our employees.
— We also have a customer-facing email address which is accessible online. It has been available for almost two decades, and we get a lot of emails every day from customers who ask us questions, report bugs and provide feedback, so we’re constantly trying to have that dialogue. We also work with many external organisations from these communities around the globe to make sure that we’re not just hearing from one group of blind users in one area. We know that there may be unique ways that people are using our devices, or unique needs, in different parts of the globe.
You mentioned Apple Vision Pro. Do you see this device playing a role in some of these communities more than others?
— There are some early reviews from people who are saying it may be the most accessible device out there.
— We’ve been able to take features that people know from our other operating systems, port them over, and reimagine them for this new spatial computing environment. For example, take a feature like VoiceOver, which is available on all our products: we’re bringing that over as a tool for the blind community and reworking all of the gestures one would use with Vision Pro to specifically support that community. From the feedback we’re getting right now, it’s being incredibly well received; people can do everything that everyone else does on this device, but do it with VoiceOver enabled.
— There are so many features that we’ve brought over to support our communities, across vision, hearing, physical motor, and all kinds of cognitive needs.
You’ve been at Apple for a long time, and the company has certainly changed. Can you mention some key moments that have been important in your work concerning what we’re talking about here?
— We have just brought in Personal Voice with our latest operating systems, available on iPhone, iPad, and Mac. It is for those who are at risk of speech loss: when you think about someone with motor neuron disease, or ALS as we often refer to it in the US, one in three people diagnosed with MND will lose their ability to speak in their lifetime.
— Communication is at the heart of who we are as human beings. For someone so reliant on and so used to their voice, the idea that it may no longer be available to them means they may miss out on some of those incredibly personal moments: saying I love you to a spouse or loved one, or reading a book to a child or grandchild.
— To be able to create something that is so uniquely yours, and to do it the way we think about things, making it simple, easy, and available to you, but done in a way that keeps your privacy in mind. Developing this feature was just a great one for us.
— Beyond that, I think about AssistiveTouch for Apple Watch: being able to create a feature for people who are upper-body limb different, perhaps an amputee, so they can run the entire watch one-handed using the hardware underneath. It uses the gyroscope and the accelerometer to understand the muscle and tendon movements, and you can run it with just a clench or a pinch.
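Apple's clench and pinch detection is an on-device machine-learning model over the watch's sensors, and its details are not public. As a rough, hypothetical sketch, the public CoreMotion API exposes the raw accelerometer and gyroscope stream that such a gesture classifier would consume:

```swift
import CoreMotion

// Keep the motion manager alive while updates are running.
let motionManager = CMMotionManager()

func startMotionStream() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz sampling

    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }
        // Wrist acceleration (gravity removed) and rotation rate: the kind of
        // signal a clench/pinch classifier would be trained on.
        let a = motion.userAcceleration
        let r = motion.rotationRate
        print("accel: \(a.x), \(a.y), \(a.z)  gyro: \(r.x), \(r.y), \(r.z)")
    }
}
```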
— It’s practically sci-fi in some people’s minds and yet, for us, it’s just how we create ways for people to use our technology that are respectful and dignified for them. It’s wonderful.
Do you have any recommendations for children with autism and ADHD about functionalities in your technologies that they can use?
— There is a feature called Guided Access that was initially built to support individuals who are autistic. The idea behind it was to keep an individual in one app so that they can stay focused and on-task. One of the things that we found for many individuals who are on the autism spectrum is that there’s a thing called stimming. Being able to say ’No, I’m just going to keep you in this one app,’ whether it is a game or an educational app that you want them to stay focused and on-task in, is a great way to keep that structure, Herrlinger shares. She adds:
— One of the things that we focused on was transitions for individuals who are autistic. We found that sometimes, when a parent or a teacher had to take the device away, it might cause a lot of anxiety for a child who’s autistic. By being able to set time limits, it became a countdown: when you set the time for 30 minutes, at five minutes remaining it would say ’in five minutes, we’re going to turn this off’, which takes the bad-guy role away from the parent or the teacher. Once again, it is about looking at some of the unique needs of individuals who are autistic and being able to support children.
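Guided Access itself is configured by a parent or teacher in Settings rather than by an app, but apps can detect it through UIKit's public accessibility API and simplify their interface while a session is running. A small sketch (the function name and print statements are only illustrative):

```swift
import UIKit

func adaptToGuidedAccess() {
    // True while the device is locked to a single app by Guided Access.
    if UIAccessibility.isGuidedAccessEnabled {
        print("Guided Access is on: hide navigation that can't be used anyway")
    }

    // React when a Guided Access session starts or ends.
    NotificationCenter.default.addObserver(
        forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        print("Guided Access state changed: \(UIAccessibility.isGuidedAccessEnabled)")
    }
}
```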