What’s new in Accessibility in 2019 – enabling technologies and ATs
Accessibility is all about making digital solutions work for people with impairments, whether those impairments are permanent, fluctuating, progressive, temporary or situational. It’s not just for people who have a permanent or fluctuating impairment due to a disability. We all can experience impairments: whether they are the progressive impairments of ageing, a temporary impairment due to spraining your wrist, or a situational impairment like using speech recognition to type a text because you’re cradling a baby.
This has been a key part of the accessibility business case that experts in my team and across our industry have been talking about for years. If you need a refresher, see: my Past, Present and Future of Accessibility talk from 2016.
What isn’t talked about so much are the multiple layers of technology and practice that are needed to deliver an accessible digital experience to people with impairments.
This ecosystem view of assistive technology and accessibility is critical to support the creation of solutions that actually work for real users, rather than well-meaning misfires – see my discussion of the need for local fit, infrastructure and training in my blog about Accessibility in Developing Countries.
Thankfully each year more and more organisations wake up to accessibility, in every layer of the ecosystem. So I wanted to flag up some examples of breakthroughs and innovations in each layer, to show how the field is advancing, and where it could go in 2019.
In my first blog of 2019 I looked at advances in:
- accessibility requirements in legislation and regulation, and
- the attitudes and commitment of organisations to accessibility and inclusive design
In this blog I’ll look at the underlying technologies that may enable people to deliver on those commitments and legal requirements in 2019:
Technologies and devices – general and specialist
There has been an explosion in the number of specialist devices, specifically for people with impairments, that are being created, mostly by small start-ups. Matt Ater’s great blog on some of the new devices he saw at CES 2019 mentions many of these, from the perspective of someone with a vision impairment. Devices like the Braille display mouse Bonocle, for example, have been specifically created for blind people, so may have a big impact on a small market.
Specialist devices in 2019 may also not look “techie”. Fast Company’s Innovation by Design awards have found striking examples of ‘technology that doesn’t look like technology’, and show that some of the best work coming out of design schools is focused on accessibility.
Hearables, IoT and 5G
It’s not just specialist devices that have an impact. More general devices like the in-ear foreign language translator WT2 Plus, which enables real-time conversation across languages, have not been made for people with impairments, but could massively improve their lives.
This is just one example of the potential of “hearables” – ear-mounted devices that supply aural content or information to the wearer that is either generated by computations and data on the device itself or by an attached app (like Apple’s AirPods). Juniper Research estimate a growing market:
“by 2022, 417 million hearables are projected to be in use, growing 50% annually until then”
While 5G may be over-marketed and under-delivered in 2019, it promises not just increased wireless data speeds but also the ability for more connected devices to communicate with each other seamlessly (estimated to grow at 27% per year to 4 billion connected devices in 2024), creating more of the ‘Internet of Things’ world that has been promised to us for years.
As Matt also mentions, “connected devices” attached to ubiquitous voice assistants driven by saying “Hey Google,” “Alexa,” or “Hey Siri” also have the potential to change the way many people with disabilities interact with the web. It is slowly becoming more and more important to make sure your information and services are usable through these voice technologies (via Alexa Skills, for example) as well as ensuring your website works well with screen readers or speech recognition on computers and smartphones.
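To give a flavour of what making a service “usable through these voice technologies” involves, here is a minimal sketch of an Alexa Skill request handler. The `OpeningHoursIntent` name and the spoken responses are invented for the example; a production Skill would typically be built with the Alexa Skills Kit SDK rather than raw request handling, but the underlying request/response shape is as shown:

```python
def handle_alexa_request(event):
    """Route an incoming Alexa Skill request and return a spoken response."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # User opened the Skill without asking for anything specific yet
        speech = "Welcome to the store. What would you like to know?"
    elif (request["type"] == "IntentRequest"
          and request["intent"]["name"] == "OpeningHoursIntent"):
        speech = "We are open from nine to five, Monday to Saturday."
    else:
        speech = "Sorry, I didn't catch that. Please try again."

    # Standard Alexa response envelope: plain-text speech, end the session
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The key point for accessibility is that the same underlying service logic can sit behind both this voice interface and a screen-reader-friendly website.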
We’re still in the infancy of this potential change. New directions are being explored, like H&M’s Home Gift Guide, which uses Google Assistant to provide gifts and payment capabilities through voice commands:
“This fast-growing technology is opening up new possibilities for the retail industry to inspire and interact with customers,” stated Anders Sjöblom, managing director of H&M Home. “By giving a voice to our brand, we will enhance the customer experience and be able to speak with our customers whenever they want.”
At the same time, the figures show voice-enabled commerce’s future is uncertain as it is not yet fulfilling its potential:
“While some analysts have predicted that voice commerce will eventually be responsible for tens of billions of dollars of retail sales, last August, leaked internal documents from Amazon indicated that just 2% of the people who own Amazon Alexa-powered devices have made purchases using voice. What’s more, 90% of that tiny minority failed to make a second voice-based purchase.”
Google is working to improve its voice assistant with machine learning advances in the use of context in natural language conversations, so that more people are encouraged to go beyond the games that are currently the most popular Alexa Skills.
It’s going to take a while for people to trust voice assistants for making purchasing decisions rather than accessing entertainment, as I believe people still rely on screens to provide enough information to ensure they don’t order the wrong thing when buying online. Years of usability studies on online checkouts are clear about the contextual support and feedback people need, so making people “okay without a screen” to the extent they are happy to purchase is a challenge.
I’m hoping the challenge gives product designers a better insight into how people who are blind may already feel about online shopping. It could be that the accessible solutions made for blind users now are the key to this new market segment flourishing for everyone in the future.
Gesture recognition, Virtual Reality and Augmented Reality
This principle of “do it for people with disabilities, and it might end up helping everyone” is also at play in Google’s screen-free tactile input Project Soli:
“the radar-based technology [is] a way to “capture motion in a three-dimensional space using a radar beam to enable touchless control of device functions or features, which can benefit users with mobility, speech and tactile impairments”… The technology can [also] help Google interpret human intent, according to Patrick Amihood, its software lead”
We at Hassell Inclusion (and our innovation partners Reflex Arc and Gamelab UK) have created rehabilitation games for people who have had strokes that have been enabled by numerous generations of screen-free gesture devices – from Kinect, to Leap Motion, to HTC Vive. So we can see that there are real possibilities that can come from this.
Project Soli might also be a great addition to Virtual Reality (VR) and Augmented Reality (AR), which might also be big in 2019. While 65% of consumers anticipate VR will become part of daily life, the same article notes that most people don’t have a traditional VR headset, nor the computing processing power needed to support them, both of which can be costly. AR is also gaining momentum, with headsets like the Magic Leap One mixed reality headset allowing you to add live news to your field of vision, and glasses like the Vuzix Blade smart glasses doing the same with weather forecasts.
In-view news and weather might not change your world, but the technologies behind them could. The VR vendors in Matt Ater’s article didn’t know the answer to their own question: “Why would a blind person use VR?” However, we do, from our work in mobility audio-games for children who are blind. VR headsets aren’t just about immersive visuals, they can also be used to capture head motion and provide feedback via 3D audio. We’ve used 3D audio in elearning for years, and we know how VR can help add extra dimensions to rehabilitation too.
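To give a flavour of how head motion can drive audio feedback, here is a minimal sketch of equal-power stereo panning driven by the bearing of a sound source relative to the listener’s head yaw. Real 3D audio engines use HRTFs and full spatialisation rather than simple panning, so treat this only as an illustration of the principle of mapping head orientation to per-ear output:

```python
import math

def stereo_gains(bearing_deg):
    """Map a source bearing relative to head direction to (left, right) gains.

    bearing_deg: -90 means the source is hard left of where the head faces,
    0 means straight ahead, +90 means hard right.
    Uses equal-power panning so overall loudness stays constant as you turn.
    """
    pan = max(-90.0, min(90.0, bearing_deg)) / 90.0   # normalise to -1..1
    angle = (pan + 1.0) * math.pi / 4.0               # map to 0..pi/2
    return math.cos(angle), math.sin(angle)           # (left, right)
```

As a player turns their head towards a target sound, the bearing approaches zero and the left and right gains equalise, which is exactly the kind of cue our mobility audio-games rely on.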
Wearables and eHealth
That leads us to the final technology which could bring more opportunities in 2019 – the application of wearables to eHealth. Jamie Knight’s great presentations at CSUN have revealed how the Apple Watch can bring benefits to people on the autistic spectrum. Now the Apple Watch Series 4’s health insight capabilities have the potential to help more people who are older (fall detection has already ‘saved its first life’). And they could possibly help people with disabilities with associated health conditions to manage those conditions more easily.
Accessible features in operating systems (OSes)
The base level of accessibility functionality that comes bundled with an operating system used to be a token gesture, just indicating what might be possible using ATs that you buy and install (see Microsoft Narrator on Windows). However, the smartphone changed that years ago. You don’t have to install assistive technologies into iOS or Android: they are just part of the OS, from screen readers to switch input.
That’s why the whole accessibility industry watches when Apple talk about accessibility at their yearly WWDC Developer Conference and Google at their Android Dev Summit. What happens here tends to define what accessibility technologies people can code for, and build accessible services on.
Apple manage to drive adoption of new OSes faster than Android (iOS 12 has been adopted by 75% of devices – faster than iOS 11) so the new features of each version of iOS are adopted quickly. However, the larger number of Android devices worldwide, especially in developing countries, means that innovations from Google also have great impact.
In 2018, the feature with the most accessibility potential was Siri Shortcuts, but people are still getting their heads round it. In 2019, I’m hoping lots of people may better understand the needs of people with MS or vision impairment if ‘Dark Mode’ in macOS also becomes available on iOS (see Wired’s great Give Yourself To The Dark (Mode) Side article and the Linson Productions tweet below asking for it on iOS):
— Linson Productions (@linson_prollc) January 24, 2019
New Assistive Technologies (ATs) and features in ATs
Finally, what’s new in Assistive Technologies for 2019?
Many new ATs are coming from the world of AI. We blogged last year about IBM’s Content Clarifier, which uses AI to help make complex language simpler. Microsoft’s Seeing AI – a talking camera app for blind people – is part of its $115m AI for Good programme. And Google are planning to launch their Lookout app later this year to help narrate and guide visually impaired people around specific objects (for more, check out the BBC article The blind woman developing tech for the good of others).
Apple also seem to be getting serious about AI. So with their track record of developing technologies that help people with disabilities, expect more from them in 2019 too.
In the field of gaming, Microsoft’s Xbox Adaptive Controller has brought breakthroughs in accessibility for many gamers with disabilities. While the controller can be hacked for use on other consoles too, we’re hoping the likes of Sony and Nintendo will follow Microsoft’s lead.
Back in the world of computers, most ATs are very well established. This is what we’d like to see from them, to help people with impairments and developers in 2019:
- For developers: The NVDA and JAWS screen readers for Windows will do their usual yearly updates. We’d like to see them provide full support for some newer native HTML elements – like the native <details> and <summary> elements for accordions, and a number of other new elements that we’ll be blogging about this year – so it’s safe for developers to use the new native elements to make their code simpler and more accessible at the same time
- For users: Speaking recently with people who have difficulty using their hands and prefer to control their computers through speech, the recent decision by Dragon developer Nuance to drop support for its Dragon Professional for Mac is frustrating. We’d love to see Mac users be able to benefit from all the functionality of the Windows version of Dragon going forwards. The clock is ticking for anyone who relies on Dragon for Mac to either find a new app or migrate to Windows. Let’s hope Apple steps up and offers better voice controls and dictation.
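To show how little markup the native accordion pattern needs once ATs support it, here is a minimal sketch that generates `<details>`/`<summary>` accordions from a list of sections (the section headings and body text are invented for the example; a real page would include appropriate styling and, for older browsers, a tested polyfill):

```python
def render_accordion(sections):
    """Render (heading, body) pairs as native HTML accordions.

    Uses <details>/<summary>, which give keyboard support and
    expand/collapse semantics for free - no ARIA or JavaScript needed.
    """
    parts = []
    for heading, body in sections:
        parts.append(
            "<details>\n"
            f"  <summary>{heading}</summary>\n"
            f"  <p>{body}</p>\n"
            "</details>"
        )
    return "\n".join(parts)

html = render_accordion([
    ("Shipping", "Orders ship within 2 to 3 working days."),
    ("Returns", "Free returns within 30 days."),
])
print(html)
```

Compare this with the custom button-plus-ARIA-plus-JavaScript accordions most sites build today: once screen readers expose these elements fully, the native version is both simpler and harder to get wrong.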
How can we help you with your innovation?
It’s great to see so many innovations being created that will benefit people with impairments in 2019. At Hassell Inclusion we have a long history of innovation in all of the layers of the accessibility ecosystem. So, if your organisation wants to create an enabling technology, or to create services for people with impairments on top of a new enabling technology, then please get in touch. We’d love to help.
In my third 2019 blog, coming soon, I’ll put the final piece in place, discussing advances in the things that help digital teams to create products which get the best from those underlying technologies and devices to deliver digital products that are accessible to everyone:
- disabled people’s awareness, access/availability and training in using new underlying technologies and assistive technologies
- standards for developing software (websites and apps) to work with those ATs
- the embedding of accessibility into technology enablers like content management systems, frameworks like React and code libraries
- the embedding of accessibility into design enablers like design patterns and “design thinking”
- standards for embedding accessibility into team processes
What do you think?
We hope these insights are useful to you, and would love to hear your experiences with these enabling technologies and ATs. Please share your comments below.
If this blog has been useful, you might like to sign-up for the Hassell Inclusion newsletter to get more insights like this in your email every month.