At last week’s I/O 2019 developer conference, Google revealed the company’s latest updates and innovations. Among them are three AI projects designed to make current technology tools more accessible to people with disabilities.
Project Euphonia aims to improve computers’ ability to understand individuals with speech impairments. Live Relay helps people who are deaf or hard of hearing make, receive, and respond to voice calls. And Project Diva is a Google Assistant-based product that will give more people independence and autonomy.
In a recent Google blog post, Jen Devins, Head of Google Accessibility UX, discusses the importance of developing and improving the company’s products for everyone. “Focus on the user and all else will follow,” she writes, citing one of Google’s key values. “User research is core to success throughout a product’s life cycle, and fundamental to creating a product that works for as many people as possible, including people with disabilities.”
Here is more on how each of Google’s AI accessibility projects works.
Google AI Projects — Project Euphonia
While it is still in its early stages, Project Euphonia holds big promise to make voice interfaces work with a wider range of speech. Current voice recognition technology is of limited use to individuals with speech disorders. People with neurological conditions such as amyotrophic lateral sclerosis (ALS), multiple sclerosis, and Parkinson’s disease, or developmental disorders such as autism or cerebral palsy, often cannot rely on products like Google Assistant or Amazon’s Alexa. By collecting more voice data from people with impaired speech, however, Project Euphonia will allow Google to optimize its AI algorithms to better understand and transcribe impaired speech.
The company is also developing technology that detects non-speech sounds and gestures and maps them to the same responses a spoken command would trigger.
To build out its data library, Google is asking people with speech impairments to send in voice samples. Learn more at g.co/euphonia.
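Google has not published Project Euphonia’s models or training pipeline, but the general idea, adapting a speech recognizer to new voices by training on additional recordings, can be sketched. The example below is a hypothetical illustration in PyTorch: the tiny acoustic model, the CTC setup, and the synthetic batch standing in for donated samples are all stand-ins, not anything Google has described.

```python
# Hypothetical sketch: fine-tuning a speech model on recorded samples.
# This is NOT Google's code; Project Euphonia's models and training
# pipeline are not public. The model, features, and data are stand-ins.
import torch
import torch.nn as nn

# Stand-in acoustic model: maps 80-dim audio frames to character logits.
class TinyAcousticModel(nn.Module):
    def __init__(self, n_mels=80, n_chars=29):
        super().__init__()
        self.rnn = nn.GRU(n_mels, 128, batch_first=True)
        self.out = nn.Linear(128, n_chars)

    def forward(self, feats):          # feats: (batch, time, n_mels)
        hidden, _ = self.rnn(feats)
        return self.out(hidden)        # (batch, time, n_chars)

model = TinyAcousticModel()
ctc_loss = nn.CTCLoss(blank=0)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Placeholder batch standing in for donated voice samples:
# 4 clips of 100 frames each, with 20-character target transcripts.
feats = torch.randn(4, 100, 80)
targets = torch.randint(1, 29, (4, 20))
input_lengths = torch.full((4,), 100, dtype=torch.long)
target_lengths = torch.full((4,), 20, dtype=torch.long)

# One fine-tuning step: nudge the model toward the new speakers' speech.
optimizer.zero_grad()
log_probs = model(feats).log_softmax(dim=-1).transpose(0, 1)  # (time, batch, chars)
loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.3f}")
```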
Google AI Projects — Live Relay
Using speech recognition and text-to-speech technology, Live Relay aims to make voice calls accessible to the deaf and hard of hearing.
Here’s how it works: a hearing person calls someone with limited or no hearing. The phone listens to the caller’s words and converts the speech to text in real time for the deaf or hard-of-hearing person to read. That person then types a reply, which is read aloud to the caller.
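Google has not released Live Relay’s implementation, but the turn-by-turn flow described above can be sketched roughly. In the hypothetical example below, recognize_speech and speak_text are placeholders for on-device speech-to-text and text-to-speech engines, and relay_call simply wires the two directions of a call together.

```python
# Hypothetical sketch of the Live Relay loop described above.
# The real implementation is not public; recognize_speech() and
# speak_text() are placeholders for on-device speech-to-text and
# text-to-speech engines.

def recognize_speech(audio_chunk: bytes) -> str:
    """Placeholder: transcribe an audio chunk on-device."""
    return "<caller's words as text>"

def speak_text(text: str) -> bytes:
    """Placeholder: synthesize speech audio on-device."""
    return b"<synthesized audio>"

def relay_call(incoming_audio, typed_replies, show_caption, send_audio):
    """Relay one conversation turn at a time.

    incoming_audio: iterable of audio chunks from the hearing caller
    typed_replies:  iterable of text replies typed by the deaf or
                    hard-of-hearing user
    show_caption:   callback that displays text on the user's screen
    send_audio:     callback that plays audio back to the caller
    """
    for chunk, reply in zip(incoming_audio, typed_replies):
        # Caller speaks -> caption appears on the user's screen.
        show_caption(recognize_speech(chunk))
        # User types -> reply is read aloud to the caller.
        send_audio(speak_text(reply))

# Toy usage with stand-in data and callbacks.
relay_call(
    incoming_audio=[b"chunk1", b"chunk2"],
    typed_replies=["On my way!", "See you soon."],
    show_caption=lambda text: print("caption:", text),
    send_audio=lambda audio: print("playing", len(audio), "bytes"),
)
```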
Features such as predictive writing and instant responses are available to help users keep up with the pace of a live phone call.
Live Relay is also designed to protect users’ privacy: it runs entirely on the user’s phone, and Google does not receive or collect the content of the call.
Google AI Projects — Project Diva
Project Diva, short for DIVersely Assisted, was inspired by a Google employee’s nonverbal brother, who lives with Down syndrome, West syndrome, and congenital cataracts. It uses an external, nonverbal trigger device, such as a button, to issue Google Assistant commands, enabling people who are nonverbal to play music, movies, and more without the help of others.
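The article does not detail how Project Diva’s trigger device talks to Google Assistant, but the basic pattern, a physical button mapped to a stored voice command, is easy to sketch. The example below assumes a push button wired to GPIO pin 17 on a Raspberry Pi and uses the gpiozero library; the pin number, the stored command, and the trigger_assistant function are illustrative placeholders.

```python
# Hypothetical sketch of a Project Diva-style trigger, assuming a
# push button wired to GPIO pin 17 on a Raspberry Pi. The pin number,
# the stored command, and trigger_assistant() are all illustrative;
# the article does not describe Project Diva's actual integration.
from signal import pause
from gpiozero import Button

STORED_COMMAND = "play music on the living room speaker"

def trigger_assistant(command: str) -> None:
    """Placeholder for sending a command to Google Assistant."""
    print(f"Assistant command sent: {command}")

button = Button(17)                       # physical, nonverbal trigger
button.when_pressed = lambda: trigger_assistant(STORED_COMMAND)

pause()                                   # wait for button presses
```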