Here is How Disabled People Can Use Android Phones with Face Gestures

Google has recently announced two new tools that let people with disabilities use Android phones with face gestures. Google says that with a raised eyebrow or a smile, people with speech or physical disabilities can now operate their Android-powered smartphones hands-free.

The newly launched tools use machine learning and the front-facing camera to detect face and eye movements. Users can scan their phone screen and select a task by raising their eyebrows, smiling, opening their mouth, or looking to the left, right, or up.
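Google has not published the internals of these tools, but the general approach, on-device face detection feeding a gesture check, can be sketched with public tooling. The Kotlin snippet below is a minimal illustration that uses ML Kit face detection to flag a smile gesture in a single camera frame; the detector options, the 0.8 threshold, and the function names are assumptions for illustration, not Google's actual Camera Switches implementation.

```kotlin
// Illustrative sketch only: detects a "smile" gesture from one camera frame
// using ML Kit face detection. Not Google's actual Camera Switches code.
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Classification mode enables smiling/eye-open probabilities per detected face.
private val detector = FaceDetection.getClient(
    FaceDetectorOptions.Builder()
        .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
        .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
        .build()
)

// The 0.8 threshold is an arbitrary value chosen for this example.
fun detectSmile(frame: Bitmap, onGesture: (Boolean) -> Unit) {
    val image = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)
    detector.process(image)
        .addOnSuccessListener { faces ->
            val smiling = faces.any { (it.smilingProbability ?: 0f) > 0.8f }
            onGesture(smiling)
        }
        .addOnFailureListener { onGesture(false) }
}
```

In a real accessibility service, frames would come from a continuous camera feed and a detected gesture would drive screen scanning and selection rather than a simple callback.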

The Centers for Disease Control and Prevention estimates that 61 million adults in the United States live with disabilities, so Google and its rivals Apple and Microsoft are determined to make easy-to-use products and services for them.

“Every day, people use voice commands, like ‘Hey Google’, or their hands to navigate their phones. However, that’s not always possible for people with severe motor and speech disabilities,” Google said in a blog post.

“To make Android more accessible for everyone, we’re launching new tools. They make it easier to control your phone and communicate using facial gestures,” Google said.

One of the tools is “Camera Switches,” which lets people use their faces instead of swipes and taps to interact with their smartphones.

The other is Project Activate, a new Android application that allows people to use those gestures to trigger an action, like having a phone play a recorded phrase, send a text, or make a call.
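Google has not released code for Project Activate, and the gesture-to-action bindings are configured by the user inside the app. As a purely hypothetical sketch of the idea, the Kotlin snippet below maps recognized gestures to simple actions such as speaking a saved phrase with Android's TextToSpeech; the Gesture enum, the phrases, and the class name are invented for illustration.

```kotlin
// Hypothetical gesture-to-action dispatcher, illustrating the Project Activate
// concept of binding a facial gesture to an action such as speaking a phrase.
import android.content.Context
import android.speech.tts.TextToSpeech

enum class Gesture { SMILE, RAISE_EYEBROWS, OPEN_MOUTH, LOOK_LEFT, LOOK_RIGHT, LOOK_UP }

class GestureActions(context: Context) {
    private val tts = TextToSpeech(context) { /* init status ignored in this sketch */ }

    // Example bindings; in the real app the user chooses the gesture and the action.
    private val actions: Map<Gesture, () -> Unit> = mapOf(
        Gesture.SMILE to { speak("Yes, please.") },
        Gesture.RAISE_EYEBROWS to { speak("Could you wait a moment?") }
    )

    fun onGesture(gesture: Gesture) {
        actions[gesture]?.invoke()
    }

    private fun speak(phrase: String) {
        tts.speak(phrase, TextToSpeech.QUEUE_FLUSH, null, "activate-demo")
    }
}
```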

“Now it’s possible for anyone to use eye movements and facial gestures that are customized to their range of movement to navigate their phone — sans hands and voice,” Google said.

The free Project Activate app is available on the Google Play Store in Australia, Britain, Canada, and the United States, and it will roll out to other countries soon.

Apple, Google, and Microsoft have consistently rolled out innovations that make internet technology more accessible to people with disabilities. For instance, voice-commanded digital assistants built into speakers and smartphones enable people with sight or movement challenges to tell computers what to do.

Similarly, there is software that identifies text on web pages or in images and reads it aloud, as well as software that automatically generates captions showing what is said in videos.

Apple has also built an “AssistiveTouch” feature into the Apple Watch that lets users control the display by sensing movements such as finger pinches or hand clenches.

“This feature also works with VoiceOver so you can navigate Apple Watch with one hand while using a cane or leading a service animal,” Apple said in a post.

Similarly, Microsoft describes accessibility as essential to empowering everyone with technology tools.

“To enable transformative change, accessibility needs to be a priority. We aim to build it into what we design for every team, organization, classroom, and home,” Microsoft said.

Onsa Mustafa

Onsa is a Software Engineer and a tech blogger who focuses on providing the latest information regarding the innovations happening in the IT world. She likes reading, photography, travelling and exploring nature.
