Posted: 2024-05-16 16:00:18

Google on Thursday shared a handful of accessibility updates, many of which incorporate AI -- a centerpiece of the company's I/O developer conference this week. 

The updates coincide with Global Accessibility Awareness Day and come two days after Google announced that Gemini Nano will enable richer and clearer image descriptions in its TalkBack screen reader. Apple on Wednesday also shared several new features coming to the iPhone, the iPad and visionOS, including Eye Tracking, Vocal Shortcuts and Live Captions.

Here are Google's latest accessibility updates, which will arrive across Lookout, Maps and more.


Lookout adds Find mode

Google's Lookout app is designed to help blind and low-vision users identify objects and read documents. The new Find mode, which is rolling out in beta, offers a new way to locate specific objects. You can choose from seven categories of items, like seating and tables or bathrooms, and as you pan your camera around the room, Lookout will tell you how far away the item is and which direction to go.

Earlier this year, Google rolled out a Lookout feature globally (in English only) that offers AI-generated image descriptions for photos that users upload. Now Lookout will also give AI-powered descriptions for images captured directly within the app.

Look to Speak now lets people select emoji, symbols and photos, instead of just written phrases, to be spoken aloud. (Image: Google)

Look to Speak gets a text-free mode

Look to Speak is an app that lets people choose prewritten, customizable phrases with their eyes, which are then spoken aloud. Now the app is getting a text-free mode, which also lets you select and personalize emoji, symbols and photos. This expands the app's usefulness across cognitive differences, language barriers and literacy challenges. 

Project Gameface hands-free cursor expands to Android

Earlier this week, Google also said Project Gameface, an open-source, hands-free gaming "mouse" that lets people control a computer's cursor using head movements and facial gestures, will come to Android. Developers can access Project Gameface for Android devices on GitHub. They can then build applications that track someone's facial expressions and head movements using device cameras, and translate them into controls.
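The pipeline Google describes, reading facial-gesture scores from the camera and translating them into cursor controls, can be sketched roughly as follows. Note that the gesture names, threshold and scaling in this sketch are illustrative assumptions for demonstration, not Project Gameface's actual API:

```python
# Illustrative sketch only: maps hypothetical face-gesture scores (0.0-1.0),
# like those a face-landmark model might report per video frame, to cursor
# movement and clicks. Gesture names and thresholds are assumed, not taken
# from the real Project Gameface code.

def gestures_to_action(gestures, speed=10.0, click_threshold=0.6):
    """Translate a dict of gesture scores into (dx, dy, click)."""
    # Head turns move the cursor; right and down are positive.
    dx = (gestures.get("head_right", 0.0) - gestures.get("head_left", 0.0)) * speed
    dy = (gestures.get("head_down", 0.0) - gestures.get("head_up", 0.0)) * speed
    # An open mouth past the threshold registers as a click.
    click = gestures.get("mouth_open", 0.0) >= click_threshold
    return dx, dy, click

# Example: a slight right head turn with an open mouth.
dx, dy, click = gestures_to_action({"head_right": 0.5, "mouth_open": 0.8})
print(dx, dy, click)  # 5.0 0.0 True
```

A real application would feed this kind of mapping with per-frame scores from a face-landmark model running on the device camera, then apply the resulting deltas to the system cursor.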

Lens in Maps expands screen reader support

Lens in Maps uses AI and augmented reality to pinpoint restaurants, transit stations, ATMs and other places as you move your phone around your surroundings. Earlier this year, Google shared that its TalkBack screen reader can also speak aloud additional information about a location, such as business hours, ratings or directions. Detailed voice guidance will also let you know if you're heading in the right direction or crossing a busy intersection. These capabilities are expanding to Android and iOS globally, in all supported languages, this month. 

Accessibility information will appear in more places 

In Maps, you'll find a wheelchair icon to denote places with wheelchair-accessible entrances, with details under the About tab regarding accessible bathrooms, parking and seating. This icon is expanding from Android and iOS to desktop as well. And when searching for places on mobile, you can filter reviews to pinpoint helpful information on wheelchair accessibility. 

Find places that cast to hearing devices

Auracast broadcast audio is a capability that lets venues like theaters, gyms or places of worship share enhanced or assistive audio with people using Auracast-enabled Bluetooth hearing aids, earbuds and headphones. Now business owners can add the Auracast attribute to their business profile, allowing people who need hearing assistance to easily locate those businesses. 

Design updates to Project Relate and Sound Notifications

Sound Notifications makes it easier to save custom appliance sounds. (Image: Google)

Project Relate is an Android app that can be custom-trained on people's unique speech patterns to enable easier communication with others. Custom Cards let users customize the phrases they teach the model so it can learn important words, and now you can create Custom Cards by importing phrases and selecting text from other apps, like Google Docs.

Sound Notifications alerts people with hearing loss to "critical household sounds," like appliances beeping and water running, via push notifications, flashes or vibrations. Google says it has incorporated user feedback to improve onboarding and simplify saving custom appliance sounds. 
