Sure, Google I/O was one massive AI-palooza focused mostly on Gemini. But for Global Accessibility Awareness Day, there are a few more AI-enhanced accessibility features coming to the Look to Speak app that will let users talk by gazing at pictures, symbols, and – yes – even emojis, which could offer a new avenue for those with speech issues or literacy challenges to communicate in their everyday lives.

The Look to Speak app has been around since 2020, but for those who don't know, it's basically an eye-tracking app that lets users select from a number of phrases and words that the phone then speaks out loud. In a blog post, Google described how the existing Look to Speak app now has updates that let users choose from emojis instead. Users will be able to personalize which symbols or emojis they want to use to fine-tune these interactions. It's a pretty useful feature that makes an existing accessibility app even more accessible, even for those who don't speak English.

Along with that update, there are more features coming to a few more Google apps that will make it easier for low-vision users to find objects in their environment. The existing Lookout app should now have the ability to find what you're looking for in a room by hovering your phone's camera in the object's general direction.

Users can set what each of these emojis is supposed to relay using generated speech.

Gif: Google

The app now has access to the beta Find mode. Essentially, you choose an item from among seven categories, including "Seating & Tables," "Doors & Windows," or "Bathroom." After you select the category, you can move the phone around the room, and it should use the camera to pick out the direction and distance you are from the object or door. Lookout will also generate AI-written descriptions of images, so you can learn more about photos you take directly in the app. Google clarified with Gizmodo that Lookout uses a visual model developed by Google DeepMind, the same group working on Project Astra's AI digital assistant with similar vision capabilities.

Google Maps is also getting a fresh update that should bring Lens in Maps descriptions to all supported languages. These added languages should work with the updated voice guidance and the Lens in Maps screen reader, which the company added to Search and Chrome in October.

Last year's big accessibility update included Project Gameface, a new feature that could use facial expressions to control a cursor on-screen. The code was restricted to PCs when it was first released. Now, the feature should be available to Android developers as well, for use in mobile apps.


Need more of Gizmodo's consumer electronics picks? Check out our guides to the best laptops, best TVs, and best headphones. If you want to learn about the next big thing, see our guide to everything we know about the iPhone 16.
