Science and Tech

Google Live Caption will be able to read aloud responses written by users during calls

May 19. (Portaltic/EP) –

Google has announced a number of new features including an additional Live Caption feature, which will allow users to type responses during calls for the system to read aloud to the person they are speaking with.

The tech company has detailed what it has been working on in recent months in terms of accessibility to create a series of features “made with and for people with disabilities.”

One of the new features is related to Live Caption, a tool that uses Artificial Intelligence (AI) to provide automatic and real-time captions on Android, Chrome (on mobile and web) and Google Meet.

The most prominent addition to Live Caption allows users to type responses during calls, which the system then reads aloud to the other person.

This option is already available on newer Pixel phones, although it will be expanded, on the one hand, to the Pixel 4 and Pixel 5 and, on the other, to selected Android smartphones, including Samsung Galaxy devices.

Google has also announced that this summer it will roll out updates bringing the feature to more users, including a new caption box that optimizes the Live Caption experience on Android tablets.

Finally, it has added support for French, Italian and German, both on the Pixel 4 and 5 and on other smartphones, again including the Samsung Galaxy series.

Another novelty concerns Lookout, a tool launched in 2019 and designed with and for people with vision problems, which uses artificial intelligence to help them with everyday tasks, such as sorting mail and putting groceries away.

Lookout now includes an ‘image question and answer’ feature that processes an image and provides a description, so that users can ask the system, by voice or text, about aspects and details of what appears in it. To offer this service, Lookout relies on an advanced visual language model developed by Google DeepMind.

Google has said that this feature has already reached some visually impaired users, and it hopes to expand it to a broader group of people in the near future.

ALSO IN MAPS AND CHROME

For people with reduced mobility, Google has added an icon to Google Maps that indicates when a place has a wheelchair-accessible entrance.

The company has also noted that more information about these places can be found by tapping the ‘About’ tab, which shows whether they have accessible seating, parking spaces or bathrooms.

Finally, to improve accessibility, Google has introduced a Chrome feature that detects typos when a website address is misspelled in the address bar and suggests websites based on the corrected spelling.
