Google’s AR translation glasses are like subtitles


(Pocket-lint) – The Google I/O 2022 opening keynote saw a number of big announcements, but it was the closing tease that got a lot of people excited: a prototype pair of AR live-translation glasses.

The idea behind these glasses is that you wear them and get Google's translation as you go. We've all used Google Translate, but Google is on a mission to make that translation live, instantaneous and able to carry a conversation between two people.

The demo featured the prototype glasses, which use AR to overlay captions in your field of vision so you can read along while someone is talking to you – instantly knowing what they're saying, no matter what language they speak.

“It’s a bit like subtitles for the world,” explains product manager Max Spear in the video presentation.

The example shows a mother and her child, the mother speaking Mandarin and the child English – and immediately the language barrier disappears and they can communicate seamlessly.

The product is an extension of Google's existing work on live translation and transcription, producing written text from spoken language rather than simply translating written text.
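To give a rough sense of how that transcribe-then-translate idea fits together, here is a minimal sketch using Google's publicly available Cloud Speech-to-Text and Cloud Translation Python clients. Google hasn't published how the glasses prototype works on-device, so this is only an illustration of the concept, not the actual pipeline.

```python
# Minimal sketch of a transcribe-then-translate flow using Google's public
# Cloud clients. Illustrative only; not the glasses' on-device pipeline.
from google.cloud import speech
from google.cloud import translate_v2 as translate


def live_subtitle(audio_bytes: bytes, speech_lang: str = "cmn-Hans-CN",
                  target_lang: str = "en") -> str:
    # Step 1: turn spoken audio (16 kHz mono LINEAR16) into written text.
    speech_client = speech.SpeechClient()
    response = speech_client.recognize(
        config=speech.RecognitionConfig(
            encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
            sample_rate_hertz=16000,
            language_code=speech_lang,
        ),
        audio=speech.RecognitionAudio(content=audio_bytes),
    )
    transcript = " ".join(r.alternatives[0].transcript for r in response.results)

    # Step 2: translate the transcript into the listener's language.
    translated = translate.Client().translate(transcript, target_language=target_lang)
    return translated["translatedText"]


if __name__ == "__main__":
    # Example: Mandarin speech in, English "subtitle" out.
    with open("speech.wav", "rb") as f:
        print(live_subtitle(f.read()))
```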

Such a product opens up possibilities beyond bridging different languages: it could also help people who are deaf or hard of hearing, again using technology to remove a huge barrier.

AR glasses have been mocked in the past, with reception marred by privacy concerns and questions about whether anyone really needs a constant stream of information – but with a use case like live translation, it's easy to see the immediate benefits such a device would bring.

As Google says, this is just a prototype, but we'd love it to be a pair of augmented reality glasses that actually makes it into the real world.

Click on the video above to see it in action.

Written by Chris Hall.
