Google wrapped up its I/O presentation with one big surprise: a look at its latest AR glasses. The key feature Google showed off was the ability to see languages translated right in front of your eyes, which seems to me like a very practical application for AR glasses. While a big part of Silicon Valley is heavily invested in making AR glasses a reality, thus far no one has suggested a truly “killer” app for AR that would let you overlook the wide variety of privacy concerns inherent in the tech. Live translation of the spoken word would definitely be a killer feature.
The company didn’t share any details on when the glasses might be available, and only demonstrated them in a recorded video that didn’t actually show the display or how you would interact with them. But what was shown in the video painted a very cool picture of a potential AR future.
In one demo, a Google product manager tells someone wearing the glasses, “You should be seeing what I’m saying, just transcribed for you in real time. Kind of like subtitles for the world.” Later, the video shows what you might see if you’re wearing the glasses: with the speaker in front of you, the translated language appears in real time in your line of sight.
Until these become a real product we can try, we won’t know how well they might work in practice. And it’s unclear whether this is Google’s Project Iris product that we reported on in January or something else entirely. But Google’s vision shown at I/O, if it pans out, would be incredibly useful. You can watch the video for yourself right here.