Google previewed new smart glasses at TED 2025 featuring real-time translation and object recognition, powered by Android XR and Gemini AI.


Google Shows Off New AI-Powered AR Glasses at TED 2025

At the TED 2025 conference, Google gave the world a sneak peek at its newest prototype: a pair of AI-powered augmented reality (AR) smart glasses. While the glasses are still in the early stages and not yet ready for the public, the demo showed just how far Google has come in combining wearable tech with artificial intelligence.

Real-Time Translation and Smart Object Detection

During the live demonstration, Google's new AR glasses translated spoken Farsi into English in real time. In another test, the glasses helped the wearer find a lost hotel key card by identifying it visually. These features are powered by Gemini, Google's latest AI assistant, which is designed to work seamlessly with the Android XR platform.

The glasses are meant to make it easier for people to interact with the world around them using smart features, without having to look down at their phones.

Sleek and Lightweight Design

Unlike the original Google Glass, these new smart glasses look more like regular eyewear. They have a lightweight frame and a small embedded display in at least one lens. Because all the heavy processing is done through a connected smartphone, the glasses can remain light, comfortable, and stylish.

This design choice helps avoid the bulky and uncomfortable experience of some previous AR headsets.

No Launch Date Yet

Even though the prototype was impressive, Google hasn't announced when the glasses will be available to buy. This suggests the company is taking a cautious approach, especially after past challenges with earlier AR projects like the original Google Glass, which never reached mainstream success.

By waiting to release a fully polished product, Google hopes to avoid the same problems and create a more useful and reliable tool.

Competing in the AR Space

Google isn’t the only tech giant working on AR. Meta and Apple have also invested heavily in AR glasses and headsets. Google's new product signals that the company is still very much in the game and looking to bring AI and AR together in smart, everyday devices.

The long-term goal? A world where digital information is naturally integrated into how we see and interact with the real world—no screens, just smart vision.

FAQ

What features do Google's new AR glasses have?

They offer real-time language translation, object recognition, and a lightweight design. They work with Android XR and Gemini AI.

When will the glasses be available to buy?

Google has not set a release date. The glasses are still in the prototype stage and not ready for the public yet.

How are they different from the original Google Glass?

The new glasses look more like regular eyewear, include advanced AI features, and connect to a smartphone for processing.

Do the glasses work on their own?

No, they need to be connected to a smartphone for full functionality, which helps keep them light and portable.

What is Gemini?

Gemini is Google's advanced AI assistant. It powers features like real-time translation and object detection in the glasses.
