Thikra Blog shares smart living tips, home gadget updates, and lifestyle technology insights tailored for UAE readers.
Summary
- Google plans screen-free AI glasses in 2026 for audio, camera, and Gemini assistant tasks.
- Google’s in-lens display glasses show nav, translations, and AI info privately, but no release date yet.
- Wired XR “Project Aura” brings a tethered AR workspace next year.
Google is gearing up to launch its first pair of AI-powered smart glasses in 2026, but the version that’s likely to grab the most attention still doesn’t have a release date. In a recent blog post following up on its initial announcement at the Google I/O conference in May, the company shared more details about its Android XR (extended reality) product family, including two pairs of AI glasses.
One is a simpler, screen-free pair of AI glasses. It uses built-in speakers, microphones, and cameras to let you interact with Google’s AI assistant Gemini, take photos, and handle everyday tasks — all without the bulk of a traditional headset.
The second model is the more futuristic option that we’re really waiting for: display AI glasses. These include an in-lens display that’s visible only to the wearer, showing things like turn-by-turn navigation, translation captions, and other information layered on the real world.
While the first model has been given a 2026 launch window (no specific month yet), Google has not revealed a timeline for the display AI glasses.
An early look at Google’s display AI glasses
Early clips highlight captions, directions, and on-the-go search
A brief preview of the display AI glasses in the new blog post shows a woman using the glasses in her everyday life. First, she uses the glasses to translate a sign at a dance studio, then asks Gemini to tell her what dance the class inside is learning, getting an immediate response in both instances.
Next, she has the glasses provide live translation at a restaurant where she doesn’t speak the language, securing a seat at the bar without hassle.
The display AI glasses are designed to overlay helpful information directly into your line of sight, giving you turn-by-turn navigation, live translation captions, and quick access to AI-powered search without looking at a phone. They aim to make digital assistance more seamless and private, letting you interact with information without others knowing, provided you can manage to wear a pair without telling everyone about it.
Wired XR glasses are also on the way
See digital content and the room at the same time
Google also previewed its wired XR glasses, which mix headset-style immersion with a view of the real world. Project Aura from XREAL is the first device in this category, with a 70-degree field of view and optical see-through tech that layers digital content over your surroundings.
Basically, it’s like having a private digital workspace wherever you go. You could follow a floating recipe while cooking, keep an eye on step-by-step guides while fixing something, or just multitask without blocking out the room around you. You’ll still be tethered to a laptop or another device that supplies the content, though. It’s more of a VR headset that looks like glasses.
Project Aura is expected to launch next year, though details are still limited.
Note: All product names, brands, and references in this post belong to their respective owners.
For more smart home guides, lifestyle tech updates, and UAE-focused recommendations, visit blog.thikra.co.
To shop smart gadgets, accessories, and lifestyle electronics, explore Thikra Store at thikra.co.




