Google’s long-teased Project Aura, the AI smart glasses developed in partnership with Xreal, is finally taking clearer shape, and it might redefine how Android approaches extended reality.

First shown briefly at Google I/O, the device is officially described as wired extended reality (XR) glasses and is set to become the second Android XR product, after Samsung’s Galaxy XR headset, when it arrives in 2026.

Calling them “smart glasses” doesn’t quite do them justice. During a recent hands-on demo, the prototype looked like an oversized pair of sunglasses, complete with a cable running to a compact battery pack that doubles as a trackpad. Google representatives were quick to clarify that Aura isn’t meant to mimic traditional AR glasses; instead, it’s a lightweight headset disguised as eyewear.

Up to a 70-degree field of view

Once powered on, the glasses create a virtual workspace with up to a 70-degree field of view. Users can wirelessly link Aura to a laptop and spread apps around them as if working across multiple monitors. In one demo, Lightroom opened on a virtual desktop while YouTube played in a separate floating window. A tabletop 3D game allowed pinch-and-pull gestures, and looking at a real-world painting triggered Circle to Search, where Gemini identified the artwork in seconds.

The experience resembles the Galaxy XR or Vision Pro more than true augmented reality. Digital windows appear in your environment, but the world isn’t overlaid with contextual AR information. The difference is that Aura feels far less obtrusive, something you could theoretically wear outdoors without drawing attention.

Google confirmed that everything shown so far runs on apps originally built for Samsung’s Galaxy XR. Not a single experience had to be redesigned, which could be a breakthrough for the XR ecosystem.
One of the biggest hurdles for platforms like Meta’s Ray-Ban Display or Apple’s Vision Pro has been the lack of third-party app support. Android XR aims to solve this by letting manufacturers like Xreal tap into existing Android and Galaxy XR-compatible apps, drastically reducing fragmentation and lowering barriers for developers.

Xreal CEO Chi Xu sees this as a turning point. “Developers don’t have to choose sides anymore,” he said. “Android XR means the ecosystem starts converging.”

Google also demonstrated a set of prototype AI glasses. In one scenario, the glasses displayed an Uber widget triggered by the regular Android app; no special XR version was needed. Looking down revealed a live airport map with directions to the pickup point. YouTube Music controls popped up when prompted through Gemini, and photos taken via the glasses could immediately appear on a connected Pixel Watch.

More advanced demos

There were more advanced demos, too: live language translation, Google Meet calls, 3D YouTube playback, and even a dual-display prototype offering a wider field of view. A quirky AI app called Nano Banana Pro added K-pop-style decorations to photos taken during the demo.

Perhaps the most surprising reveal is that next year’s Android XR glasses will support iOS. As long as an iPhone user has the Gemini app installed, they’ll still get the full multimodal Gemini experience on the glasses.

Google says most of its core apps, including Maps and YouTube Music, will work smoothly on iOS, with limitations affecting mainly third-party apps.

This cross-platform strategy could become one of Google’s biggest advantages, especially as Meta opens its APIs and Apple remains tightly locked into its own ecosystem.

New prototypes developed with hardware partners

Google is also trying to avoid repeating the missteps of Google Glass.
The new prototypes are more discreet, developed with hardware partners rather than built alone, and will launch with real apps from day one. Safety and privacy are priorities: if the glasses start recording, a bright pulsing indicator will alert people nearby, and the camera switch features clear red-and-green markings to prevent misunderstandings.

“There will be strict controls over sensor access,” said Google’s XR product director Juston Payne, noting that Gemini and Android’s existing security frameworks carry over to Aura.

There’s still a lot left unknown, including pricing, battery life, and how refined the final design will be. But with Project Aura, Google appears to be taking a more cautious, ecosystem-first approach. If the company can deliver on its promises, these glasses could represent the most credible push yet toward mainstream, everyday XR.