Michael Abrash described how future Meta glasses will have always-on "contextual AI", and Mark Zuckerberg thinks this will arrive in less than 5 years.

Meta Connect 2025 took place last week, and you can read about all the near-term product announcements here. Most years though, including this year, Meta's Reality Labs Chief Scientist Michael Abrash gives a talk about the further-off future of AR & VR, including making predictions.

Abrash's most famous (or perhaps infamous) predictions were made in 2016, at Oculus Connect 3, when he laid out the exact resolution and field of view he believed VR would reach by 2021, and said that he thought this would come with variable focus.

(Related: "4K Headsets, ‘Perfect’ Eye-Tracking, and ‘Augmented VR’: Oculus’ Abrash Predicts VR in 2021" – UploadVR, Jamie Feltham)

This year, Michael Abrash gave half of the talk, joined by Richard Newcombe, VP of Reality Labs Research, for the rest.

During his half of the talk, Abrash began by reflecting on those 2016 predictions. While high-end consumer headsets reached 4K last year, undistorted wide field of view remains in the realm of $10,000 enterprise headsets and research prototypes.

"The nine years that have passed since then provide fresh confirmation of Hofstadter's law", Abrash joked.

Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's law.

For this year's predictions, Abrash did not speak of display system specifications, nor of hardware details at all. Instead, he described where he sees the AI assistant on smart glasses going.

Today, the Meta AI on smart glasses is reactive and mostly transient. You issue it commands, such as to play a song or set a timer, or ask it questions. If a question seems related to what you see, such as "what is this?", it will use the camera to capture an image and analyze it to respond.

In the US & Canada there's also a Live AI feature, which lets you have an ongoing conversation with Meta AI without having to keep saying "Hey Meta", and the AI gets a continuous stream of what you're seeing. But this is still limited by the context window of the underlying large language model, and it will drain the battery of the first-generation Ray-Ban Meta glasses within around 30 minutes, or around an hour for the new generation.

According to Abrash, AI-capable smart glasses will eventually evolve to the point where the AI is always running in the background. Further, the glasses will continuously build a dynamic 3D map of your environment and of your movements and actions within it, including the objects you interact with. They will store a log of these actions and interactions, and use it to provide "contextual AI", he says.

For example, you could ask "how many calories have I consumed today?", or "where did I leave my keys?". And without needing to have logged anything in advance, the AI will be able to answer – as long as you were wearing the glasses at the time.

(Video: Michael Abrash on how future glasses will deliver always-on contextual AI.)
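To make the idea concrete, here is a toy sketch in Python of the kind of queryable perception log Abrash describes. Everything here is invented for illustration (the Event and ContextLog names, the record/last_seen/calories_today methods); it is not Meta's actual architecture. The point is simply that "contextual AI" implies a persistent, timestamped log of what the glasses perceived, which the assistant can search after the fact rather than relying on anything you deliberately recorded:

```python
from dataclasses import dataclass
from datetime import datetime, date

# Hypothetical data model (illustration only, not Meta's design): the glasses
# passively emit timestamped events from semantic object recognition and the
# live 3D map, and the assistant answers questions by querying the log.

@dataclass
class Event:
    timestamp: datetime   # when the glasses observed this
    object_label: str     # semantic object recognition output, e.g. "keys"
    action: str           # e.g. "placed", "picked_up", "ate"
    location: str         # a place in the live 3D map, e.g. "hallway shelf"
    calories: int = 0     # estimate attached to "ate" events

class ContextLog:
    """Append-only log the glasses would build passively, all day."""

    def __init__(self):
        self.events: list[Event] = []

    def record(self, event: Event):
        self.events.append(event)

    def last_seen(self, object_label: str) -> str | None:
        """Answers 'where did I leave my keys?': most recent sighting wins."""
        for event in reversed(self.events):
            if event.object_label == object_label:
                return f"{event.location} (at {event.timestamp:%H:%M})"
        return None

    def calories_today(self) -> int:
        """Answers 'how many calories have I consumed today?'"""
        today = date.today()
        return sum(e.calories for e in self.events
                   if e.action == "ate" and e.timestamp.date() == today)

log = ContextLog()
log.record(Event(datetime.now(), "keys", "placed", "hallway shelf"))
log.record(Event(datetime.now(), "sandwich", "ate", "kitchen", calories=450))
print(log.last_seen("keys"))    # e.g. "hallway shelf (at 09:14)"
print(log.calories_today())     # 450
```

The lookup itself is trivial; the hard part is generating those events continuously and on-device, within a glasses-sized power budget, which is exactly where the engineering challenges lie.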
This will require significant improvements in the power efficiency of the chips and algorithms used for realtime 3D environment meshing, body tracking, and semantic object recognition. It will probably even need custom sensors and chips, both of which Meta Reality Labs Research is working on. For practicality, it might also require the glasses to have their own cellular connection, rather than relying on your phone.

But it shouldn't require any fundamental breakthrough. The current rate of advancement in these technologies already looks set to make the future Abrash describes possible.

In an interview with Rowan Cheung, Mark Zuckerberg also talked about the idea of always-on contextual AI. But while Abrash did not give a timeline, Zuckerberg did.

(Video: Mark Zuckerberg: glasses will have always-on AI in less than 5 years.)

"I'm not sure how long it's gonna take to get to that. I don't think this is like five years. I think it's gonna be quicker", Zuckerberg remarked.

Of course, a comprehensive log of your actions and interactions throughout your daily life could also be immensely useful for Meta's core business model: targeted advertising.

Zuckerberg noted that such a feature would be optional. For those who enable it, the upside could essentially be a high-IQ personal assistant with full context of your life, ready to assist reactively and proactively at all times. But it would also come with significant privacy concerns, both for the wearer and for people nearby. What else will Meta do with the data? Would this kind of always-on sensing of you and the world keep the LED on the front of the glasses illuminated? Meta will need to build strong trust before significant numbers of people accept this level of data collection.