Meta's lineup of smart glasses could soon gain a lot more capabilities. The company will begin allowing outside developers to bring their apps to its Ray-Ban and Oakley smart glasses, Meta announced on the second day of its Connect event.

Until now, Meta has offered only a limited number of third-party integrations for its glasses, with apps like Spotify and Audible. But Meta will now let developers start experimenting with apps that take advantage of the glasses' built-in sensors and audio capabilities. This means other companies will be able to create their own custom experiences that use Meta's multimodal AI features.

The company is already working with a set of early partners, including Twitch, which is creating livestreaming capabilities for the glasses, and Disney, which is experimenting with an app for use inside its parks. A demo video shows a visitor walking around Disneyland and asking the AI assistant about the rides she's seeing and other park information. 18Birdies, a golf app, is working on an integration that can give players club recommendations and yardage stats.

Notably, these apps all appear to work with Meta's non-display glasses, which means even people who own first-gen Ray-Ban Meta glasses could see plenty of new functionality. It's not clear whether the company will also let developers build experiences that take advantage of the display on its newest Meta Ray-Ban Display frames, but that could open up even more possibilities.

Meta's new set of tools, officially called the "Wearables Device Access Toolkit," will roll out as a limited developer preview ahead of broader availability in 2026.

This article originally appeared on Engadget at https://www.engadget.com/wearables/meta-will-let-outside-developers-create-ai-powered-apps-for-its-smart-glasses-194159233.html?src=rss