After launching its new generation of iPhones earlier this month, Apple is expanding its Apple Intelligence ecosystem. The company has rolled out the Foundation Models framework, a new tool that enables developers to integrate on-device AI features directly into their apps. Apple said that with iOS 26, iPadOS 26, and macOS 26 now available worldwide, developers are using the new framework to create offline features that are smarter, privacy-focussed, and available at no extra cost to users.

“We’re excited to see developers around the world already bringing privacy-protected intelligence features into their apps,” said Susan Prescott, Apple’s vice president of Worldwide Developer Relations. “The experiences they’re creating show just how much opportunity the Foundation Models framework opens up.”

The Cupertino-based tech giant said the framework was developed to make it easier for developers to build experiences backed by the large language model at the core of Apple Intelligence. These experiences range from personalised journaling prompts to detailed summaries of workouts, and even conversational explanations of scientific concepts.

The new Foundation Models framework is allowing developers to create smarter fitness and health apps, and the company said a number of such apps are already tapping into it. One such app is SmartGym, which lets users describe a workout in natural language and instantly transforms it into a structured routine. Another feature of the app, Smart Trainer, now offers clear explanations for adjustments, be it changing weights or suggesting new exercises. The app also generates monthly progress reports, personalised coaching messages, and greetings based on real-time fitness data.
“The Foundation Models framework enables us to deliver on-device features that were once impossible,” said Matt Abras, SmartGym’s CEO.

The journaling app Stoic, meanwhile, uses the framework to create personalised prompts that respond to a user’s mood or sleep patterns. If someone logs a bad day, Stoic can generate supportive reflections and reminders, all while ensuring private data never leaves the device. “Features that once required heavy back-end infrastructure now run natively on devices with minimal setup,” said founder Maciej Lobodzinski.

Swing Vision, another app in the wellness space, analyses tennis and pickleball games to give tailored feedback, while the 7 Minute Workout app generates personalised routines and motivational feedback. The Gratitude app now creates weekly summaries of journal entries and affirmations, and Train Fitness adjusts exercise plans if certain equipment is unavailable.

Big push for learning, creativity and productivity

With the new framework, education apps are also seeing gains. CellWalk, an interactive biology app, allows students to explore 3D cell structures and tap terms for conversational explanations. Using the framework, CellWalk can adapt explanations to a learner’s knowledge level while retaining history to reinforce understanding. “Our visuals have always been interactive, but with the Foundation Models framework, the text itself comes alive,” said developer Tim Davison.

Meanwhile, language-learning app Grammo has integrated the framework to offer instant feedback on grammar exercises and generate new questions on the fly. Children’s app Lil Artist now creates customised illustrated stories, while Vocabulary organises saved words into smart categories. Education platform Platzi now offers quick, conversational answers to questions about course content in real time.
Apple has revealed that productivity apps are also deploying the framework to simplify everyday tasks. Task manager Stuff can now recognise dates, tags, and lists automatically as users type, and can even turn spoken or handwritten notes into organised tasks. Video-editing app VLLO combines Apple’s Vision framework with the Foundation Models framework to automatically suggest music and stickers tailored to each scene. Similarly, apps like Signeasy and Agenda are using the framework to generate document summaries or search note libraries with natural language.

Easy for developers to build

The Foundation Models framework is deeply integrated with Apple’s Swift programming language, allowing developers to call the on-device three-billion-parameter model directly from their code. It supports structured outputs and tool calling, so apps can ensure reliable results while still protecting user data.

The framework works on any Apple Intelligence-compatible device running iOS 26, iPadOS 26, or macOS 26. For developers, it lowers the barrier to building AI-powered features; for users, it promises new levels of personalisation and productivity without compromising privacy.
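To give a sense of what calling the model looks like in practice, here is a minimal Swift sketch built on the types Apple has published for the framework (`LanguageModelSession`, the `@Generable` and `@Guide` macros, and `SystemLanguageModel` for checking availability). The `WorkoutRoutine` type and `makeRoutine` function are illustrative names invented for this example, not code from SmartGym or any of the apps mentioned above, and the snippet only runs on an Apple Intelligence-compatible device on OS 26.

```swift
import FoundationModels

// Structured output: marking a type @Generable asks the on-device
// model to fill in a typed Swift value instead of free-form text.
@Generable
struct WorkoutRoutine {
    @Guide(description: "A short name for the routine")
    var name: String

    @Guide(description: "Exercises in the order they should be performed")
    var exercises: [String]
}

enum RoutineError: Error {
    case modelUnavailable
}

// Turn a natural-language workout description into a structured routine.
func makeRoutine(from description: String) async throws -> WorkoutRoutine {
    // The model is only present on Apple Intelligence-compatible
    // devices running iOS 26, iPadOS 26, or macOS 26.
    guard case .available = SystemLanguageModel.default.availability else {
        throw RoutineError.modelUnavailable
    }

    // A session holds instructions plus conversation context.
    let session = LanguageModelSession(
        instructions: "Convert workout descriptions into structured routines."
    )

    // Everything runs on device; the prompt never leaves the hardware.
    let response = try await session.respond(
        to: description,
        generating: WorkoutRoutine.self
    )
    return response.content
}
```

Because the response is decoded straight into `WorkoutRoutine`, the app never has to parse model text by hand, which is what the article means by structured outputs ensuring reliable results.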