Meta will release a selection of its AI models under an open-source license. The models in question come under the purview of Meta's chief AI officer, Alexandr Wang.

Known for founding the data labeling and annotation company Scale AI as a 19-year-old MIT student, Wang joined Meta in June 2025 to lead the organization's Meta Superintelligence Labs (MSL). Keen to democratize access to US-built AI technologies for software engineers, Wang extends Meta's already heavyweight track record in open-source AI platform development.

Meta's open AI pedigree

The company, of course, is already well known for its preexisting work in this open AI space. This includes the Llama ecosystem with its open-source AI framework, which started as a comparatively humble single Large Language Model (LLM); the initial development of PyTorch, an open-source machine learning framework for building and training neural networks; the creation of React, a JavaScript library for building user interfaces with a declarative coding style; and its wider work founding the Open Compute Project back in 2011.

Although no release date has been set for the new open release, Meta's moves speak volumes about its wider AI strategy. So why do this now?

"Our open source codebases grew at an impressive pace, reaching 189,719 total commits in just one year." – Meta Engineering

The swinging pendulum of Meta openness

The pendulum has swung both ways regarding how successful Meta has been so far in terms of AI openness. Initially known for being stridently open, the Facebook parent has traditionally enabled widespread open interaction through its frontier models. For 2024 (the latest figures available), the Engineering at Meta blog noted, "[The organization's] open source codebases grew at an impressive pace, reaching 189,719 total commits in just one year.
Community contributors accounted for 71,018, while Meta employees made the remaining 118,701."

With the specter of so-called "openish" AI creeping up on the company (along with delayed access and feature stripping in open versions), Meta's project Avocado has reportedly shifted toward a more proprietary stance, eschewing its open-ecosystem siblings.

Proprietary precautiousness

Wang will no doubt have taken all these factors into account before deciding to open-source a selection of new AI models.

The mooted group of models could arguably exhibit some of the "openish" taint of other Meta projects, and Zuckerberg himself will want to reduce user safety risk at every level to keep himself out of the courtroom. It was just last month, in March 2026, that Meta claimed social media addiction isn't real, only for juries to disagree. Even as a CEO, a developer at heart only has so many suits and clean white shirts, after all.

From a software engineer's perspective, Sid Vangala, senior AI systems engineer at MasTec, tells The New Stack that Meta's decision to open-source parts of its upcoming AI models is fundamentally an ecosystem strategy.

"By lowering access barriers, Meta can accelerate developer adoption, shape standards, and drive infrastructure dependence on its tooling," says Vangala. "Unlike fully closed models from companies like OpenAI or Anthropic, this approach trades short-term control for long-term influence. The implication for enterprises is increased flexibility, as well as new governance responsibilities for model security, provenance, and responsible deployment."

Open for real, or just a re-hash?

Professor Amanda Brock, CEO of OpenUK, is rarely without an opinion on the number of open models circulating around the planet at any given time.
Brock tells The New Stack that, in her view, the most notable factor here is Meta's "plans to eventually" open-source the models.

"We need to understand what Meta is really planning to do here and what the company means by saying it will open-source the technologies," Brock says. "If it's a re-hash of the commercially restricted 'Llama Community License', then it's not open-source by any rational person's understanding of the term. In fact, that flawed approach has likely been behind Meta's previously rumored stepping back from committing to wider open-sourcing."

Brock reminds us that open-source isn't a magic word; it's the long-established formula for making something open successful that matters. She suggests that, just maybe, Alexandr Wang has persuaded Zuckerberg that truly open-sourcing its AI is not just about doing the right thing; it's about how to actually build traction, as Chinese models such as DeepSeek have done.

Jason Corso, co-founder and chief scientist at Voxel51 and Toyota Professor of Artificial Intelligence at the University of Michigan, agrees that Meta has been a leader in open-source and open-weights models, many of which have spurred innovation. However, Corso tells The New Stack, open-weight models obscure aspects of their training, leading to blind spots when they are used. "This creates risks for both Meta and model adopters, and it will be interesting to see how Meta addresses this problem differently," says Corso.

Meta's consumer-seeding strategy

According to Ina Fried, writing on Axios, "Wang sees Anthropic and OpenAI as increasingly focused on delivering their models to governments and the enterprise. By contrast, Meta's effort is focused on consumers, per sources.
Meta wants its models distributed as widely as possible around the world."

Wang will also be weighing when OpenAI and Anthropic release new models; his decision tree may be more complex than we have outlined here.

Looking to the immediate future, and even with Wang's new selection box on offer when it arrives, it would be rational to think of Meta as less open than it used to be in terms of model access. The time may be right for a new hybrid approach in which developer access is championed, but within more carefully defined and governable perimeters and thresholds.

The post Open-source leaders question whether Meta's Alexandr Wang will truly give away its AI models appeared first on The New Stack.