Fresh data has us asking, does AI demand Kubernetes?


Kubernetes is becoming the de facto operating system for AI. Two-thirds of organizations running generative AI models use Kubernetes for inference, while production use of Kubernetes has hit a whopping 82%.

From Kubernetes to Kubeflow and beyond, this open infrastructure is what's enabling organizations to truly build, scale, and own their AI systems. That's the power of community-driven innovation.

Reflecting on the impact of a global cloud-native developer community that has grown to 19.9 million developers, we sat down with Bob Killen, senior technical program manager at the Cloud Native Computing Foundation (CNCF), and Liam Bollmann-Dodd, principal market research consultant at SlashData, on the expo floor of KubeCon + CloudNativeCon, the biggest event yet, this March in Amsterdam.

In this episode of The New Stack Makers, we dived into the results of two hot-off-the-presses Q1 2026 CNCF-SlashData research collaborations: the State of Cloud Native Development and the CNCF Technology Radar Report.

The state of the cloud-native ecosystem, and of tech in general, is not surprising. Just as before, success and return on investment with AI hinge on engineering best practices, which are grounded in both the internal developer platform and the developer experience, which in turn shape each other.

And, since coding was never the bottleneck, AI-generated code is making the real bottleneck, the short-staffed, stretched-thin work of DevOps, reliability, and security, much worse. Operator experience is finally a top concern at most organizations in 2026. And guardrails are the only way to go fast safely.

"The kind of safety with AI is making things better and worse at the same time," Dodd tells The New Stack. "One of the approaches you can take is the developer platform or other internal tooling, where you can prevent people from being dangerous to themselves; you can control everything at your end. All security is handled by someone who actually understands how it works. All the pipelines are built by people who actually know how pipelines work."

Of course, the majority of orgs are onboarding non-human developers too. And what's good for junior developers is good for AI too.

"The AI developer, whether they are super competent, medium competent, upskilled or downskilled, you can basically just say they cannot destroy our systems. They are locked into what they do, and therefore you can let them be a bit more dangerous because they can't actually break things," Dodd continues, referring to developers suddenly rebranding as AI developers, or agentic AI developers.

In light of this, Killen reflects on changing team sizes in the face of AI. "There's been a shift in DevOps and platform engineering, where it used to be smaller teams, where people worked on both dev and ops," he observes. "Now we've seen the switch to larger teams that are focused on platform engineering, providing services for their internal teams to enable the teams internally," as described in Team Topologies.

No matter what, the pace of AI is hitting the massive Cloud Native Landscape, and the tech industry is faced with more complexity than ever. But when it comes down to it, open source success still hinges more on people and processes than on tech, and the future of our industry, and of AI, still hinges on open source.