OpenAI’s Latest Thing It’s Bragging About Is Actually Kind of Sad


Despite promising to allocate hundreds of billions of dollars for the build-out of enormous data centers, the AI industry has struggled to keep up with its lofty ambitions. According to recent reporting by Bloomberg and Ed Zitron, roughly half the data centers slated to open in the United States are either being delayed or canceled outright. Massive electrical component shortages and soaring costs have slowed the infrastructure boom to a trickle, frustrating tech leaders.

In the resulting morass, every AI player has been making big claims to try to keep the hype train going. But OpenAI in particular has turned braggadocio into an art form, and its latest boast rings a bit sad: in a memo obtained by Bloomberg, the company bragged that it's planning to have 30 gigawatts' worth of compute (enough to power over 22 million US households) by 2030, while its rival Anthropic is only planning for seven to eight gigawatts by the end of 2027.

To put those numbers into perspective, OpenAI had just 1.9 gigawatts of computing capacity in 2025, while Anthropic had 1.4 gigawatts.

"Even at the high end of that range, our ramp is materially ahead and widening," OpenAI's memo reads.

The ChatGPT maker also argued that it was outpacing Anthropic by "rapidly and consistently" adding computing capacity.

"That gap matters because compute is now a product constraint," the memo reads.

In other words, OpenAI's big brag isn't about some amazing new technical breakthrough, like one that could achieve impressive AI with far less computing power. Instead, it's simply that the company is building mountains of data centers to overwhelm the competition through brute force.

The timing of OpenAI's memo is telling. It was sent out just after Anthropic showed off its latest AI model, Claude Mythos, which its staffers allege is so powerful that it poses too great a cybersecurity risk to be released in full.

In a statement to Bloomberg, Anthropic struck back at OpenAI, pointing to a recent deal it had struck with Broadcom and Google as part of "our disciplined approach to scaling infrastructure."

"We are making our most significant compute commitment to date to keep pace with this unprecedented growth," the statement reads.

OpenAI, on the other hand, has been far more brash in its plans to dominate the industry, claiming it will spend a whopping $600 billion on AI infrastructure through 2030 (which, it's worth noting, is less than half of what it originally promised to spend).

It's a precarious inflection point for the Sam Altman-led company. As the walls continue to close in and investors become increasingly antsy ahead of its rumored blockbuster IPO, OpenAI has consolidated its plans to pursue many of the same goals as Anthropic.

The central premise: the more compute, the more powerful the AI.

"Each new generation of infrastructure lets us train more capable models, making every token more intelligent than the one before," OpenAI wrote in its memo. "At the same time, algorithmic gains and hardware improvements reduce the cost to serve each token, lowering the cost per unit of intelligence."

More on OpenAI: First Model From Zuckerberg's Wildly Expensive Superintelligence Labs Flops Compared to Virtually All Rivals