Michael Burry — the contrarian who made a name calling the housing bubble — is back in the ring, this time with a target squarely on the AI boom. In a flurry of Substack posts and a public debate with Anthropic cofounder Jack Clark and podcaster Dwarkesh Patel, Burry laid out a stark scenario: hyperscalers and AI hopefuls are pouring trillions into chips and datacenters that may never earn their keep.
He’s not talking in abstractions. Burry has put real money where his mouth is, taking short positions against Nvidia and buying puts on companies he believes are chasing the wrong kind of growth. The argument is simple and sharp: when everyone spends to avoid falling behind, nobody gains a durable advantage — and the industry can be left with obsolete, capital-hungry assets.
Why Nvidia, and not Microsoft or Meta?
Burry calls Nvidia “the purest play” on AI. The chipmaker’s GPUs are the backbone of training large models, and its sales have ballooned as customers race to build or rent AI compute. That concentration makes Nvidia uniquely exposed if hyperscaler demand slows. By his math, Nvidia could sell roughly $400 billion of chips this year while, by his estimate, less than $100 billion of application-layer use cases exist to pay for them. If that gap doesn’t close (and he says he doesn’t see how it can), the downside could be sharp.
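To make that back-of-the-envelope explicit, here is a minimal sketch using the figures Burry cites; both numbers are his public estimates, not reported financials, and the framing of a "coverage ratio" is an illustrative assumption, not his terminology.

```python
# Back-of-the-envelope sketch of Burry's supply-vs-demand argument.
# Both figures are his stated estimates, not reported financials.

chip_spend = 400e9          # estimated Nvidia chip sales this year
app_layer_revenue = 100e9   # his upper bound on application-layer use cases

gap = chip_spend - app_layer_revenue
coverage = app_layer_revenue / chip_spend

print(f"Spend unmatched by end demand: ${gap / 1e9:.0f}B")
print(f"Application revenue covers only {coverage:.0%} of chip spend")
# Spend unmatched by end demand: $300B
# Application revenue covers only 25% of chip spend
```

On those assumptions, roughly three dollars of every four spent on chips would need future demand that does not yet exist, which is the asymmetry his short position leans on.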
Contrast that with Microsoft, Meta, or Alphabet. Burry argues shorting them would mean betting against big, revenue-generating franchises like Office, social advertising, or Google Search — businesses that can better absorb an AI hiccup. That’s why he’s focused his bearish bets on those he sees as most levered to raw AI infrastructure.
The Buffett escalator and the race to build
To explain the dynamics, Burry resurrected an old Warren Buffett anecdote about two department stores that each installed an escalator simply because the other did. The upgrade didn’t meaningfully change either store’s competitive standing; it just consumed capital. Burry sees hyperscalers and enterprises doing the same with GPUs and data centers: spending because their peers spend, not because the economics are proven.
There’s also an energy dimension. Building out AI capacity is not just servers and racks; it’s power plants, transformers, and grid upgrades. Burry has suggested larger investments, even small modular nuclear reactors, would be needed to meet projected demand. The point is practical: compute doesn’t matter without power, and power constraints could turn a speculative hardware boom into stranded assets. For a sense of how creative companies are getting with data-center strategy, Google has floated ambitious ideas like putting AI data centers in orbit to solve real-estate and cooling limits (Google’s Project Suncatcher), an example of the lengths firms will go to chase capacity.
Accounting, depreciation and the risk of writedowns
Burry has accused hyperscalers of stretching depreciation schedules to mask the real cost of their buildouts, essentially smoothing pain into the future. If equipment becomes obsolete faster than expected, companies could face meaningful writedowns, and those headline-grabbing growth rates will look different under fresh accounting. That possibility is one reason he’s shorting hardware-heavy players rather than diversified software giants.
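As an illustration of the accounting lever Burry is pointing at, the sketch below compares straight-line depreciation under a shorter versus a longer assumed useful life. The $100 billion outlay and the 4- versus 6-year lives are hypothetical numbers chosen only to show how extending the schedule shrinks the annual expense hitting earnings; they are not taken from any company’s filings.

```python
# Illustrative only: how stretching an assumed useful life lowers the
# annual depreciation expense booked on the same buildout.
# The outlay and useful lives below are hypothetical.

capex = 100e9  # hypothetical GPU / data-center outlay

def straight_line(cost: float, useful_life_years: int) -> float:
    """Annual straight-line depreciation expense."""
    return cost / useful_life_years

for life in (4, 6):
    expense = straight_line(capex, life)
    print(f"{life}-year life: ${expense / 1e9:.1f}B expense per year")

# 4-year life: $25.0B expense per year
# 6-year life: $16.7B expense per year
# The longer schedule defers roughly $8.3B of expense each year;
# that deferred pain resurfaces as writedowns if the hardware ages out sooner.
```

The mechanism is simple: a longer assumed life spreads the same cost over more years, flattering near-term margins, but the bill comes due all at once if the gear has to be retired early.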
The broader stakes: jobs, utility and hype
His critique isn’t purely financial. Burry warns the AI buildout could leave employment lower than expected if companies opt for automation and follow-on cost-cutting after the dust settles. He also frets about “AI brain rot”: professionals over-relying on models and letting their skills atrophy, a fear supported by emerging research on how low-quality AI outputs can degrade reasoning over time (Study Warns ‘AI Brain Rot’).
And Burry sees a cultural mismatch: businesses buying into a robotized future the way people once bought into fiber and dot-com dreams. Chatbots and assistants are powerful, but their real-world monetization and productivity gains aren’t guaranteed. That leaves room for a classic boom-and-bust, where infrastructure outpaces demand.
Pushback and the counterarguments
Unsurprisingly, some of Burry’s targets have pushed back. Nvidia has argued that demand for specialized compute remains robust and that its roadmap supports sustained pricing power. Others caution that AI’s potential is broader and stickier than past infrastructure waves: models can embed across health care, finance, and enterprise software in ways that do produce durable revenue.
There’s also a debate about whether the AI moment is more like electricity, a foundational technology with long-term benefits, or more like an overbuilt, cyclical industry. Recent developments in AI capabilities, and Google’s renewed push with Gemini tools that integrate deeply into Gmail and Drive, show real productization ramping and complicate any simplistic bubble narrative (Gemini’s Deep Research integration). Still, potential does not guarantee tidy returns for the companies fronting the enormous capital tab.
Timing is everything — and dangerous
One of the more human lessons here is that being right too early can be ruinous. Contrarian bets have a history of working out on a long enough timeline — but only if capital survives until the thesis plays out. Burry’s history gives his warnings weight, but markets can stay irrational longer than investors expect. That tension is why his stance is both provocative and perilous.
He’s not predicting instant collapse; rather, he’s betting on a prolonged period where the true costs of the AI buildout reveal themselves. Whether that becomes a slow reckoning of write-offs and restructured priorities or a quicker market unwind remains the open question.
Markets are pricing conviction into a handful of winners today. Burry is betting that conviction is misplaced. If he’s right, the next few years could be as much about clearing excess capacity as they are about breakthroughs — a messy, expensive phase of correction rather than a clean technological leap.