Nvidia is accustomed to being the company others must explain themselves against — not the other way around. But on Tuesday, the $4 trillion chipmaker made a highly unusual move: it publicly defended its dominance on X after a report suggested that one of its biggest customers, Meta, may shift part of its AI infrastructure to Google’s in-house chips.
Google’s resurgence — powered by its widely praised new Gemini 3 model and growing interest in its TPU hardware — sent Alphabet shares climbing and triggered a rare moment of insecurity from Nvidia: its stock dipped more than 2.5% as investors reevaluated whether the company’s GPU monopoly might, for the first time, face a real competitor.
Nvidia Breaks Character With Public Defense
The spark came from an exclusive report in The Information stating that Google has been pitching its custom Tensor Processing Units (TPUs) to cloud customers — including Meta and several large financial institutions — for use inside their own data centers, not just through Google Cloud.
That would represent a major escalation in Google’s hardware ambitions, and it rattled markets enough that Nvidia decided to respond.
“We’re delighted by Google’s success… Nvidia is a generation ahead of the industry — it’s the only platform that runs every AI model and does it everywhere computing is done,” the company wrote on X.
On its face, the message was congratulatory. Between the lines, it was unmistakably defensive — a reminder to investors that while Google’s TPUs are gaining traction, Nvidia’s GPUs are still the universal standard for AI.
Why Google Suddenly Matters Again
For nearly a decade, Google’s TPUs were seen as specialized tools — fast and optimized, but too narrow to threaten Nvidia’s GPUs. Nvidia has repeatedly emphasized GPUs’ flexibility, calling them the “general-purpose backbone” of AI, whereas TPUs are ASICs: application-specific chips tailored to particular workloads.
But that narrative cracked when Google revealed that Gemini 3 was trained entirely on TPUs — and that the model, by many benchmarks and early industry reactions, is one of the strongest foundation models ever released.
Salesforce CEO Marc Benioff called Gemini 3 an “insane leap.” Former Tesla AI director Andrej Karpathy said it had “tier-1 LLM” potential. And Stripe’s Patrick Collison praised its performance in scientific reasoning.
The possibility that Meta — one of the world’s largest GPU buyers — may integrate TPUs into its own infrastructure was enough to signal that TPUs are no longer a curiosity. They are a competitive threat.
As bearish portfolio manager Brian Kersmanc recently warned:
“Arguably the most successful AI company now, Google, didn’t even use GPUs to train its latest model.”
That comment now reads like foreshadowing.
Google’s Strategic Advantage: The Full AI Stack
Part of Google’s comeback stems from its long-term investment in building the entire AI stack — apps, models, cloud infrastructure, and custom chips — something only a handful of companies on Earth can claim. That vertical integration gives Google cost advantages OpenAI and many rivals lack.
Google also struck a massive deal with Anthropic earlier this year, with the startup committing to use up to 1 million Google TPUs in a contract worth tens of billions of dollars.
If Meta joins that list, it could represent the start of a genuine shift in the balance of power in AI hardware — and Nvidia knows it.
Nvidia’s Other Battle: The “Big Short” Investor Who Won’t Let Go
The Google issue isn’t Nvidia’s only source of stress. The company has also been fending off a high-profile critic: “The Big Short” investor Michael Burry.
Burry has repeatedly compared the AI boom to the dot-com and telecom bubbles — and Nvidia to Cisco, a hardware supplier that soared during a build-out phase only to collapse when spending slowed.
He has accused Nvidia of:
- excessive stock-based compensation
- depreciation schedules that inflate profitability
- enabling “circular financing” via AI startups that use Nvidia hardware bought with investor money
In response, Nvidia privately circulated a seven-page memo to Wall Street analysts rebutting Burry’s claims. Burry published it himself on Substack, turning the feud public.
The memo insisted Nvidia’s accounting is sound, transparent, and comparable to peers — a sign the company is now managing skepticism on multiple fronts.
The Bigger Picture: A Monopoly Finally Facing Pressure
None of this means Nvidia is collapsing. It remains the dominant provider of AI chips, and its GPU ecosystem is far more mature than anything Google offers.
But Google’s resurgence has forced Nvidia into a posture it has not assumed in years: defending itself.
Google’s Gemini 3 is widely viewed as a breakthrough. Its TPUs are finally being taken seriously by hyperscalers. And for the first time, Nvidia’s biggest customers are openly considering diversification.
For a company that has been the undisputed backbone of the AI boom, the shift is remarkable — and, for investors and competitors alike, a sign that the AI hardware race may be entering a new phase.
Sources: The Information, Fortune, X, Barron’s