Nvidia faces new AI chip challenge from Google
Nvidia (NVDA) remains the dominant force in the AI chip market, but competition is intensifying as Google (GOOG, GOOGL) considers selling its in-house AI processors to Meta (META), according to an article from Yahoo Finance, News.Az reports.
A report from The Information on Nov. 24 said the potential deal could be worth billions of dollars, sparking concerns that one of Nvidia’s top customers may soon become a rival supplier.
Following the report, Nvidia’s stock fell 2.5%, prompting the company to post a message on X congratulating Google but insisting that Nvidia’s chips remain “a generation ahead” of Google's technology.
Competition accelerated further this week when Amazon (AMZN) announced the public rollout of its Trainium3 AI chip, claiming it lowers AI training costs by up to 50% compared to other options.
Despite growing pressure from major cloud players offering their own hardware, analysts say Nvidia’s lead is unlikely to disappear soon. The company’s GPUs remain the industry standard for training advanced artificial intelligence models, and demand continues to outpace supply.
It does, however, mean that Google and Amazon could take their own share of the broader market as the global AI build-out continues to expand and companies look for alternatives amid the rush for Nvidia's chips.
One of the main things to understand about the Nvidia versus Google and Amazon debate is that they don't offer exactly the same products. Google's TPUs and Amazon's Trainium3 are types of chips called ASICs, or application-specific integrated circuits, meaning they're built to accomplish specific tasks very well.
In other words, Google and Amazon designed the chips around their own workloads, which lets the processors handle those applications especially efficiently.
“[Google knows] the requirements and they know what trade-offs are most efficient for them,” explained Forrester senior analyst Alvin Nguyen.
“They can make something that works better today for them. Now, it doesn't mean that it's superior to Nvidia in every aspect. But … at least for Google, it will be superior for their needs,” he added.
Nvidia’s chips, meanwhile, are available across multiple cloud platforms, including those from Google and Amazon, as well as Microsoft. The architecture that underpins the chips can also be transferred to different use cases, whether that’s training AI models, running models on robots, powering video games, or helping bring computing capabilities to self-driving car technologies.
Nvidia also has its own line of networking products that it sells to third parties, including Amazon, which will use the company's NVLink technology alongside its Trainium4 and Graviton CPU chips in its servers. Nvidia's popular CUDA software is another asset it offers across the industry.
Large hyperscalers are able to put up the initial cash necessary to build their own custom chips and use them over time, amortizing some of the cost. Other companies, however, can’t afford to produce their own processors and rely on chips from Nvidia and rival AMD (AMD).
It’s important to note that those same hyperscalers also buy plenty of Nvidia chips. CFO Colette Kress noted during the company’s second quarter earnings that large cloud providers accounted for some 50% of Nvidia’s total data center revenue. The company didn’t mention that percentage in its latest report.
ASICs like Google’s and Amazon’s have one other limiting factor: If the companies change their workloads, they need to rework their chips.
“If your model structures change, you may need to design a new chip,” explained Bernstein analyst Stacy Rasgon, adding that’s why the companies also purchase those Nvidia chips.
But there’s a potential reason to take that risk.
“It's about … total cost of ownership, performance per watt, performance per dollar,” Rasgon said. “And I'm willing to stipulate that for the workloads that they have designed for an ASIC, in theory, should have better TCO than a GPU or something that's more general purpose. Otherwise, why are you bothering?”
Third-party companies like Meta (META), meanwhile, could benefit from using their rivals' chips by simply getting access to more computing power at a time when the world is clamoring to get its hands on Nvidia’s products.
It's not as though Nvidia is struggling, either. Kress has told investors and analysts the company has visibility into $500 billion in Blackwell and Rubin AI chip revenue through calendar 2026. The company is currently in fiscal 2026.
And CEO Jensen Huang noted in the company’s most recent earnings announcement that “Blackwell sales are off the charts, and cloud GPUs are sold out.”
From a broader perspective, even as competition grows from Nvidia's own customers, the company's chip lead means it is not in danger of seeing revenue shrink.
According to Rasgon, the more likely scenario is that the AI chip market will continue to expand, making room for both Nvidia and other competitors.
Mizuho analyst Vijay Rakesh offered a similar sentiment, writing in a note to investors that while a TPU deal between Google and Meta is positive for Broadcom, which builds Google’s chips, Nvidia is “still the king.”
That said, the chip industry is evolving at a rapid pace, and there’s no telling where it might go in the months and years ahead.