Broadcom, Google strike major AI chip deal through 2031
Broadcom has secured a long-term partnership with Google to develop next-generation custom artificial intelligence chips, signaling a deeper push into the rapidly expanding AI hardware race.
Under the agreement, which runs through 2031, Broadcom will design and supply future versions of Google's custom AI processors, along with key components powering its next-generation data center systems, News.Az reports, citing Reuters.
The move strengthens Google’s strategy to reduce reliance on third-party chipmakers and scale its own AI infrastructure as demand surges.
The deal comes amid explosive growth in demand for custom AI chips, particularly Google’s tensor processing units (TPUs), which are designed to handle machine learning workloads more efficiently than traditional graphics processors.
Google has been increasingly positioning its TPUs as a competitive alternative to chips made by Nvidia, whose GPUs currently dominate the AI market.
Custom chips are also becoming a key driver of Google’s cloud business, as the company looks to prove that its massive AI investments are translating into real revenue growth.
In a parallel development, Broadcom has also signed an agreement with Anthropic that gives the AI startup access to roughly 3.5 gigawatts of AI computing capacity powered by Google's processors starting in 2027.
Anthropic, known for its fast-growing AI model Claude, said demand has surged in 2026, with annualized revenue exceeding $30 billion—more than tripling from the previous year.
The startup uses a mix of AI hardware, including chips from Google, Nvidia, and custom processors from Amazon Web Services, which remains its primary cloud partner.
The Broadcom-Google deal highlights a major shift in the AI industry: tech giants are increasingly building their own chips to cut costs, improve performance, and reduce dependence on dominant suppliers.
For Broadcom, the agreement reinforces its position as a key player in the AI supply chain; for Google, it is another step toward controlling the full AI stack.
By Aysel Mammadzada