Anthropic signs biggest compute deal yet with Google and Broadcom as run rate hits $30bn



In short: Anthropic has agreed to access approximately 3.5 gigawatts of next-generation Google TPU compute capacity via Broadcom from 2027, its largest infrastructure commitment to date — while simultaneously disclosing that its revenue run rate has surpassed $30bn, more than tripling from roughly $9bn at the end of 2025.

Anthropic has announced it is securing multiple gigawatts of next-generation compute capacity through a new agreement with Google and Broadcom, while disclosing revenue growth figures that underscore why the AI lab now requires infrastructure at a scale that would have seemed implausible two years ago. The deal, announced on 6 April 2026, gives Anthropic access to approximately 3.5 gigawatts of Google tensor processing unit (TPU) capacity via Broadcom starting in 2027, building on the 1 gigawatt already being supplied to the company in 2026.

Krishna Rao, Anthropic’s chief financial officer, described it as “our most significant compute commitment to date,” adding that the agreement represents a continuation of the company’s “disciplined approach to scaling infrastructure.” The majority of the new capacity will be located in the United States, extending Anthropic’s November 2025 commitment to invest $50bn in American AI computing infrastructure.

Three parties, one infrastructure layer

The announcement is as much about Broadcom as it is about Anthropic or Google. Under the new arrangement, Broadcom acts as the intermediary layer between Google’s custom silicon and Anthropic’s training and inference workloads. In parallel, Broadcom has signed a separate long-term agreement with Google to design and supply future generations of custom TPU chips, and a supply assurance agreement to provide networking and other components for Google’s next-generation AI data racks through 2031.

This makes Broadcom an increasingly indispensable node in the AI infrastructure graph. The chipmaker, led by CEO Hock Tan, is not building AI models; it is building the silicon and the interconnects on which AI models are built. Broadcom shares rose approximately 3% in extended trading on the announcement, a reaction that reflects investor appetite for companies positioned at the physical layer of the AI stack rather than the application layer on top of it. Analysts at Mizuho, led by Vijay Rakesh, estimated that Broadcom would record $21bn in AI revenue from Anthropic in 2026 alone, rising to $42bn in 2027, figures that, even as projections, illustrate the financial weight of what is being committed.

Broadcom had first signalled the scale of its Anthropic relationship in September 2025, when Hock Tan disclosed during an earnings call that a mystery customer had placed a $10bn order for custom TPU racks. In December 2025, he confirmed the customer was Anthropic, and that an additional $11bn order had since followed. The April 2026 announcement is the third act of the same story: a partnership that has now graduated from a reported $21bn commitment to multi-gigawatt infrastructure with a defined delivery timeline.

Revenue and customers: the numbers driving the infrastructure

The compute deal is intelligible only against the backdrop of Anthropic’s commercial growth. The company says its run-rate revenue has now exceeded $30bn, up from approximately $9bn at the end of 2025. That trajectory, more than a threefold increase in roughly three months, is the result of a compounding enterprise sales motion that accelerated sharply after Anthropic closed its Series G funding round on 12 February 2026. That round raised $30bn at a post-money valuation of $380bn, led by GIC and Coatue, and co-led by D.E. Shaw Ventures, Dragoneer, Founders Fund, ICONIQ, and MGX.
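The headline figures imply a striking compounding rate. As a back-of-the-envelope sketch (using the article's approximate numbers of ~$9bn at the end of 2025, >$30bn by April 2026, and treating the gap as three months):

```python
# Rough check on the growth figures reported in the article.
# Inputs are approximations taken from the text, not exact company data.

start_run_rate = 9.0   # $bn run rate, end of 2025 (approximate)
end_run_rate = 30.0    # $bn run rate, April 2026 (reported as "exceeded $30bn")
months = 3             # approximate elapsed time

multiple = end_run_rate / start_run_rate        # overall growth multiple
monthly_rate = multiple ** (1 / months) - 1     # implied compound monthly growth

print(f"growth multiple: {multiple:.2f}x")        # ~3.33x
print(f"implied monthly growth: {monthly_rate:.0%}")  # ~49% per month
```

A sustained compound rate near 50% per month is clearly not extrapolable, but it illustrates why the company's inference and training capacity needs are being booked years ahead.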

When the Series G closed, Anthropic reported that more than 500 business customers were each spending over $1m on an annualised basis. As of the April announcement, that number has exceeded 1,000, doubling in less than two months. The pace of enterprise adoption is the proximate cause of the compute expansion: more revenue requires more inference capacity, more inference capacity requires more training compute, and more training compute requires more gigawatts.

Claude’s multi-cloud architecture

What distinguishes Anthropic’s infrastructure approach from many of its peers is an explicit multi-vendor chip strategy. Claude is trained and served across three hardware platforms: Amazon’s Trainium chips, Google’s TPUs, and Nvidia GPUs. Anthropic says Claude is the only frontier model available on all three major cloud platforms (AWS, Google Cloud, and Microsoft Azure), a claim that carries commercial as well as technical significance.

The multi-vendor stance gives Anthropic both resilience and negotiating leverage. If capacity is constrained on any single platform, workloads can shift. If one chipmaker faces supply disruption, export controls, or pricing pressure, Anthropic is not exposed to the full force of that shock. The strategy has precedent: Microsoft’s own AI models reflect a similar instinct to hedge against single-vendor dependence, though in Microsoft’s case the hedge is against a partner rather than a hardware supplier.

The AWS relationship remains foundational. In late 2024, Anthropic named Amazon its primary cloud and training partner, with total Amazon investment reaching $8bn. Project Rainier, an Anthropic supercomputer cluster running roughly 500,000 Amazon Trainium 2 chips in Indiana, was expected to scale beyond one million Trainium 2 chips by the end of 2025. The Google relationship, which now extends through the new Broadcom deal to multi-gigawatt scale in 2027, sits alongside this rather than replacing it.

The US infrastructure commitment

The April deal is framed explicitly as an extension of Anthropic’s November 2025 domestic infrastructure pledge: a $50bn commitment to American AI computing infrastructure, developed initially in partnership with Fluidstack, the UK-based neocloud operator, with data centre sites in Texas and New York coming online through 2026. The new Broadcom capacity, the majority of which will be US-based, expands that footprint into 2027 and beyond.

This domestic emphasis is not incidental. The Trump administration’s AI Action Plan has explicitly targeted US-based compute capacity as a strategic priority, and Anthropic, like its peers, has positioned its infrastructure investments accordingly. Whether that alignment reflects sincere strategic conviction or tactical regulatory positioning — or both — the practical effect is the same: a substantial share of the world’s next-generation AI training capacity is being locked into American geography.

What the deal says about the compute arms race

The Anthropic-Google-Broadcom announcement is a data point in a pattern that has been building for 18 months. SoftBank’s $40bn bridge loan to fund its OpenAI commitment reflected the same underlying dynamic: AI labs have grown so fast that their compute requirements now exceed what can be financed from revenue alone, requiring financial engineering at a scale once reserved for infrastructure utilities. Meta’s $27bn infrastructure deal with Nebius reflects a parallel logic at the hyperscaler level.

The compute arms race is also reshaping how AI companies manage their relationships with the services built on top of their models. Anthropic has been attentive to this: the company recently moved to restrict access to Claude via certain third-party frameworks, a decision that illustrated how the cost dynamics of frontier model inference are forcing AI labs to make difficult choices about which use cases they subsidise and which they price explicitly.

For Broadcom, the trajectory is simpler: a chipmaker that was not widely discussed in the context of AI two years ago is now a load-bearing element of the infrastructure on which two of the world’s most consequential AI models, Google’s Gemini and Anthropic’s Claude, are built and served. That position, cemented through 2031 for Google’s custom silicon and through the new multi-gigawatt agreement for Anthropic’s TPU access, is the real story beneath the headline numbers. Nvidia remains the dominant force in AI accelerators, and its enterprise AI platform continues to expand its reach. But Broadcom’s rise as the custom silicon partner of choice for hyperscale AI compute is one of the defining semiconductor industry shifts of this decade.


