
"Tensor Processing Units are custom-designed chips built for the matrix math powering AI model training and inference. Unlike Nvidia's general-purpose GPUs, TPUs are architected around a single customer's workload, delivering better performance per watt at scale."
"Broadcom's AI revenue trajectory has accelerated sharply: $4.4 billion in Q2 FY2025, $5.2 billion in Q3, roughly $6.2 billion in Q4, and $8.4 billion in Q1 FY2026, up 106% year-over-year."
"This partnership is one of the most consequential custom silicon relationships in AI infrastructure. Our AI revenue growth is accelerating, and we expect AI semiconductor revenue to be $10.7 billion in Q2."
Broadcom has entered a long-term agreement with Alphabet to design and supply custom Tensor Processing Units (TPUs) and networking components for Google's AI racks through 2031. The agreement extends to Anthropic, which will gain access to 3.5 gigawatts of TPU-based AI compute starting in 2027. Broadcom's AI revenue has surged, with $10.7 billion projected for Q2 FY2026, feeding into a target of $100 billion in AI chip revenue by 2027 that leans heavily on Google TPU volumes.
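The revenue figures above imply a baseline worth checking: if Q1 FY2026 AI revenue of $8.4 billion represents 106% year-over-year growth, the year-ago quarter can be backed out arithmetically. A minimal sketch (the $8.4B figure and 106% rate are from the article; the implied prior-year base is derived here, not reported):

```python
# Back out the implied Q1 FY2025 AI revenue from the stated Q1 FY2026
# figure and its year-over-year growth rate.
q1_fy2026 = 8.4      # AI revenue in $B, as reported
yoy_growth = 1.06    # "up 106% year-over-year" -> 2.06x the prior year

# prior = current / (1 + growth)
implied_q1_fy2025 = q1_fy2026 / (1 + yoy_growth)
print(f"Implied Q1 FY2025 AI revenue: ${implied_q1_fy2025:.2f}B")
```

This puts the year-ago quarter at roughly $4.1 billion, consistent with the $4.4 billion reported one quarter later in Q2 FY2025.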
Read at 24/7 Wall St.