"If you want to know why Nvidia is valued at $5 trillion, take a look at the data and chart below. They show how this tech giant is scooping up a huge portion of the AI spending boom. As AI enters its industrial phase, the world's most advanced data centers now measure their scale not in square footage or servers, but in gigawatts of computing capacity."
"TD Cowen analysts put this in context, writing in a research note this week that 1 gigawatt is roughly the output of a nuclear reactor. That's the new baseline for next-generation AI data centers, such as xAI's Colossus 2 in Memphis, Meta's Prometheus in Ohio and Hyperion in Louisiana, OpenAI's Stargate, and Amazon's Mount Rainier project in Indiana. These sprawling structures require huge amounts of electricity, and combine that with capital and silicon to churn out intelligence."
Next-generation AI data centers are being built at gigawatt scales, with 1 gigawatt roughly equivalent to a nuclear reactor's output. Bernstein Research estimates about $35 billion in capital cost per gigawatt. That capital covers an industrial ecosystem spanning semiconductors, networking gear, power systems, construction, and energy generation. GPUs are the largest single cost component, accounting for roughly 39% of total spending. Major gigawatt-scale projects include xAI's Colossus 2, Meta's Prometheus and Hyperion, OpenAI's Stargate, and Amazon's Mount Rainier. Nvidia captures a disproportionate share of spending tied to these buildouts.
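To make the scale concrete, here is a minimal back-of-the-envelope sketch in Python based on the figures cited above: Bernstein's ~$35 billion of capital cost per gigawatt and GPUs at roughly 39% of total spend. The split of the remaining 61% is not broken out in the source, so it is left as a single assumed bucket.

```python
# Back-of-the-envelope capex sketch for a gigawatt-scale AI data center,
# using the Bernstein Research figure cited above (~$35B per GW) and the
# ~39% GPU share. The non-GPU remainder is an assumption, not a sourced split.

CAPEX_PER_GW_USD = 35e9  # Bernstein estimate: ~$35 billion per gigawatt

SHARES = {
    "GPUs": 0.39,                       # cited figure
    "everything else (assumed bucket)": 0.61,  # networking, power, construction, etc.
}

def breakdown(gigawatts: float) -> dict[str, float]:
    """Return estimated spend in USD per category for a build of `gigawatts`."""
    total = CAPEX_PER_GW_USD * gigawatts
    return {name: total * share for name, share in SHARES.items()}

if __name__ == "__main__":
    for name, dollars in breakdown(1.0).items():
        print(f"{name}: ${dollars / 1e9:.1f}B")
    # GPUs alone come to roughly $13.7B of a ~$35B single-gigawatt build,
    # which is the spending slice Nvidia is positioned to capture.
```

Running the sketch for a one-gigawatt build puts GPU spending at about $13.7 billion, which illustrates why a single supplier dominating that category captures such a large share of the overall buildout.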
Read at Business Insider