Nvidia just announced plans to invest up to $100 billion in OpenAI to build out a new generation of AI data centers, one of the largest AI computing projects in history. These are remarkable numbers that get even bigger when you consider the countless supporting projects, from new subsea cables to local corporate and public-sector computing buildouts, that will likely be essential for this global AI network.
The digital divide is evolving beyond household access to broadband. As artificial intelligence (AI) is woven into the fabric of everyday life - from smart homes and virtual assistants to creative and professional tools - a new divide is emerging: those with a fiber connection, and those without.
With this switch, Cisco says it's offering an Nvidia Cloud Partner-compliant reference architecture for neocloud and sovereign cloud deployments. Available to order before the end of the year, the Cisco N9100 series switches offer a choice of Cisco NX-OS or SONiC operating systems, supporting Ethernet for AI networks and providing greater flexibility in how neocloud and sovereign cloud customers build their AI infrastructure.
Chief financial officer Amy Hood said: "This quarter, roughly half of our spend was on short-lived assets, primarily GPUs [graphics processing units] and CPUs [central processing units], to support increasing Azure platform demand, growing first-party apps and AI solutions, accelerating R&D by our product teams, as well as continued replacement for end-of-life server and networking equipment." There is also longer-term expenditure, which includes $11bn of finance leases, primarily for large datacentre sites.
A recent nationwide survey of more than 1,400 U.S. households found that two-thirds of Americans believe AI is already driving up their power bills, and most said they can't afford more than a $20 monthly increase. They're right to be worried. As tech companies pour hundreds of billions into new data centers, the surge in electricity demand is rewriting the economics of the grid - and households are footing the bill for an "AI power tax" they never voted for.
Organizations have long adopted cloud and on-premises infrastructure to build the primary data centers - notorious for their massive energy consumption and large physical footprints - that fuel AI's large language models (LLMs). Today, those same pressures are making edge data processing an increasingly attractive resource for fueling LLMs, moving compute and AI inference closer to the raw data their customers, partners, and devices generate.
When it comes to artificial intelligence, a few names dominate the conversation: Nvidia (NASDAQ:NVDA), Taiwan Semiconductor Manufacturing, and, in recent months, even Intel (NASDAQ:INTC). These companies rightfully claim the spotlight: they drive the AI narrative because they deliver tangible results - record revenues, market share gains, and innovations that fuel everything from chatbots to autonomous systems. Investors flock to them, bidding up shares on every earnings beat or product launch. Yet beneath the hype, AI's foundation relies on more than just processing power and fabrication prowess. Data storage and high-speed memory are the unsung necessities that enable seamless data flow, preventing bottlenecks in the AI pipeline.
The two companies announced the deal on Thursday, with Anthropic pitching it as "expanded capacity" that the company will use to meet surging customer demand and allow it to conduct "more thorough testing, alignment research, and responsible deployment at scale." Google's take on the deal is that it will enable Anthropic to "train and serve the next generations of Claude models," and involves "additional Google Cloud services, which will empower its research and development teams with leading AI-optimized infrastructure for years to come."
In August, the US government announced it was converting about $9 billion in federal grants that Intel had been issued during the Biden administration into a roughly 10 percent equity stake in the company. During its third-quarter earnings on Thursday - its first financial update since Trump's surprise investment - Intel reported that it earned $13.7 billion in revenue over the past three months, a three percent increase year-over-year. It's the fourth consecutive quarter that Intel has beaten revenue guidance.
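As a rough back-of-the-envelope sketch (not from the article), the stated figures imply a valuation and a prior-year baseline; this assumes the roughly 10 percent stake was priced at the ~$9 billion grant value:

    # Python: rough sanity checks on the reported figures.
    grant_value = 9e9        # ~$9B in converted federal grants
    stake_fraction = 0.10    # roughly 10 percent equity stake (assumption: priced at grant value)
    print(f"Implied Intel valuation: ${grant_value / stake_fraction / 1e9:.0f}B")      # ~$90B

    q3_revenue = 13.7e9      # reported quarterly revenue
    yoy_growth = 0.03        # three percent year-over-year increase
    print(f"Implied prior-year quarter: ${q3_revenue / (1 + yoy_growth) / 1e9:.1f}B")  # ~$13.3B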
For storage, Dell's network-attached storage (NAS) component, PowerScale, now integrates with Nvidia GB200 and GB300 NVL72 - much of this update is about reduction, as Dell says it will use up to five times less rack space, 88% fewer network switches, and up to 72% lower power consumption compared with rival services. According to Dell, the integration will also scale to more than 16,000 GPUs.
With their enhanced processing capabilities, large "hyperscale" complexes are the preferred data centers for the computation-heavy training and use of AI models. They can cover an area of over 1 million square feet, roughly equal to 17 football fields. Water is used to maintain humidity and as a coolant for the heat-generating machines, and as American data centers have grown in size and number, so has their water consumption, from 5.6 billion gallons in 2014 to 17.4 billion in 2023.
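A quick sketch makes those comparisons concrete; the football-field figure assumes roughly 57,600 square feet per field (360 by 160 feet, including end zones), a conversion the article does not spell out:

    # Python: sanity-checking the article's area and water figures.
    FIELD_SQFT = 360 * 160                      # ~57,600 sq ft per football field (assumption)
    complex_sqft = 1_000_000
    print(complex_sqft / FIELD_SQFT)            # ~17.4 -> "roughly 17 football fields"

    gallons_2014, gallons_2023 = 5.6e9, 17.4e9  # US data center water use per the article
    ratio = gallons_2023 / gallons_2014
    cagr = ratio ** (1 / 9) - 1                 # nine years, 2014 -> 2023
    print(f"{ratio:.1f}x overall, ~{cagr:.0%} per year")  # ~3.1x, ~13% annual growth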
Datacenters are set to standardize on the larger, 21-inch rack format by 2030, according to Omdia, as hyperscalers and server makers fully embrace it, leaving enterprises to the existing 19-inch standard. The analyst biz forecasts that the larger rack format, popularized by the Open Compute Project (OCP), will make up over 70 percent of kit shipped by the end of the decade, as it is increasingly adopted by vendors such as Dell and HPE that have been riding the AI infrastructure wave.
Internal documents obtained by Business Insider reveal that AWS has flagged a "fundamental" shift in how startups are allocating their cloud budgets. Increasingly, they're delaying AWS cloud adoption and diverting spending toward AI models, inference, and AI developer tools. Instead of pouring money into traditional cloud services like compute and storage, these companies are spreading costs across newer AI technologies that are easier to switch between, according to the documents.
Aerospace engineer KR Sridhar always dreamed big: He used to work with NASA on technology to convert carbon dioxide into oxygen to support life on other planets or let humans breathe air on Mars. But as the Soviet Union fell and the space race slowed, Sridhar pivoted to providing clean energy technology for the rising global middle class. He cofounded Ion America in 2001-renamed Bloom Energy five years later-with a focus on fuel cells that deliver cleaner, on-site, off-grid power.
Nava - $30M Series C (Employee Benefits)
Nava, a tech-enabled employee benefits brokerage platform, has raised $30M in Series C funding led by Thrive Capital. Founded by Brandon Weber, Donald DeSantis, Kareem Zaki, and Lincoln Reis in 2019, Nava has now raised a total of $90M in reported equity funding.

Crosby - $20M Series A (Technology)
Crosby has raised $20M in Series A funding led by Bain Capital Ventures, Index Ventures, and Elad Gil. Crosby was founded by Ryan Daniels and John Sarihan.
CoreWeave started life in 2017 as Atlantic Crypto, mining Ethereum with racks of GPUs, but the real acceleration hit in 2024, driven by deals with AI labs and enterprises needing scalable compute without building their own data centers. Revenue jumped from $229 million in 2023 to $1.9 billion in 2024. The March IPO priced at $40 per share but debuted flat at a $23 billion valuation amid market jitters. Shares bottomed at $33 in April before rocketing 322% on AI hype, peaking at $187 per share in June.