What does a $35 billion, 1-gigawatt AI data center include?

One gigawatt of capacity is considered the new standard for AI data centers. Such a facility comprises many components, the single largest cost being Nvidia’s GPUs.

According to analysis by US investment bank TD Cowen, one gigawatt (GW) is the new standard for next-generation AI data centers such as xAI’s Colossus 2 in Memphis, Meta’s Prometheus in Ohio and Hyperion in Louisiana, OpenAI’s Stargate in Texas, and Amazon’s Mount Rainier project in Indiana.

Meanwhile, analysis from Bernstein Research shows that building a gigawatt-scale AI data center requires about $35 billion. “Each gigawatt of capacity is not just a measure of energy; it represents an emerging industrial ecosystem spanning semiconductors, networking equipment, power systems, and construction through to energy production,” Bernstein Research commented.

Combining data sources, TD Cowen and Bernstein Research estimate that 39% of the cost, or about $13.65 billion, goes to AI chips, meaning GPUs optimized for artificial intelligence workloads. Their massively parallel architecture performs thousands of calculations simultaneously, greatly accelerating tasks such as training deep learning models, analyzing big data, and running complex AI applications.

Nvidia dominates this segment, controlling about 90% of the data center GPU market, according to analyst firm IDC. TD Cowen’s data shows that each gigawatt equates to more than a million GPUs in use. TSMC, Nvidia’s chip manufacturing partner, earns about $1.3 billion per gigawatt producing these components.

“It’s not surprising that Nvidia is valued at $5 trillion,” Business Insider commented.

Meanwhile, networking, the equipment that connects GPUs to one another, mainly high-speed switches and optical links, accounts for 13%, or about $4.55 billion. Arista Networks, Broadcom, and Marvell are the leading switch suppliers of choice. Amphenol and Luxshare provide cables and connectors, while InnoLight, Eoptolink, and Coherent, which specialize in manufacturing optical transceivers, also benefit.

Costs for land and construction of server buildings account for about 11%, or $3.85 billion. Infrastructure and other materials account for 8%, or $2.8 billion.


OpenAI’s Stargate data center project is being built in Texas (USA). Image: Reuters

However, energy is a major challenge for AI data centers. By these estimates, electricity usage alone accounts for 10% of the cost, or $3.5 billion. Systems such as gas turbines and diesel generators account for 6% ($2.1 billion), and backup power sources for 5% ($1.75 billion). In total, power-related costs for a one-gigawatt AI data center come to about 21%, or $7.35 billion.

The current electrical grid in the US is widely seen as unable to meet the needs of data centers, yet many companies have already started building large facilities. Equipment makers such as Siemens Energy, GE Vernova, and Mitsubishi Heavy Industries report that orders for power-generation turbines and grid infrastructure equipment have skyrocketed as US power companies try to ensure reliable electricity supply at scale.

In addition, a one-gigawatt AI data center allocates 4% of costs, or $1.4 billion, to thermal management. Continuous operation makes the machines emit huge amounts of heat, requiring cooling systems based mainly on air and liquid. Other costs, such as CPUs, storage, and operating personnel, account for a further 4%.
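Putting the figures above together, the breakdown can be tallied in a short sketch. The percentages and the $35 billion total are the TD Cowen and Bernstein Research estimates cited in this article; the category labels are shorthand for the items described above.

```python
TOTAL = 35e9  # Bernstein Research's estimated cost of a 1 GW AI data center

# Cost shares cited in the article, as percent of the total
shares = {
    "AI chips (GPUs)": 39,
    "Networking (switches, optics)": 13,
    "Land and construction": 11,
    "Electricity usage": 10,
    "Infrastructure and materials": 8,
    "Gas turbines / diesel generators": 6,
    "Backup power": 5,
    "Thermal management": 4,
    "CPUs, storage, personnel, other": 4,
}

# Convert each share to a dollar figure and print the table
for item, pct in shares.items():
    dollars = pct / 100 * TOTAL
    print(f"{item:34s} {pct:3d}%  ${dollars / 1e9:.2f}B")

total_pct = sum(shares.values())
print(f"{'Total':34s} {total_pct:3d}%  ${total_pct / 100 * TOTAL / 1e9:.2f}B")
```

The shares sum to 100%, and the GPU line alone (39% of $35 billion, or $13.65 billion) is larger than the next three categories combined, which is the article’s central point.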

By Editor
