AI is drastically reshaping data center infrastructure, particularly the power demands of server racks. As AI workloads grow, expect major shifts in both rack power consumption and cooling strategy.
The Rise of High-Density Racks
Power Consumption Projections

- AI-focused racks could draw up to 1MW each by 2030.
- Average rack power is projected to increase to 30-50kW within the same timeframe (a rough scale comparison follows this list).
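For a rough sense of scale, here is a back-of-envelope comparison sketched in Python. The ~12kW figure for a typical conventional rack today is an illustrative assumption, not a number from the projections above.

```python
# Rough scale comparison of projected rack power draw.
# The 2030 figures come from the projections above; the ~12 kW
# "typical rack today" baseline is an assumption for illustration.
TYPICAL_RACK_TODAY_KW = 12      # assumed ballpark for a conventional rack
AVERAGE_RACK_2030_KW = 40       # midpoint of the 30-50 kW projection
AI_RACK_2030_KW = 1_000         # 1 MW AI-focused rack

print(f"Average rack growth: {AVERAGE_RACK_2030_KW / TYPICAL_RACK_TODAY_KW:.0f}x")
print(f"AI rack vs. today's typical rack: {AI_RACK_2030_KW / TYPICAL_RACK_TODAY_KW:.0f}x")
print(f"One 1 MW AI rack ~ {AI_RACK_2030_KW / AVERAGE_RACK_2030_KW:.0f} projected average racks")
```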
Implications for Data Centers
The gap in power draw between AI server racks and standard racks will force data centers to rethink their infrastructure, covering not only power delivery but also, crucially, cooling.
Strategic Priorities: Power and Cooling
Cooling Challenges
Traditional cooling methods are struggling to keep up with the heat generated by high-density AI racks. Cooling is now a primary concern, not just a support function.
Evolving Cooling Techniques:
- Liquid cooling is gaining traction, with direct-to-chip solutions showing promise (a rough flow-rate sketch follows this list).
- Microfluidic cooling, tested by companies like Microsoft, removes heat significantly more effectively than traditional cold plates, with early trials showing up to three times better heat removal.
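To see why liquid handling becomes a facility-level concern, here is a back-of-envelope sketch of the coolant flow a 1MW rack would need, using the standard heat-transfer relation Q = m_dot * c_p * dT. The water coolant, 10°C temperature rise, and 100% liquid heat capture are illustrative assumptions, not figures from the source.

```python
# Back-of-envelope coolant flow needed to carry away rack heat, using
# m_dot = Q / (c_p * dT). Assumptions (not from the source): water coolant,
# a 10 degC coolant temperature rise, and all rack heat captured by liquid.
RACK_HEAT_W = 1_000_000          # 1 MW AI rack (projection from the text)
CP_WATER = 4186.0                # J/(kg*K), specific heat of water
DELTA_T = 10.0                   # K, assumed coolant temperature rise
DENSITY_WATER = 1.0              # kg/L (approximate)

mass_flow = RACK_HEAT_W / (CP_WATER * DELTA_T)        # kg/s
volume_flow_lpm = mass_flow / DENSITY_WATER * 60      # liters per minute

print(f"Mass flow:   {mass_flow:.1f} kg/s")
print(f"Volume flow: {volume_flow_lpm:.0f} L/min")    # roughly 1,400 L/min
```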
Power Delivery Innovations
Delivering 1MW of power to a single rack requires rethinking power distribution architecture.
Key Changes:
- Moving towards high-voltage DC (e.g., +/-400V) to reduce power loss and cable size (see the current-draw sketch after this list).
- Using centralized coolant distribution units (CDUs) to manage liquid flow to rack manifolds.
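A quick sketch of why the voltage increase matters: for a fixed power, current falls as 1/V, and conduction loss in a given cable falls as 1/V². The 48V and 400V comparison points below are illustrative assumptions, not figures from the source.

```python
# Why higher distribution voltage matters: for fixed power P, current
# I = P / V, and conduction loss in a given cable is I^2 * R, so loss
# scales as 1/V^2. The 48 V and 400 V points are illustrative assumptions.
RACK_POWER_W = 1_000_000   # 1 MW AI rack (projection from the text)
BASELINE_AMPS = RACK_POWER_W / 800   # the +/-400 V DC (800 V pole-to-pole) case

for volts in (48, 400, 800):
    amps = RACK_POWER_W / volts
    # Relative I^2*R loss for the same conductor, normalized to the 800 V case
    relative_loss = (amps / BASELINE_AMPS) ** 2
    print(f"{volts:>4} V -> {amps:>8.0f} A, relative cable loss {relative_loss:>6.1f}x")
```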
