McKinsey frames the trend as straightforward: “to keep pace with the AI boom, investors could pour $7 trillion into building data centers over the next five years.” The premise is that AI workloads, especially large language models, foundation models, and other compute-intensive systems, will account for roughly three-quarters of incremental compute demand by 2030. The study suggests that existing capacity cannot absorb this surge, forcing new builds.
The scale is staggering: to avoid bottlenecks in compute supply, stakeholders from chipmakers to real estate developers must act preemptively. Yet McKinsey warns that blind investment is not enough. Developers must guard against overconcentration in suburban zones, misjudged energy or water constraints in rural areas, and hardware lock-in to soon-obsolete architectures. McKinsey argues that overbuilding is less of a risk than underestimating demand. Even if future architectures become more efficient, unused capacity could be absorbed by emergent applications nobody has conceived yet, much as surplus broadband capacity enabled unforeseen digital industries.
The $7 trillion estimate spans all roles in the data-center ecosystem: real estate, hardware, cooling systems, energy, networking, infrastructure providers, and AI model training facilities. The capital, McKinsey suggests, will not flow evenly. Developers will favor sites with robust grid access, stable power, cool climates that allow for passive cooling, and strong fiber connectivity. In parallel, technological innovation will shape how this infrastructure evolves. Liquid cooling, advanced heat recapture, and energy storage systems will be central to efficiency design, while compute hardware, ranging from GPUs to high-bandwidth memory and custom AI accelerators, will carry both premium costs and short life cycles.
McKinsey also notes the rising importance of modular deployment. Instead of monolithic facilities, many investors will pursue phased builds that can be expanded or reconfigured as demand shifts. Because AI is still evolving, rigid architectural bets may backfire, so investors are urged to adopt a “through-cycle” mindset, holding course despite volatility and planning capacity that can adapt to emerging workloads. In short, the firm argues, ambition and prudence must go hand in hand.
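To make the phased approach concrete, here is a minimal Python sketch, not drawn from the report, in which capacity is added in fixed-size modules only when an assumed demand forecast, plus a safety headroom, threatens to outgrow what is built or already under construction. The module size, construction lead time, headroom, and demand path are all hypothetical placeholders.

```python
# Illustrative sketch (not from the report): phased, modular capacity
# expansion against an assumed demand path. Module size, construction
# lead time, headroom, and the demand figures are hypothetical.

def simulate(demand_by_year, initial_mw=50, module_mw=50,
             lead_time_years=1, headroom=1.2):
    """Order a new module only when forecast demand threatens to outgrow
    built capacity plus everything already under construction."""
    capacity = float(initial_mw)
    pipeline = []  # (year_online, mw) for modules under construction
    plan = []
    for year, demand in enumerate(demand_by_year):
        # Bring finished modules online.
        capacity += sum(mw for online, mw in pipeline if online == year)
        pipeline = [(online, mw) for online, mw in pipeline if online > year]
        # Look ahead one construction lead time before committing capital.
        horizon = min(year + lead_time_years, len(demand_by_year) - 1)
        committed = capacity + sum(mw for _, mw in pipeline)
        if demand_by_year[horizon] * headroom > committed:
            pipeline.append((year + lead_time_years, module_mw))
        plan.append((year, capacity, demand / capacity))
    return plan

if __name__ == "__main__":
    # Hypothetical demand path in MW of IT load over ten years.
    demand_path = [20, 35, 55, 80, 110, 140, 165, 185, 200, 210]
    for year, capacity, utilization in simulate(demand_path):
        print(f"year {year}: {capacity:.0f} MW built, {utilization:.0%} utilized")
```

The point of the exercise is the shape of the logic rather than the numbers: ordering against a forecast horizon equal to the construction lead time keeps utilization high without letting demand overrun supply, which is the essence of the “through-cycle” posture McKinsey describes.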
This vast build-out is not without risk. One major concern is that demand for raw compute could taper or shift toward smaller, more specialized models that require fewer resources. If so, some data centers built today may face underutilization or become obsolete before full depreciation. Another risk stems from hardware innovation. As training efficiency improves, newer processors may dramatically reduce energy and compute needs, making today’s designs inefficient within just a few years.
Energy and environmental concerns loom equally large. The energy burden of AI is enormous, and many potential data-center locations lack sufficient grid capacity or renewable sources. Water, a critical input for cooling systems, poses another constraint, especially in arid or drought-prone regions. Without careful site selection and sustainable designs, projects could draw criticism from regulators, local communities, or ESG-minded investors.
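Two widely used metrics quantify these burdens: PUE (power usage effectiveness), the ratio of total facility energy to IT equipment energy, and WUE (water usage effectiveness), annual site water use per unit of IT energy. A minimal sketch follows; the facility figures in it are hypothetical and chosen only to illustrate the calculation.

```python
# Minimal sketch of two standard efficiency metrics used in siting and
# sustainability discussions: PUE and WUE. All figures are hypothetical.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (1.0 is the ideal)."""
    return total_facility_kwh / it_equipment_kwh

def wue(annual_water_liters: float, it_equipment_kwh: float) -> float:
    """WUE = annual site water use (liters) / IT equipment energy (kWh)."""
    return annual_water_liters / it_equipment_kwh

if __name__ == "__main__":
    it_kwh = 80_000_000          # assumed annual IT load: 80 GWh
    facility_kwh = 104_000_000   # assumed total draw incl. cooling and losses
    water_liters = 144_000_000   # assumed annual cooling water consumption

    print(f"PUE: {pue(facility_kwh, it_kwh):.2f}")        # 1.30
    print(f"WUE: {wue(water_liters, it_kwh):.2f} L/kWh")  # 1.80
```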
Geographic and regulatory risks also present obstacles. Facilities in rural or developing regions often face challenges such as limited fiber connectivity, local permitting delays, infrastructure gaps, or land-use restrictions. Geopolitical tensions or sudden changes in taxation and environmental policy could also affect project viability.
Meanwhile, cybersecurity and cryptographic threats add another layer of vulnerability. As these centers store and process valuable AI models and vast quantities of sensitive data, they become high-value targets for cyberattacks. Emerging technologies like quantum computing could undermine traditional encryption systems, while AI-enhanced attacks may compromise networks at unprecedented speed. Industry experts urge operators to invest in post-quantum security standards, continuous monitoring, and robust resilience frameworks to protect data integrity.
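One concrete early step toward post-quantum readiness is a cryptographic inventory: cataloging which deployed keys and certificates rely on algorithms a quantum-capable adversary could eventually break. The sketch below is an assumption-laden illustration, not a prescription from the report; it uses the Python cryptography package, a hypothetical certs/ directory, and illustrative key-size thresholds to flag certificates for migration planning.

```python
# Hedged sketch of a cryptographic inventory as a first step toward
# post-quantum migration. Paths and thresholds are illustrative.
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def audit_certificate(pem_path: Path) -> str:
    cert = x509.load_pem_x509_certificate(pem_path.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        # RSA is ultimately vulnerable to Shor's algorithm; small moduli
        # are already weak against classical attacks today.
        note = "weak today" if key.key_size < 3072 else "quantum-vulnerable"
        return f"{pem_path.name}: RSA-{key.key_size} ({note})"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"{pem_path.name}: ECDSA {key.curve.name} (quantum-vulnerable)"
    return f"{pem_path.name}: {type(key).__name__} (review manually)"

if __name__ == "__main__":
    # Hypothetical directory of deployed PEM certificates.
    for pem in sorted(Path("certs").glob("*.pem")):
        print(audit_certificate(pem))
```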
McKinsey advises stakeholders to begin with a clear understanding of their role within the value chain, whether they are co-location providers, equipment vendors, hyperscalers, network operators, or real-estate investors, and to align capital plans accordingly. Instead of committing to massive one-time builds, they should prioritize modular, phased deployment that grows with actual demand. Sustainability should be embedded from the outset, emphasizing renewable energy integration, water recycling, and adaptive cooling systems.
Before committing to land or power purchases, investors should perform rigorous due diligence on grid capacity, fiber access, environmental regulations, and regional risk factors. Finally, flexibility is crucial: firms should reserve budget and technical scope to pivot toward new computing architectures or workload types as AI evolves.
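One simple way to structure that due diligence is a weighted scorecard across the factors named above. The sketch below is illustrative only and not McKinsey's methodology; the weights, factor names, and candidate sites are assumptions chosen to show the mechanics.

```python
# Illustrative due-diligence scorecard: each candidate site is scored 0-10
# on several siting factors, then combined with assumed weights.

WEIGHTS = {
    "grid_capacity": 0.30,
    "fiber_access": 0.25,
    "permitting_speed": 0.15,
    "climate_for_cooling": 0.15,
    "regulatory_stability": 0.15,
}

def site_score(scores: dict[str, float]) -> float:
    """Weighted average of 0-10 factor scores; higher is better."""
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

if __name__ == "__main__":
    candidates = {
        "Site A (rural, strong grid)": {
            "grid_capacity": 9, "fiber_access": 4, "permitting_speed": 7,
            "climate_for_cooling": 8, "regulatory_stability": 6,
        },
        "Site B (suburban, dense fiber)": {
            "grid_capacity": 5, "fiber_access": 9, "permitting_speed": 4,
            "climate_for_cooling": 5, "regulatory_stability": 8,
        },
    }
    ranked = sorted(candidates.items(),
                    key=lambda kv: site_score(kv[1]), reverse=True)
    for name, scores in ranked:
        print(f"{name}: {site_score(scores):.2f} / 10")
```

A scorecard like this also makes the flexibility point explicit: if a new workload type shifts the weights, the same inventory of candidate sites can be re-ranked without redoing the underlying diligence.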
A particularly underexplored dimension of this investment wave is its potential impact on digital equity. Regions with weak infrastructure or limited grid capacity may struggle to attract data-center investment, widening the digital divide. Yet if capital is deployed strategically into underserved geographies, especially those with improving grids and connectivity, AI capacity could expand global access to digital tools and innovation. McKinsey implies that the next phase of data-center investment could become either a catalyst for inclusion or a driver of concentration, depending on how incentives and governance frameworks are structured.
The report also signals implications for national policy. Governments that streamline permitting, stabilize energy markets, and support renewable transitions will attract capital more effectively. By contrast, those with fragmented regulation or slow approval cycles may miss the current window of opportunity. At the same time, the sustainability dimension offers a chance for alignment between climate goals and economic diversification.
The McKinsey Re:think newsletter’s $7 trillion forecast underscores a pivotal shift in global infrastructure: artificial intelligence is not just a software revolution but a physical one. The data economy will increasingly depend on how well societies build and manage the physical foundations of intelligence itself: compute, power, and connectivity. The opportunity is vast, but so are the pitfalls. For investors, developers, and policymakers alike, the coming decade demands foresight in location, design, scalability, and resilience.