AI, infrastructure, commodities and the investment case ahead

Artificial intelligence (AI) is advancing at a rate that is reshaping global economic assumptions. Technical progress across model design, compute efficiency and hardware performance has dramatically lowered the cost of running AI systems. Over the past 18 months, the cost to developers of generating AI outputs (measured per million tokens) has fallen by approximately 99 per cent. This reduction is attributed to advances in model architecture and graphics processing unit (GPU) performance, marks a significant shift in the economics of AI deployment and is likely to support further widespread integration across sectors (Brookfield, Building the Backbone of AI, p. 7). Explained simply, the cost of generating tokens (AI output) has fallen sharply thanks to more efficient models and faster chips, making AI significantly cheaper to use and accelerating its adoption across industries.
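To put the scale of that decline in context, the short calculation below uses a purely hypothetical starting price (it is not drawn from the cited reports) to show what a roughly 99 per cent fall in per-million-token cost means in dollar terms.

```python
# Illustrative arithmetic only: the starting price is hypothetical and not taken
# from the cited reports; it simply shows the effect of a ~99 per cent decline
# in the cost per million tokens.
old_cost_per_million_tokens = 10.00            # hypothetical US$ cost 18 months ago
reduction = 0.99                               # ~99 per cent decline cited above
new_cost_per_million_tokens = old_cost_per_million_tokens * (1 - reduction)
print(f"US${new_cost_per_million_tokens:.2f} per million tokens")  # US$0.10, ~100x cheaper
```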

This cost compression has underpinned a rapid rise in enterprise and government adoption, with AI now deeply embedded in operational workflows and decision-making systems. The technology has transitioned from an emerging capability to a foundational one, becoming what PGIM terms a “baseline capability” for economically competitive nations (PGIM, p. 3). Importantly, the speed of uptake is materially outpacing prior technology cycles, with generative AI tools achieving mass adoption in months rather than years.

This momentum places substantial demands on the physical infrastructure that enables AI. These demands span a wide range of assets, from data centres and transmission grids to semiconductor foundries, energy generation and the materials that underpin them. It is projected that global infrastructure investment will need to exceed US$7 trillion ($10.8 trillion) over the next decade to support capacity expansion across data centres, electricity networks and supporting systems (Brookfield, p. 3). Consultancy McKinsey & Co. estimates that US$19 trillion ($29 trillion) of digital infrastructure investment will be needed through to 2040. The convergence of these two forces, AI-driven infrastructure acceleration and a structural global shortfall, frames one of the most consequential investment environments of the coming decades.

From Atchison’s perspective, this is not a cyclical opportunity, but the early stages of a long-term capital formation cycle. However, a strong theme alone does not constitute a strong investment. For infrastructure and related commodities to merit inclusion in portfolios, the underlying assets must exhibit durable economics: stable earnings, contractual revenue visibility and effective capital discipline. The investment case strengthens when the theme is anchored in measurable fundamentals.

These fundamentals are increasingly observable. Electricity demand from global data centres is anticipated to grow more than tenfold, from around 7 gigawatts (GW) to over 82 GW by 2034, driven by larger AI models, increased inference volumes and widespread enterprise integration. (“Inference” is the stage at which a trained model stops learning and starts working, applying what it has learned to produce real-world outputs.)
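For readers who want a concrete picture of the training-versus-inference distinction, the minimal sketch below uses a deliberately simple scikit-learn model; it illustrates the concept only and is not representative of the AI-scale systems driving the demand figures above.

```python
# A minimal sketch of the training-versus-inference distinction, using a
# deliberately simple scikit-learn model as a conceptual illustration.
from sklearn.linear_model import LinearRegression

X_train, y_train = [[1], [2], [3]], [2, 4, 6]  # toy training data
model = LinearRegression()
model.fit(X_train, y_train)     # training: the model learns a relationship from data
print(model.predict([[10]]))    # inference: the trained model is applied to new input
```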

In the same period, the global installed base of high-performance AI chips is forecast to rise from roughly 7 million units to around 45 million. In response, governments across Europe, North America and Asia are mobilising capital to secure national compute capacity, strengthen domestic chip supply and expand AI-capable energy systems. PGIM (2025) flags this “infrastructure race” as a matter of national competitiveness, warning that insufficient AI infrastructure investment could lead to structural divergence between countries.

That said, increased capital expenditure is not in itself an investment rationale. What matters to investors is the cashflow profile underpinning the assets. Many infrastructure projects linked to AI benefit from long-term, contracted revenue streams, such as multi-year data-centre leases, power purchase agreements and compute service contracts with investment-grade counterparties. These arrangements provide reliable, forecastable income that can be valued using traditional infrastructure frameworks. Moreover, structural supply bottlenecks reinforce pricing power.
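As an illustration of how such contracted income streams might be valued under a conventional discounted-cashflow framework, the sketch below uses hypothetical inputs; the cashflow, contract term and discount rate are assumptions, not figures drawn from the cited reports.

```python
# A minimal sketch of a conventional discounted-cashflow valuation of a
# long-term contracted revenue stream; the cashflow, term and discount rate
# are hypothetical assumptions, not figures from the cited reports.
annual_contracted_cashflow = 50.0   # hypothetical US$m per year (e.g. a data-centre lease)
term_years = 15                     # hypothetical contract length
discount_rate = 0.08                # hypothetical required return

present_value = sum(
    annual_contracted_cashflow / (1 + discount_rate) ** year
    for year in range(1, term_years + 1)
)
print(f"Present value: US${present_value:.0f}m")  # value of the contracted income stream
```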

Commodities also play a critical role in this ecosystem. The construction and operation of AI infrastructure rely heavily on copper, aluminium, nickel and rare earth elements, all core inputs for data centres, grid infrastructure and chip manufacturing. Allianz (2024) anticipates sustained demand pressure on these materials, driven not only by AI, but also by broader electrification, renewables deployment and modernisation of transport systems (Allianz, p. 7). The investment opportunity here is not speculative; it is grounded in the free-cashflow resilience of high-quality producers operating in structurally constrained supply markets.

Efficiency gains in AI do not reduce these demands; they reinforce them. Although inference costs have fallen significantly, overall compute workloads are expanding rapidly as AI becomes deeply integrated into core enterprise functions. In Atchison’s view, the case for allocating capital to AI-linked infrastructure and commodities does not rest on the transformative potential of AI alone. Rather, it stems from the characteristics these assets provide to portfolios: inflation resilience, real-economy linkage, long-term visibility and, most importantly, earnings that remain sustainable after accounting for capital expenditure. AI may be the catalyst, but it is the quality and durability of the resulting cashflows that will support the investment.

Sources:

  • Global Risk Perspectives 2025: The Productivity Challenge in a Multipolar World. PGIM Megatrends Research, October 2025.
  • Building the Backbone of AI: Why AI Infrastructure is the Opportunity of a Generation. Brookfield Insights, May 2024.
  • The Infrastructure Conundrum: Addressing the Global Investment Gap. Allianz Global Economic Outlook, March 2024.