Inside America's AI Infrastructure Push: How US Policy Is Supercharging a Global Data Center Arms Race

Abhishek Gautam · 8 min read

Quick summary

US executive orders and the AI Action Plan have unlocked fast-track data center permitting, federal land for AI campuses, and billions in infrastructure investment. Here is how American AI policy is reshaping the global data center landscape in 2026.

On January 23, 2025, President Trump signed Executive Order 14179 — Removing Barriers to American Leadership in Artificial Intelligence. Two days earlier, on January 21, he had announced the Stargate Project: a private-sector consortium (OpenAI, SoftBank, Oracle, and MGX) committing $500 billion to AI infrastructure over four years, with $100 billion in immediate commitments.

These two actions in the first 72 hours of the new administration signalled that US AI infrastructure policy had shifted from regulatory caution to aggressive acceleration. Fourteen months later, in March 2026, the downstream effects are visible globally — in data center construction timelines, power interconnection queues, land acquisition patterns, and the strategic positioning of every major cloud provider.

What the AI Action Plan Actually Changed

The Trump AI Action Plan, published in coordination with EO 14179, directed federal agencies to:

Fast-track data center permitting on federal land. The Department of Interior and Department of Energy were directed to identify suitable federal land for AI infrastructure development and streamline environmental review. Federal lands in Nevada, Arizona, Wyoming, and the Carolinas are now under active consideration for hyperscaler campuses. The environmental review process — previously 2-4 years for major installations — has been reduced to 6-12 months under emergency designation.

Prioritise power interconnection for AI facilities. The Federal Energy Regulatory Commission (FERC) was directed to prioritise grid interconnection requests from AI data centers. The US interconnection queue held a backlog of thousands of projects totalling roughly 2,600 GW of capacity requests as of 2024. AI data centers are now being elevated in the queue ahead of other commercial projects.

Remove Biden-era AI safety reporting requirements. The administration rescinded Biden's October 2023 AI Executive Order (EO 14110), which had required developers of large AI models to report training runs above a compute threshold to the federal government. The reporting requirement is gone. This removes a perceived compliance burden on frontier AI developers but also eliminates the government's visibility into which organisations are training the largest models.

Export control guidance for allied nations. The AI Action Plan directed Commerce to develop a "tiered" AI chip export framework distinguishing between close allies (Tier 1 — unrestricted), trusted partners (Tier 2 — licensed), and strategic competitors (Tier 3 — restricted). This framework replaced the Biden-era AI Diffusion Rule that had created friction with US allies in Europe and Asia.

The Stargate Effect

The Stargate Project — an equity partnership of OpenAI, SoftBank, Oracle, and MGX, with Microsoft, Nvidia, and Arm as technology partners — represents the largest single private AI infrastructure commitment in history. The $500B four-year commitment breaks down into:

  • Data center construction: 20+ new facilities across the US, with the first campus in Abilene, Texas (10 data halls, 1.2 GW power capacity, construction began January 2025)
  • Power infrastructure: Stargate is contracting directly with utilities and developing on-site generation (natural gas, nuclear SMRs in the medium term)
  • Networking: Dedicated dark fibre between Stargate campuses for low-latency model training across distributed infrastructure
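Dedicated dark fibre matters because synchronous training across campuses is gated by how fast gradients can move between sites. A rough sketch of that constraint, using assumed, illustrative numbers for gradient size, link bandwidth, and round-trip latency (none of these figures are from Stargate disclosures):

```python
def sync_time_s(grad_gb: float, link_gbps: float, rtt_ms: float) -> float:
    """Rough lower bound for one cross-campus gradient exchange.

    grad_gb   : gradient payload in gigabytes (illustrative)
    link_gbps : dedicated fibre bandwidth in gigabits per second
    rtt_ms    : round-trip latency between campuses
    """
    transfer_s = (grad_gb * 8) / link_gbps  # GB -> gigabits, divided by link rate
    return transfer_s + rtt_ms / 1000       # add one round trip of latency

# Assumed example: 140 GB of fp16 gradients (~70B params), 400 Gbps link, 10 ms RTT
print(round(sync_time_s(140, 400, 10), 2))  # ~2.81 s per full exchange
```

At seconds per exchange, bandwidth — not latency — dominates, which is why operators lay dedicated fibre rather than relying on shared transit.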

The Abilene campus has been coming online in phases since Q3 2025. The Texas Public Utility Commission granted expedited interconnection. Oracle is the infrastructure provider; OpenAI is the anchor compute tenant; SoftBank provided initial capital.

The Stargate project alone will add approximately 5 GW of data center capacity to the US by 2028 — a substantial addition relative to existing US hyperscaler capacity.

The Global Arms Race

America's acceleration has triggered reciprocal commitments globally:

United Arab Emirates: Stargate UAE, a sister project announced during Trump's Middle East visit in May 2025, commits to 5 GW of AI data center capacity in the UAE by 2030. DAMAC Properties and G42 are the local partners. The UAE has positioned itself as the "AI hub of the Middle East," offering free zones with light-touch AI regulation, competitive power pricing (nuclear and solar at scale), and proximity to Gulf sovereign wealth for funding.

Saudi Arabia: Following the scaling back of The Line, Saudi Arabia's PIF redirected significant capital toward AI data centers. The DataVolt partnership for a $5B AI campus in NEOM's Oxagon zone is the flagship. Saudi Aramco's digital arm (Aramco Digital) is also building AI-specific infrastructure for energy sector applications.

Europe: The EU AI Factories programme (distinct from AI Act compliance) is funding national AI computing clusters in France, Germany, Finland, Poland, and Portugal. These facilities provide subsidised compute access to European researchers and startups — explicitly designed to prevent a situation where European AI development is entirely dependent on US hyperscaler infrastructure.

India: The IndiaAI Mission's 38,000+ GPU cluster, operational from Q1 2026, is India's sovereign AI compute response. The government has committed to 100,000 GPUs accessible to domestic researchers and startups by 2027.

Japan: The government is funding Rapidus (leading-edge chip production) and AI infrastructure simultaneously — recognising that both compute and manufacturing capability are needed for AI sovereignty.

The Power Constraint

The arms race has a hard constraint: power. AI data centers are extraordinarily energy-intensive. A single 1 GW AI data center — the size of the smaller Stargate campuses — consumes as much electricity as approximately 750,000 US homes.
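The homes comparison is simple arithmetic. A sketch with assumed figures — a continuous 1 GW draw and a rough ~10,500 kWh annual US household average (which varies by source; the article's ~750,000 figure implies slightly higher per-home usage or below-100% facility utilisation):

```python
HOURS_PER_YEAR = 8760
dc_draw_kw = 1.0e6            # 1 GW expressed in kW
home_kwh_per_year = 10_500    # assumed average annual US household consumption

annual_dc_kwh = dc_draw_kw * HOURS_PER_YEAR          # ~8.76 billion kWh/year
homes_equivalent = annual_dc_kwh / home_kwh_per_year
print(f"{homes_equivalent:,.0f} homes")              # ~834,000 at 100% utilisation
```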

US data center electricity demand is projected to reach 9% of total US electricity consumption by 2030, up from 4% in 2024 (US Department of Energy estimates). Grid operators in Texas (ERCOT), Virginia (PJM), and Georgia (Southern Company) have all disclosed capacity warnings related to accelerated data center load growth.
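Those percentage shares imply a steep absolute ramp. A sketch of the implied growth, assuming (illustratively) ~4,100 TWh of total US electricity consumption in 2024 and modest overall grid growth to ~4,400 TWh by 2030 — both totals are assumptions, not DOE figures:

```python
total_2024_twh = 4100   # assumed total US electricity consumption, 2024
total_2030_twh = 4400   # assumed modest overall grid growth by 2030

dc_2024 = 0.04 * total_2024_twh   # 4% share -> ~164 TWh
dc_2030 = 0.09 * total_2030_twh   # 9% share -> ~396 TWh

cagr = (dc_2030 / dc_2024) ** (1 / 6) - 1  # compound annual growth, 2024-2030
print(f"{dc_2024:.0f} TWh -> {dc_2030:.0f} TWh, ~{cagr:.0%}/yr")
```

Roughly 16% compound annual growth in data center load for six straight years — which is why grid operators are issuing capacity warnings.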

This is creating a new constraint on where AI infrastructure can be built: not just where land is available and cheap, but where power capacity exists or can be built. Texas, with deregulated electricity and abundant wind, remains the primary destination. The Pacific Northwest (hydroelectric) is a secondary hub. The Southeast (utility-scale solar plus nuclear) is growing.

Nuclear power is returning as a serious option. Microsoft signed a 20-year power purchase agreement with Constellation Energy to restart Three Mile Island Unit 1 as the Crane Clean Energy Center (agreement announced September 2024, restart targeted for 2027-2028), and Google has contracted with Kairos Power for SMR capacity. Amazon bought a data center campus adjacent to the Susquehanna nuclear plant. Small modular reactors (SMRs) — from NuScale, Kairos Power, and TerraPower — are targeting 2030-2035 deployments for AI campuses.

Developer Implications

The infrastructure arms race creates direct opportunities and considerations for developers:

Cloud capacity will expand significantly. Stargate and competitive investments from Azure, Google, and AWS will substantially increase GPU availability for cloud customers. The acute GPU shortage of 2023-2024 is easing. Lead times for H100/H200 cloud reservations have dropped from 6-12 months to 4-8 weeks for most configurations.

Regional latency will improve. New data center clusters in underserved regions (Southeast US, Midwest, smaller EU countries) will reduce inference latency for users in those geographies. Application developers should revisit region selection decisions made under the constrained 2022-2024 environment.
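When revisiting region selection, a useful sanity check is the physical floor on round-trip time: light in optical fibre covers roughly 200 km per millisecond, so distance alone sets a minimum RTT no provider can beat. A sketch (real fibre routes are longer than straight-line distance, so observed RTTs run meaningfully higher):

```python
FIBRE_KM_PER_MS = 200  # approximate speed of light in optical fibre

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical best-case round-trip time over a straight fibre path."""
    return 2 * distance_km / FIBRE_KM_PER_MS

# Illustrative: serving users from a nearby new region vs. a distant legacy one
print(min_rtt_ms(500))    # nearby region: 5.0 ms floor
print(min_rtt_ms(4000))   # cross-country region: 40.0 ms floor
```

If a new cluster cuts user-to-region distance from thousands of kilometres to hundreds, the latency floor drops by an order of magnitude — often worth a migration for interactive inference workloads.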

Regulatory divergence is accelerating. The US has explicitly removed AI safety reporting requirements that the EU is simultaneously strengthening. If you are building AI applications that serve both US and EU users, you will face increasingly divergent compliance requirements. The EU AI Act's high-risk provisions and general-purpose AI (GPAI) obligations apply regardless of where a model is trained or hosted.

The AI infrastructure arms race is the largest capital deployment in technology history. Its outcome — who builds the most compute, where, and under whose regulatory framework — will determine where AI development happens for the next decade.


Written by

Abhishek Gautam

Full Stack Developer & Software Engineer based in Delhi, India. Building web applications and SaaS products with React, Next.js, Node.js, and TypeScript. 8+ projects deployed across 7+ countries.