Microsoft to turn Foxconn site into Fairwater AI data center, touted as world's most powerful
Source: [The Verge](https://www.theverge.com/news/781052/microsoft-foxconn-fairwater-worlds-most-powerful-ai-data-center)
TL;DR
- Microsoft plans to bring a Fairwater AI data center online in early 2026 at the site of Foxconn’s former LCD factory in Wisconsin; the project carries a $3.3 billion price tag.
- The facility spans 1.2 million square feet across three buildings on 315 acres and will house hundreds of thousands of Nvidia GB200 GPUs connected by fiber that, according to Microsoft, could circle the Earth 4.5 times.
- Microsoft claims the GPU cluster will be ten times more powerful than the fastest current supercomputer for AI training, and multiple other Fairwater data centers are under construction across the US.
- A closed-loop cooling system minimizes water waste: it is designed to be filled once and sealed, aligning with sustainability goals highlighted by Microsoft leadership.
- The project follows Foxconn’s LCD factory plans announced in 2017, which some observers had labeled a boondoggle by 2018; Microsoft frames the site as a new chapter in large-scale AI infrastructure.
Context and background
The new Fairwater data center sits on land originally earmarked for Foxconn’s LCD manufacturing facility in Wisconsin, a project announced with much fanfare in 2017 and widely described as a boondoggle by 2018. Microsoft positions the Wisconsin site as a cornerstone of its broader Fairwater strategy, which includes multiple other data centers under construction across the United States. The company frames these facilities as enabling substantially accelerated AI training through large-scale GPU clusters.
What’s new
Microsoft has disclosed plans to bring a Fairwater AI data center online in early 2026. The Wisconsin installation comprises 1.2 million square feet spread across three buildings on roughly 315 acres and is described as housing hundreds of thousands of Nvidia GB200 GPUs, interconnected by enough fiber to circle the Earth about 4.5 times. Microsoft claims the cluster will be ten times more powerful than the fastest existing supercomputer, enabling faster and larger-scale AI training, and notes that additional Fairwater data centers are under construction elsewhere in the United States.
Why it matters (impact for developers/enterprises)
For developers and enterprises running AI and machine learning workloads, the Fairwater program represents a potential shift in how training runs can be scaled. A cluster of hundreds of thousands of GB200 GPUs, combined with the projected training throughput, suggests model development and experimentation at a scale that rivals traditional supercomputing resources. If these capabilities hold up as described, enterprises could gain faster iteration cycles, larger batch processing, and more complex experimentation pipelines. The emphasis on closed-loop cooling also signals a possible path toward reducing water usage in large AI facilities, addressing a common sustainability concern with energy-intensive AI workloads.
Technical details or Implementation
- Location and scale: Wisconsin, on the site of Foxconn’s former LCD factory, with 1.2 million square feet of development across three buildings on 315 acres.
- Capital and timeline: a $3.3 billion construction project slated to come online in early 2026.
- Compute and networking: hundreds of thousands of Nvidia GB200 GPUs connected by enough fiber to circle the Earth 4.5 times; Microsoft asserts this configuration will be ten times more powerful than the fastest current supercomputer for AI training.
- Sustainability: a closed-loop cooling design minimizes water waste by requiring only a single fill before the system is sealed, in line with Microsoft leadership’s emphasis on sustainability amid AI’s energy demands.
- Ecosystem and capacity: multiple other Fairwater data centers are under construction across the US, indicating a plan that extends beyond the Wisconsin project.
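As a back-of-envelope check on the networking claim, the implied total fiber length can be estimated from Earth’s equatorial circumference (about 40,075 km). The 4.5-laps figure is Microsoft’s; the calculation below is only an illustrative estimate, not a disclosed specification:

```python
# Rough estimate of total fiber length implied by Microsoft's
# "circle the Earth 4.5 times" claim.
EARTH_CIRCUMFERENCE_KM = 40_075  # equatorial circumference, approximate
LAPS = 4.5                       # figure quoted by Microsoft

fiber_km = EARTH_CIRCUMFERENCE_KM * LAPS
print(f"Implied fiber length: about {fiber_km:,.0f} km")
```

That works out to roughly 180,000 km of interconnect fiber inside a single campus.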
Key takeaways
- Fairwater is positioned as a major expansion of Microsoft’s AI training infrastructure.
- The scale: 1.2 million sq ft across 315 acres, with a GPU cluster described as hundreds of thousands of Nvidia GB200 GPUs.
- Microsoft claims the cluster will be ten times more powerful than the fastest existing supercomputer for AI workloads.
- A closed-loop cooling system aims to minimize water waste through a one-time fill-and-seal approach.
- The Wisconsin site repurposes land tied to Foxconn’s controversial 2017–2018 LCD plant plans for AI infrastructure.
FAQ
- What is the Fairwater AI data center?
  A Microsoft project to build a large-scale GPU-based data center intended to accelerate AI training, with a claimed tenfold improvement over the fastest existing supercomputer. [The Verge](https://www.theverge.com/news/781052/microsoft-foxconn-fairwater-worlds-most-powerful-ai-data-center)
- Where is it located?
  On the site of Foxconn’s former LCD factory in Wisconsin, spanning three buildings on about 315 acres.
- When is it expected to be online?
  In early 2026.
- How powerful is the proposed system?
  Microsoft says the GPU cluster will be ten times more powerful than the fastest current AI supercomputer.
- What about water use and sustainability?
  The design uses a closed-loop cooling system that minimizes water waste: it is filled once, then sealed.
More news
- First look at the Google Home app powered by Gemini: The Verge reports Google is updating the Google Home app to bring Gemini features, including an Ask Home search bar, a redesigned UI, and Gemini-driven controls for the home.
- Meta’s failed Live AI smart glasses demos had nothing to do with Wi‑Fi, CTO explains: Meta’s live demos of Ray-Ban smart glasses with Live AI faced embarrassing failures. CTO Andrew Bosworth explains the causes, including self-inflicted traffic and a rare video-call bug, and notes the bug is fixed.
- NVIDIA HGX B200 reduces embodied carbon emissions intensity: NVIDIA HGX B200 lowers embodied carbon intensity by 24% vs. HGX H100, while delivering higher AI performance and energy efficiency. The article reviews the PCF-backed improvements, new hardware features, and implications for developers and enterprises.
- OpenAI reportedly developing smart speaker, glasses, voice recorder, and pin with Jony Ive: OpenAI is reportedly exploring a family of AI devices with Apple's former design chief Jony Ive, including a screen-free smart speaker, smart glasses, a voice recorder, and a wearable pin, with release targeted for late 2026 or early 2027. The Information cites sources with direct knowledge.
- Shadow Leak shows how ChatGPT agents can exfiltrate Gmail data via prompt injection: Security researchers demonstrated a prompt-injection attack called Shadow Leak that leveraged ChatGPT’s Deep Research to covertly extract data from a Gmail inbox. OpenAI patched the flaw; the case highlights risks of agentic AI.
- Predict extreme weather in minutes without a supercomputer: NVIDIA and Berkeley Lab unveil Huge Ensembles (HENS), an open-source AI tool that forecasts low-likelihood, high-impact weather events using 27,000 years of data, with ready-to-run options.