What Is the AI Bubble?
“People talk about an AI bubble, it’s really more of an AI window. There’s a window of opportunity that will close at some point, and these companies are really now jockeying for position.”
These are the words of Randy Thelen, CEO of The Right Place, the local nonprofit promoting data center projects in Solon, Lowell, and Gaines Townships.
The window is closing, not for regular Americans, but for the investors with billions of dollars at stake.
Big Tech’s AI Losses
Microsoft’s latest earnings report indicates that OpenAI posted approximately $12 billion in losses in Q3. Separately, Microsoft reported a $4.1 billion impact related to its investment in OpenAI. Estimates also suggest that OpenAI may be losing up to $15 million per day on Sora 2.
OpenAI has reportedly committed to roughly $600 billion in infrastructure agreements as of February 2026, down from earlier discussions of $1.4 trillion, with questions remaining about how these commitments will ultimately be financed.
The Unit Economics of AI
AI software is unusual in that many companies currently have negative unit economics on a per-user basis, meaning the cost to serve an average user exceeds the revenue generated from that user.
Unlike traditional software, where adding more users pushes marginal costs lower, AI systems incur ongoing compute expenses every time they are used. When per-user unit economics are negative, scaling usage increases total losses rather than reducing them.
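The arithmetic here can be sketched in a few lines. The per-user figures below are made-up illustrative assumptions, not reported values for any company; the point is only that a negative per-user margin makes losses grow with scale:

```python
# Illustrative unit economics. These numbers are hypothetical,
# chosen only to show how negative per-user margins scale.
revenue_per_user = 8.00   # monthly revenue from an average user (assumed)
cost_per_user = 11.50     # monthly compute cost to serve that user (assumed)

# Negative margin: each additional user loses money.
margin_per_user = revenue_per_user - cost_per_user

def total_monthly_loss(users: int) -> float:
    """With a negative per-user margin, total losses scale linearly with users."""
    return margin_per_user * users

# Doubling the user base doubles the loss instead of improving economics.
print(total_monthly_loss(1_000_000))  # prints -3500000.0
print(total_monthly_loss(2_000_000))  # prints -7000000.0
```

This is the inverse of traditional software economics, where serving the next user costs close to nothing.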
How Do AI Data Centers Work?
AI data centers run on specialized chips, mainly GPUs, or graphics processing units. GPUs are designed to handle massive computations, allowing them to process large amounts of data simultaneously to generate responses in real time.
When you enter prompt after prompt into ChatGPT, the GPUs in hyperscale data centers are working hard, consuming significant power, and generating substantial heat to produce each answer.
AI data centers are different from traditional server farms, and don’t let these CEOs tell you differently. A conventional server farm runs CPU-based workloads like web hosting and storage; an AI data center packs in racks of power-hungry GPUs with far higher energy and cooling demands.
Why Are AI Data Centers So Expensive?
For AI-centric data centers, compute hardware like GPUs and AI accelerators account for the majority of hardware spending.
According to OpenAI’s CFO, it costs roughly $50 billion to build a 1 gigawatt AI data center facility: about $15 billion for land and infrastructure, and about $35 billion for chips.
That puts the chips alone at roughly 70% of the cost, at the high end of industry estimates that compute hardware accounts for 50-70% of these massive projects.
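As a quick sanity check, the split works out from the CFO’s figures alone (this sketch uses only the numbers quoted above):

```python
# Back-of-the-envelope check of the reported cost split for a
# 1 GW AI data center, using the figures attributed to OpenAI's CFO.
total_cost_bn = 50       # total build cost, in $ billions
land_and_infra_bn = 15   # land, buildings, power infrastructure

# The remainder goes to GPUs and other AI accelerators.
chips_bn = total_cost_bn - land_and_infra_bn

chip_share = chips_bn / total_cost_bn
print(f"Chips: ${chips_bn}B ({chip_share:.0%} of total)")  # Chips: $35B (70% of total)
```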
Capital reallocation is a contributing factor to recent tech layoffs. As companies increase spending on AI infrastructure, particularly GPUs and data center buildouts, they are tightening budgets for headcount. In many cases, workforce reductions reflect a strategic shift in spending priorities.
Creative Financing for AI Data Centers
Traditional banks are reluctant to accept chips as collateral for AI data center loans, because a GPU cluster can become obsolete in as little as 18 months.
When tech companies can’t get traditional financing for these chips, they have to get “creative,” according to the CFO of OpenAI. Some of this creativity might involve government-backed loans and subsidies.
OpenAI sent a letter to the federal government in October 2025 asking for taxpayer-funded subsidies to be expanded to cover the entire AI supply chain, particularly for semiconductors.
The other way these tech companies are getting creative is with circular financing.
Companies like OpenAI and Nvidia, the leading producer of AI chips, have struck deals in which they invest in each other. Nvidia committed to investing up to $100 billion in OpenAI to help fund its data center buildouts; in return, OpenAI committed to filling those data centers with Nvidia chips.
This is how you end up with spaghetti diagrams where companies pass seemingly infinite money around in a circle.
How Big Tech Companies Keep Debt Off Their Balance Sheets
Other creative financing tactics include using special-purpose vehicles to keep billions of dollars of debt off their balance sheets, a technique that several major financial institutions used leading up to the 2008 financial crisis. Meta is currently using an SPV to build a $27 billion hyperscale AI data center in Louisiana.
It’s worth noting that the reason these companies are able to go forward with their unimaginably expensive side quests is because their main businesses are so profitable.
Meta, Amazon, Microsoft, and Google can survive the AI bubble bursting because they’ll still have customers for their core products: advertising, e-commerce, cloud services, and enterprise software.
But the average American won’t weather the AI bubble bursting quite as well.
The AI Bubble’s Impact on Average Americans
By some estimates, an AI stock market crash could erase 8% of all household wealth in America and cut consumer spending by $500 billion.
A crash would hit the average American harder today than the dot-com bust did 25 years ago, because the share of household wealth held in the stock market is higher now, at roughly 21%.
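For the curious, here is a sketch of the wealth-effect arithmetic that could produce estimates of this size. The total-wealth and spending-sensitivity inputs below are illustrative assumptions, not figures from any cited study:

```python
# Hedged sketch of wealth-effect arithmetic. The first and third
# inputs are assumptions chosen for illustration, not sourced data.
household_wealth_tn = 160   # total US household wealth, $ trillions (assumed)
crash_share = 0.08          # share of wealth erased by a crash (from the article)
mpc_out_of_wealth = 0.04    # spending cut per dollar of lost wealth (assumed; 3-5 cents is a common estimate)

wealth_lost_tn = household_wealth_tn * crash_share               # ~12.8
consumption_cut_bn = wealth_lost_tn * 1_000 * mpc_out_of_wealth  # ~512

print(f"~${wealth_lost_tn:.1f}T lost, ~${consumption_cut_bn:.0f}B less consumption")
```

Under these assumptions, the $500 billion consumption figure is the right order of magnitude.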
Additional risks lie with private AI labs and their venture backers, along with stranded assets from overbuilding. What happens to the utility companies that built gas plants for AI data centers once those customers disappear?
So when Randy Thelen says there’s a window of opportunity, we’re left wondering who it’s for. Community members have spoken, and we would like to close the door on Big Tech’s AI window.