To some on Wall Street, the AI boom looks less like a broad-based revolution and more like a roller coaster that keeps gaining speed while adding very few new riders. Chip giants send money into GPU clouds and model labs that already live on their hardware. Those clouds borrow against racks of GPUs and sign multibillion-dollar contracts to host AI workloads. AI companies pull power providers and chip suppliers onto the ride with compute deals and equity-for-chips arrangements that stretch well into the next decade. Each move adds another turn to the track and pushes the cars a little faster. But step back, and some see an AI economy that’s screaming in circles on one track.
On earnings calls and in press releases, these moves sound like strategy: “partnerships,” “ecosystems,” “assured demand.” From above, the pattern is harder to ignore. Instead of a broad base of independent customers buying compute as they need it, a few giants finance one another’s buildouts, pre-sell years of infrastructure to each other, and then point to those contracts as proof the coaster can’t possibly (possibly!) derail. Nvidia crushed its most recent earnings, and its stock fell the next day as worried whispers about a “circular economy” and an “AI bubble” started traveling faster than the guidance.
The mechanics sound simple and, to some, uncomfortable. “New” AI dollars leave a corporate budget as a multiyear cloud commitment, land in a contract that’s already earmarked for certain labs, jump again into a GPU cloud’s lease, and end up supporting someone’s credit line or equity stake. The same promised workloads now prop up a cloud backlog, a GPU-backed loan, and a chipmaker’s growth slide. Across the internet and social media, this is being called a “three-companies-in-a-trench-coat” economy, an “ouroboros,” a “Jenga tower,” and a “crazy” round of Monopoly, where the same stack of pretend money keeps getting counted as fresh wealth every time it passes “go.”
Goldman Sachs wrote this fall that “AI bubble concerns are back, and arguably more intense than ever,” citing “the increasing circularity of the AI ecosystem” as part of the problem. Morgan Stanley’s Todd Castagno has warned that the AI ecosystem is becoming increasingly “circular,” and that today’s loop-de-loops can inflate demand and valuations without creating economic value.
“We are increasingly going to be customers of each other,” Microsoft CEO Satya Nadella said in November, laying out his company’s latest round of AI alliances. Microsoft and Nvidia bankroll Anthropic, which runs on Microsoft’s cloud and chews through Nvidia GPUs; AMD hands OpenAI six gigawatts of future supply and, potentially, warrants for a stake of up to 10% in AMD itself; Saudi-backed Humain wires in “exclusive technology” deals with AMD and Cisco, while Nvidia and Elon Musk’s xAI are lining up a separate 500-megawatt data center in the kingdom. And everyone gets to hold up those relationships as proof that their side of the stack is indispensable.
Veritas Investment Research’s Anthony Scilipoti says his team has identified “another 80–100” circular deals involving Nvidia, on top of the headline partnerships investors already know about. And the Bank of England, in its latest Financial Stability Report, has warned that valuations for AI-focused tech stocks look “materially stretched.” Investors are increasingly worried that AI money isn’t just flowing through a broad market anymore. It’s looping, corkscrewing, and doubling back on itself, faster every quarter, while the companies at the heart of the AI boom insist the ride can only go farther and faster.
OpenAI has, in many ways, become the gravitational center of this trillion-dollar AI buildout. And because the company doesn’t have to open its books to everyone, it is also the “black hole of the circular economy,” as Zacks Investment Research senior stock strategist Kevin Cook told Quartz recently.
Oracle is staking its future on OpenAI’s appetite with a compute commitment worth up to $300 billion over five years starting in 2027, a number that probably would have sounded outrageous in any other era. Today, it sounds almost (almost) normal. Oracle has been racing to build the data centers to serve that work, leaning on bond markets and private credit. But since the deal was announced, the company’s shares have lost more market value than the entire face value of the contract, a sign that many investors already treat the deal as underwater.
Meanwhile, Nvidia has taken stakes in GPU clouds, which borrow billions against towers of Nvidia hardware and then sell that capacity back to AI labs, many of which Nvidia also backs or courts. In September, the chipmaker and OpenAI announced a letter of intent to deploy at least 10 gigawatts of Nvidia systems for OpenAI’s next-generation infrastructure, a project Nvidia CEO Jensen Huang has called “the biggest AI infrastructure project in history.” Nvidia said it “intends to invest up to $100 billion” in OpenAI as that hardware rolls out. Even so, chief financial officer Colette Kress has reminded investors that, despite the headlines, there is “no definitive agreement” and “no assurance” the deal will be completed on the expected terms. And with a recent $2 billion investment in Synopsys, Nvidia has now moved into the software that helps design the next generation of chips and systems, embedding itself even further down the stack.
Nvidia isn’t just selling tickets; it’s designing the track, leasing the land, and deciding which riders get a seat.
Threaded between OpenAI and Nvidia is CoreWeave, the GPU cloud startup that turned rack after rack of Nvidia chips into structured finance. In 2024, CoreWeave secured a $7.5 billion debt facility led by Blackstone and a consortium of private-credit players, using data centers and GPUs as collateral. This year, CoreWeave expanded its agreement with OpenAI again: a deal worth up to $6.5 billion that brought the total contract value to about $22.4 billion. Nvidia owns a more-than-5% stake in CoreWeave and has agreed to buy more than $6 billion of capacity from it, acting as a kind of backstop customer for the same compute that CoreWeave is selling to OpenAI and others.
Then there’s AMD, which has its own orbit. Alongside the OpenAI–Oracle megaproject (the broader Stargate buildout with Oracle and SoftBank), AMD has committed to supply up to six gigawatts of Instinct GPUs by 2030, and OpenAI has been granted warrants that could, if milestones are met, give it up to a 10% stake in AMD. In the Gulf, AMD and Cisco have teamed up with Saudi-backed startup Humain in a joint venture that plans to deliver up to one gigawatt of AI infrastructure over the next several years, starting with a 100-megawatt deployment in Saudi Arabia; AMD and Cisco will be minority equity holders, with the Public Investment Fund–backed Humain in the driver’s seat.
There’s also the problem of exit velocity. Once a company has pledged 10 gigawatts of AI power or promised investors hundreds of billions in AI infrastructure, backing down carries a political and reputational cost that doesn’t show up in spreadsheets. Utilities have rewritten long-term plans. Local officials have posed with shovels in fields that now double as collateral. Sovereign funds have stamped their names on AI parks meant to prove they are on the right side of the future. Industrial policy, grid planning, and corporate capex now converge on the same bet: that this small cast of riders will keep screaming excitedly around every turn.
David Meier, a senior investment analyst at The Motley Fool, told Quartz recently that the circular economy is “absolutely a concern” because “we’ve seen this play before”: in the buildout of the internet, where vendors were essentially financing their own customers. “That worked — until it didn’t,” he says. “I won’t say the same thing is going to happen, but people should be a little bit concerned because in a situation like that, you could get a bubble that essentially propagates itself.” He also notes that much of the capital this time is equity rather than plain-vanilla debt, which he reads as “a little bit different” because that structure can spread the pain through equity markets and pension funds rather than confining it to a few lenders.
Still, there’s a very loud crowd arguing that the loop is the least interesting thing about the AI economy.
Nvidia has started pushing back against Michael Burry’s “circular” talk, telling analysts that its cross-deals are tiny next to its revenue and that the startups it backs predominantly earn money from outside customers, not from Nvidia itself. In its latest earnings, Nvidia said demand for its newest Blackwell chips was “off the charts,” with cloud GPU capacity effectively sold out. AMD CEO Lisa Su makes a similar case, arguing that the bubble debate “misses the bigger picture” because AI is a structural shift in how workloads are run, not a passing theme.
David Wagner, the head of equities at Aptus Capital Advisors, told Quartz that the companies at the center of the AI boom know exactly how much leverage they have: balance sheets that can handle fresh debt, cash flow that can underwrite long-lived projects, and what he calls plenty of “runway” and “dry powder” to keep spending if the cycle wobbles. He sees AI already turning into real revenue in cloud and software, and he views some degree of overbuild as the cost of making sure the lights stay on when the next wave of workloads shows up. The circular deals look strange from the outside; from his seat, they look like big platforms using every tool they have to lock in a market they already dominate.
When critics point out how much of CoreWeave’s business rests on a handful of giants, Microsoft chief among them, CoreWeave CEO Michael Intrator points back to actual usage: Copilot, Office 365, and Meta’s AI experiments are chewing through capacity, he says, describing demand as “overwhelming,” stretching from hyperscalers to sovereign AI projects. Debt and vendor financing, in his telling, are just ways to keep up with an order book that keeps overshooting the forecasts.
Other strategists sit somewhere in the middle of the circular economy debate. Mark Jamison opened a recent piece in Barron’s by conceding that “AI looks like a circular money machine” (or “so the alarmists say”) before arguing that “the evidence suggests something different, a powerful technological transformation that remains grounded in fundamentals.”
The big-bank house view leans the same way. Morgan Stanley’s tech team has described AI spending as part of a longer-term profit cycle, modeling roughly $1.1 trillion in AI software revenue by 2028 (up from $45 billion in 2024) and arguing that AI capex has “considerable potential for return,” while J.P. Morgan’s outlook says tech-led gains don’t yet resemble a bubble — so long as the incremental capex produces durable cash flow. Charles Schwab’s Liz Ann Sonders has argued that this isn’t dot-com 2.0 because today’s AI leaders are enormous, cash-rich incumbents, not cash-burning startups, but she also warns that disappointment relative to sky-high expectations could still roil markets.
Still, the market wants to know: How much of this motion reflects real demand from the rest of the economy? Oracle can point to an OpenAI contract that runs into the hundreds of billions. CoreWeave can point to more than $20 billion in obligations from OpenAI and another long-dated deal with Meta. Nvidia can point to the 10 gigawatts of planned OpenAI capacity; AMD, to its own six-gigawatt pledge. Each of those numbers supports somebody’s growth story. None of them cleanly separates end users paying for AI from a handful of companies buying one another’s capacity and calling it “momentum.”
The circular economy has become the AI boom’s main attraction and, increasingly, its operating system. The chipmakers, clouds, and labs that dominate the story are financing each other’s expansions, locking in each other’s demand, and building power-hungry infrastructure around promises that all stem from the same circle of names. Still, the money is real. To some, the result of all this dealmaking looks like unstoppable momentum. To others, it looks like a very elaborate way of keeping the roller coaster moving, farther and faster, in tighter and tighter loops.