Why OpenAI hasn’t yet delivered a traditional return on investment
One of the data centres supporting the Stargate project. Image via OpenAI.



Every additional user interaction carries a marginal cost, turning scale into a financial liability rather than a margin lever

OpenAI’s investment in ChatGPT has delivered extraordinary revenue growth, but it has not yet delivered a conventional return on investment.

Since ChatGPT’s public launch, OpenAI has scaled into one of the highest-grossing private software businesses in history.

Subscriptions, enterprise licensing, and API usage have pushed annualized revenue into the tens of billions of U.S. dollars. Few companies have expanded top-line revenue this quickly. That growth, however, sits alongside an equally aggressive cost structure.

OpenAI’s largest expense is compute. Training and running frontier-scale language models requires constant access to specialized hardware, primarily high-end GPUs. Inference compounds the problem. Every additional user interaction carries a marginal cost, turning scale into a financial liability rather than a margin lever.
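
To see why scale can cut against margins, consider a simplified unit-economics sketch in Python. Every figure below is a hypothetical placeholder rather than OpenAI data; the point is only that when each query carries a real compute cost, heavy usage can push a flat-priced subscription underwater.

# Hypothetical unit economics for an inference-heavy subscription product.
# None of these figures are OpenAI's; they only illustrate the mechanism.

def monthly_margin(subscribers, price_per_month, queries_per_user, cost_per_query):
    """Subscription revenue minus inference cost for one month."""
    revenue = subscribers * price_per_month
    inference_cost = subscribers * queries_per_user * cost_per_query
    return revenue - inference_cost

# 1M subscribers at $20/month, each running 600 queries at an assumed 4 cents per query:
# $20M of revenue against $24M of compute, a $4M monthly loss that widens as usage grows.
print(monthly_margin(1_000_000, 20.0, 600, 0.04))  # -4000000.0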

As a result, OpenAI remains deeply unprofitable.

Industry estimates suggest annual losses still exceed revenue, even after the expansion of paid ChatGPT tiers and enterprise contracts. Long-term infrastructure commitments now stretch into the tens of billions of dollars. Those obligations rise with model ambition rather than decline with scale.

From a traditional ROI perspective, the conclusion is straightforward. OpenAI has not generated positive free cash flow from ChatGPT, and it is not EBITDA-positive, EBITDA being finance shorthand for earnings before interest, taxes, depreciation and amortization.

Furthermore, there is no evidence that the product, in its current form, produces surplus capital after operating costs. That gap between revenue and return has drawn skepticism from outside the company.

Read more: Artificial intelligence moves renewable power from variable to dependable

Read more: Quantum computing revives debate over Bitcoin’s long-term security

Chasing artificial general intelligence is expensive

Arvind Krishna, CEO of IBM, recently questioned whether enterprise returns could ever justify the capital being committed to artificial general intelligence, OpenAI’s stated long-term priority.

He framed the issue in blunt financial terms. A single one-gigawatt data centre now requires roughly US$80 billion in capital, based on current construction and equipment costs.

A company committing 20 to 30 gigawatts would therefore face capital expenditures approaching US$1.5 trillion. At an industry level, Krishna said total commitments tied to chasing artificial general intelligence (AGI) are trending toward 100 gigawatts, implying capital spending closer to US$8 trillion.

“It takes about $80 billion to fill up a one-gigawatt data center,” he said. “That’s today’s number. If one company is going to commit 20-30 gigawatts, that’s $1.5 trillion of [capital expenditure].”

Considering that the “total commits” of “chasing AGI” amount to 100 gigawatts, he reasoned, that works out to “$8 trillion of capital expenditure.”

“It’s my view that there’s no way you’re going to get a return on that because $8 trillion of capital expenditure means you need roughly $800 billion of profit just to pay for the interest.”
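
Restated as a back-of-envelope calculation, Krishna’s numbers fit together as below. The 10 per cent carrying cost is inferred from his own $8 trillion and $800 billion figures, not a rate he quoted.

# Krishna's AGI capex arithmetic, restated in Python.
# The 10% carrying cost is inferred from his $8T capex / $800B profit pairing.

COST_PER_GIGAWATT = 80e9  # roughly US$80 billion to fill a 1 GW data centre

def capex(gigawatts):
    """Capital expenditure implied by a given data centre build-out."""
    return gigawatts * COST_PER_GIGAWATT

print(capex(20) / 1e12)   # 1.6  -- close to his "roughly $1.5 trillion" for 20-30 GW
print(capex(100) / 1e12)  # 8.0  -- $8 trillion for 100 GW industry-wide

annual_profit_needed = capex(100) * 0.10
print(annual_profit_needed / 1e9)  # 800.0 -- ~US$800 billion a year just to cover the carry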

The math lays bare the central problem facing ChatGPT as a business. Growth alone does not equal return. ROI depends on whether incremental revenue exceeds incremental compute. That threshold has not yet been crossed.

OpenAI’s internal projections reportedly push profitability several years into the future. Those forecasts assume declining compute costs, sustained pricing power, and limited competitive pressure. None are guaranteed.


