OpenAI's internal documents predict $14 billion loss in 2026 according to report

OpenAI logo displayed on a phone screen and ChatGPT website displayed on a laptop screen are seen in this illustration photo taken in Krakow, Poland on December 5, 2022.
(Image credit: Jakub Porzycki/NurPhoto via Getty Images)

Internal OpenAI documents predict the AI specialist is set to lose fully $14 billion in 2026, according to a new report. It's also claimed that OpenAI will continue to rack up huge losses, totalling $44 billion, through 2029, at which point it won't just turn a profit, but will be generating Nvidia-style revenues.

A new report from The Information claims to have seen internal OpenAI documents setting out various financial performance projections. That $14 billion loss for 2026 is said to be roughly three times worse than early estimates for 2025.


OpenAI will be making Nvidia-style money by 2029, if you can believe it. (Image credit: Nvidia)

The revenue split for that projected $100 billion in 2029 revenue is said to be just over 50% from ChatGPT, roughly 20% from sales of AI models to developers through APIs, and another 20% or so from "other products", which include video generation, search and mooted new services such as AI research assistants.

It's also thought that the cost of inference, which is running AI models as opposed to training them, is coming down fast. Intriguingly, OpenAI expects to spend less on acquiring training data, too. That's forecast to cost $500 million this year, but taper down to $200 million annually towards the end of the decade.

Exactly what that says about how OpenAI trains its new models and what data it uses isn't clear. But it could suggest a move to more recursive training on AI-generated data.

Anywho, all one can say for sure is that a huge amount of money is involved. Whether OpenAI will come good financially—or for the human race, generally—well, that's a totally different matter.

Jeremy Laird
Hardware writer

Jeremy has been writing about technology and PCs since the 90nm Netburst era (Google it!) and enjoys nothing more than a serious dissertation on the finer points of monitor input lag and overshoot followed by a forensic examination of advanced lithography. Or maybe he just likes machines that go “ping!” He also has a thing for tennis and cars.
