Fabulous news everyone: Market analyst says the AI bubble is 17X bigger than the dotcom goldrush, and 4X larger than the subprime bubble that caused the 2008 crash
Welcome to the future, bub.

The AI sector isn't just a bubble, says one senior market analyst: It's the single biggest bubble the markets have ever seen, the bubble of bubbles if you will, a bubble so large it looms over the entire global economy and leaves Sir Mix-A-Lot breathless.
In unrelated news, the Associated Press has just reported that OpenAI's valuation has hit $500 billion, making a company that's never turned a profit into the most valuable startup in history.
One market analyst reckons this tomfoolery has gone far enough. These companies, and those who invest in them, are about to hit "diminishing returns hard", he argues, and he's telling his firm's clients to steer well clear.
Let's put the argument for AI as briefly as possible: It's going to change the world on a scale that is currently so unimaginable it could only be described as revolutionary. It will transform industries and economies. And it is only fair to say that AI technologies have achieved some remarkable things that may point in this direction, particularly in the field of medicine.
But that's the thing. We're all getting familiar with AI tech in some aspects, whether that's Gemini shouldering in on what used to be a perfectly good search engine, the constant wheedling offers it makes to take notes or summarise conversations, never mind the endless flood of brain-melting slop on social media. Some of the functionality is neat, some is annoying, but nothing about it feels revolutionary. Not even close.
So do you buy the hype? Up until now investors certainly have, and even governments are rushing to get on board with the AI revolution. Here in the UK our Prime Minister Keir Starmer, a man with the charisma of an empty pizza box, was somehow galvanised into creating "a blueprint to turbocharge AI" for "a decade of national renewal." Starmer recently met the US President, frabjous day, and the pair announced a "Tech Prosperity Deal" under which firms like Google and Microsoft agreed to spend billions building big expensive AI things for themselves in the UK and call it largesse.
All of which is to say: there is a hell of a lot of money riding on AI producing… well, something genuinely transformative in the near future. So much money that, if the bubble bursts, the pop may herald the kind of brutal economic fallout that can define eras.
Even the moneymen are starting to think that something might not pass the smell test here. A new note to clients from independent research firm the Macrostrategy Partnership goes in with both feet, though I will caveat it: independent this firm may well be, but it has taken a firmly sceptical stance on AI for a long time.
The note, first reported on by MarketWatch, was written by Julien Garran (who formerly led UBS's commodities strategy team, so presumably knows what he's on about).
Garran's wildest claim is that AI is no mere bubble, but a bubble 17 times larger than the dotcom bubble and four times the size of the subprime bubble behind the 2008 global crash. The argument is that artificially low interest rates have led to misallocation: economics jargon for money and labour being poured into the wrong places, which destabilises things when the output, the products or even the promises if you will, fails to materialise.
Garran gets to that number with some creative economics, using the Wicksellian differential to calculate a GDP deficit that takes in AI, real estate, VC investments and, for some reason, NFTs. Under this metric the misallocation in pre-crash 2008 was around 18% of GDP: Garran estimates that the figure could now be an eye-watering 65%.
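For a rough sense of how a number like that gets built: Garran's full workings aren't public, so treat the following as a back-of-envelope sketch rather than his actual model. Every input below (the rates, the years, the credit-to-GDP multiple) is an illustrative assumption of mine, chosen only to show the shape of the arithmetic.

```python
# Toy sketch of Wicksellian-differential misallocation arithmetic.
# NOT Garran's actual model; all inputs are made-up illustrative figures.

# Wicksell's idea: the "natural" rate of interest roughly tracks nominal
# GDP growth. If actual borrowing costs sit below it, credit is underpriced
# and capital drifts into projects that only pencil out at the cheap rate.
natural_rate = 0.045    # assumed nominal GDP growth: 4.5%
borrowing_cost = 0.015  # assumed prevailing cost of credit: 1.5%
differential = natural_rate - borrowing_cost  # 3 points of underpriced credit

# Compound that mispricing across the credit stock for long enough and the
# pile of misallocated investment becomes a meaningful share of GDP.
years = 12              # assumed duration of the cheap-money era
credit_to_gdp = 1.8     # assumed private credit stock at 1.8x GDP

misallocation_share = differential * years * credit_to_gdp
print(f"Implied misallocation: {misallocation_share:.0%} of GDP")
# With these made-up inputs the result is 65%, roughly the ballpark Garran
# lands in, though his own estimate folds in AI, real estate, VC, and NFTs.
```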
Analysts naturally find ways (and leftfield differentials) to make the numbers fit their world view, but Garran does highlight some real-world examples of how the AI productivity boom is going. He cites a study in which the task-completion rate for AI at a software company was between 1.5% and 34% and, even on the tasks AI was better at, it couldn't reliably replicate that success over time. There's a chart from another economist, based on Commerce Department data, suggesting that AI uptake among big companies is declining.
"We don't know exactly when LLMs might hit diminishing returns hard, because we don’t have a measure of the statistical complexity of language," says Garran. "To find out whether we have hit a wall we have to watch the LLM developers. If they release a model that costs 10x more, likely using 20x more compute than the previous one, and it's not much better than what's out there, then we've hit a wall."
Garran further points out that the heaviest LLM users are costing these companies more in compute power "than their monthly subscriptions". And he could've added that most of us use them for free. He then comes up with a sentence that is supposed to be a dire warning but just sounds funny, about the bubble bursting and pushing the economy "into a zone 4 deflationary bust on our investment clock." Not the investment clock dammit!
I should re-emphasise that Garran is an AI critic and works for a firm that is telling its clients not to over-invest in AI, or even invest at all. So take everything in that context. This is no truth from on high, but it does feel like the mood music around this technology is shifting slightly. Perhaps AI will change the world. Perhaps just not in the way some think.

Rich is a games journalist with 15 years' experience, beginning his career on Edge magazine before working for a wide range of outlets, including Ars Technica, Eurogamer, GamesRadar+, Gamespot, the Guardian, IGN, the New Statesman, Polygon, and Vice. He was the editor of Kotaku UK, the UK arm of Kotaku, for three years before joining PC Gamer. He is the author of A Brief History of Video Games, a full history of the medium, which the Midwest Book Review described as "[a] must-read for serious minded game historians and curious video game connoisseurs alike."