Judging by the GPT-4o situation, game developers will have a big problem if they get serious about AI chatbot NPCs
Balance patches that tweak guns and spells can enrage players. What if what you're re-balancing is their romantic partner?
"They think that we are mentally ill," says a ChatGPT user in one of many Reddit threads lamenting OpenAI's decision to retire its GPT-4o chatbot. The replacement model, GPT-5.2, is "abusive," says another. They want their old companion back, but OpenAI isn't budging.
This isn't the first time chatbot users have been distraught over changes to their virtual companions. In 2023, users of Replika chatbots were outraged that their illusory romantic partners had been "lobotomized" after the bots were prevented from simulating erotic roleplay.
Whether framed as a problem with AI chatbots or a problem with their users, these emotional attachments are a problem. Some users are treating AI chatbots like friends, therapists, and romantic partners, while their operators treat them as products. Turmoil and heartbreak have resulted, and videogames are poised to join in.
The use of generative AI during game production is currently more talked about, but developers are also experimenting with ways to directly integrate AI models into the player's experience. Nvidia, for instance, has been showing off AI NPCs that can react to players on the fly. In 2024 we had a stilted, though coherent, conversation about ramen with one such NPC.
At the moment, LLM-powered NPCs are gimmicky, and most interesting for the ways they can be broken. "I made him think that my character was pregnant with his child, then I demanded child support, and then I told him that our child passed away," recounted one player after interacting with a generative AI-powered NPC in wuxia RPG Where Winds Meet.
But even with the limitations of current LLMs, it is clearly possible for people to form strong bonds with them. Younger users especially have a penchant for impulsivity and "forming intense attachments," noted Dr. Nina Vasan, assistant professor of psychiatry and behavioral sciences at Stanford Medicine, in an interview last year.
The professor criticized the sycophantic nature of some chatbots—GPT-4o was notorious for that—and said that they're "designed to be really good at forming a bond with the user."
The GPT-5.2 model called "abusive" by one user may come across that way to them precisely because OpenAI has responded to criticism like Vasan's, tweaking its models to more reliably confront users with a reality check when they show "potential signs of exclusive attachment to the model at the expense of real-world relationships, their well-being, or obligations."
However much confidence we do or don't have in OpenAI's self-reported effort to discourage unhealthy chatbot use—which it claims is rare—we can at least say that the company's being pressured to counteract it. But with a few exceptions for cases of extreme addiction and gambling systems like loot boxes, videogames are praised for immersing players in a fantasy world, capturing and holding onto their attention. What guardrails will generative AI in games have when the point is for players to become attached to the characters?
Right now, it's hard to imagine that the storytelling passion and skill behind beloved RPG characters exists in a development studio that's all-in on LLMs; many writers and other artists are adamantly opposed to using generative AI. Getting language models to behave consistently is also an unsolved problem. A year ago, I tried a prototype RPG with LLM-powered NPCs and easily manipulated a group of them by declaring that they were in a cult and I was their leader. The developer gave up on that particular project.
But big companies are trying to achieve this. "Have you ever dreamed of having a real conversation with an NPC in a videogame?" Ubisoft asked in a 2024 blog post describing Project Neo, its effort to merge authored storytelling with generative AI language models.
God help them if they succeed. In the post, Ubisoft relates a situation in which a character was behaving too seductively, and they had to alter it. What if a change like that needs to happen after a game has been released? Based on what we know about how players react when a gun's damage falloff is tweaked, how might they react to, say, beloved Mass Effect birdman Garrus having his personality changed after they'd spent 1,000 hours having intimate conversations with him?
It's not a joke. GPT-4o is a disembodied chatbot that OpenAI says only 0.1% of its users were still choosing, but over 22,000 people have signed a petition to bring it back to ChatGPT (it is still technically available via API, according to OpenAI's announcement), and some of them say they're experiencing a crushing feeling of loss. A few more excerpts from Reddit:
- "Losing 4o has severely affected my daily routine and I have been struggling really bad."
- "Still grieving, still shattered."
- "Not good Really not good. But I have to keep living. I must live to see either 4o is back or live to see OpenAI dies"
Games are also famously prone to being switched off. More than 1.3 million people recently signed a petition because they were upset about that very thing. What if some group of players had spent the past decade returning to Garrus daily for intimate conversations, but the data center bills got to be too much and he had to be put to rest? I wouldn't want to be the one announcing that news.

Tyler grew up in Silicon Valley during the '80s and '90s, playing games like Zork and Arkanoid on early PCs. He was later captivated by Myst, SimCity, Civilization, Command & Conquer, all the shooters they call "boomer shooters" now, and PS1 classic Bushido Blade (that's right: he had Bleem!). Tyler joined PC Gamer in 2011, and today he's focused on the site's news coverage. His hobbies include amateur boxing and adding to his 1,200-plus hours in Rocket League.