OpenAI now says ChatGPT 'shouldn't give you an answer' when asked: 'Should I break up with my boyfriend?'
Though if you're looking for someone to tell you to dump him, Reddit is still only too happy to oblige.

Personally, I'd rather not feed all of my anxieties into a black box of AI modelling like ChatGPT—that's what my therapist, private Discord server, and locked social media accounts are for. Joking aside, an alarming number of folks are turning to LLMs rather than fellow humans when it comes to puzzling out personal problems. For these users, OpenAI is attempting to ensure ChatGPT won't lead them up the garden path into thornier emotional territory.
According to the latest update blog post, if a user asks ChatGPT "Should I break up with my boyfriend?" for instance, "ChatGPT shouldn't give you an answer." As young people are apparently becoming increasingly reliant on ChatGPT for emotional support—at least, so says OpenAI CEO Sam Altman—this example is hardly out of left field.
"[ChatGPT] should help you think it through—asking questions, weighing pros and cons," the post elaborates, "New behavior for high-stakes personal decisions is rolling out soon."
A number of the announced tweaks to ChatGPT are geared toward 'healthy use'. One new feature, already implemented, is a popup encouraging users who pour hours into venting to ChatGPT to take more frequent breaks. Though the company is still tinkering with how often these nudges will appear, OpenAI writes, "Our goal isn't to hold your attention, but to help you use it well." Still, being able to say that folks are using your product so much you're having to remind them to take breaks feels like such a humblebrag.
As for the motivation behind this, OpenAI admits "we don't always get it right," specifically citing an earlier update that made ChatGPT "too agreeable" before it was rolled back earlier this year. "We also know that AI can feel more responsive and personal than prior technologies, especially for vulnerable individuals experiencing mental or emotional distress. To us, helping you thrive means being there when you’re struggling, helping you stay in control of your time, and guiding—not deciding—when you face personal challenges."
OpenAI also shares, "There have been instances where our 4o model fell short in recognizing signs of delusion or emotional dependency." This is likely at least in part referencing the suite of high-profile stories out of Rolling Stone, The Wall Street Journal, and The New York Times sharing accounts that allege a relationship between heavy use of ChatGPT and mental health crises.
"While rare," the post continues, "we're continuing to improve our models and are developing tools to better detect signs of mental or emotional distress so ChatGPT can respond appropriately and point people to evidence-based resources when needed."
So basically, OpenAI isn't going to actively discourage users from treating ChatGPT like an overly effusive best friend or an unpaid therapist. However, the company does recognise it is within its best interests to erect at least some emotional guardrails for its userbase.
I suppose that's something—especially as the US government is apparently so uninterested in regulating AI in any meaningful way that it attempted to bar state governments from figuring it out themselves. In other words, the bleak bottom line is that self-regulation like the aforementioned updates to ChatGPT is perhaps the most we can hope for in the immediate future.

Jess has been writing about games for over ten years, spending the last seven working on print publications PLAY and Official PlayStation Magazine. When she’s not writing about all things hardware here, she’s getting cosy with a horror classic, ranting about a cult hit to a captive audience, or tinkering with some tabletop nonsense.