PUBG dev says it bans up to 100,000 accounts a week, and now it's deploying AI models to hunt the cheats

(Image credit: PUBG Corporation)

PUBG developer Krafton has published a new blog post from its anti-cheat team, discussing the omnipresent problem of cheating in the game and what it's doing to tackle the villains. It begins with a pretty astonishing demonstration of the scale of the problem: "Every week, the PUBG: Battlegrounds Anti-Cheat Team identifies and imposes permanent bans on an average of 60,000 to a maximum of about 100,000 accounts involved in the use, distribution, or sale of illegal software."

The obvious question, then, is why players still encounter cheaters at all. Krafton says it recognises that "a more comprehensive approach" is needed: while it can keep permanently banning accounts till the cows come home, it needs a "fundamental solution" to analyse and track the accounts doing the bad things.

Its focus is on accounts used for cheating in ranked mode, which it puts in two categories: "hijacked accounts" and accounts that "exploit the Survival Mastery Level system". The first is fairly self-explanatory, and Krafton says its analysis shows that roughly 85% of permanently banned accounts were created prior to PUBG's transition to a free-to-play model (January 2022). Krafton says this isn't because these accounts have been cheating for ages and only just got detected, but "rather implies that it is highly probable that cheaters obtained other players' accounts and started using illegal software on those acquired accounts."

It's a pretty straightforward trick. Scammers nick a legit account and sell it to a cheater, who then gets to happily feast on chicken dinners until Krafton bans the account: at which point they just buy another and keep on trucking. 

The Survival Mastery Level exploit exists because newly created accounts aren't allowed to play ranked matches until they reach level 80, but "certain illicit vendors have established so-called 'workshops' where players can gain Mastery Level experience points through repetitive actions facilitated by macros." Vendors use these to boost new accounts, and hacked accounts that don't have the required level, then sell them.

Krafton says it's had enough, and wants to cut off this supply of accounts, which is behind so much of the cheating in PUBG. While it previously relied on pattern-spotting to identify hacked accounts, that approach had problems: new forms of abuse went under the radar, and detection accuracy was too low to justify serious punishments like a permanent ban (there's always a chance that suspicious behaviour is just a legitimate player behaving suspiciously rather than cheating).

So the PUBG anti-cheat team "initiated the development of a machine learning model that could learn the characteristics and patterns of Mastery Level abuse." It began to be used this year and Krafton has "expanded and refined the criteria for detecting disruptive players" and found great success. "The number of bans issued against disruptive accounts has increased by over threefold compared to the period before the introduction of this model. Furthermore, the internal monitoring process for suspected disruptive players/accounts has shown continuous improvement, and the number of monitored account vendors has decreased. We have also observed an increase in the prices of these accounts."
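Krafton doesn't describe how its model actually works, but one characteristic of macro-driven "workshop" play is timing that's far more uniform than any human's. As a purely illustrative sketch (the function name, threshold, and approach are all hypothetical, not Krafton's), a simple timing-based signal might look like this:

```python
import statistics

def looks_macro_like(action_intervals, cv_threshold=0.05):
    """Flag a play session whose inter-action timing is suspiciously uniform.

    Human play has noisy timing; a macro repeating the same action produces
    near-identical intervals between actions. We compute the coefficient of
    variation (stdev / mean) of the intervals and flag sessions below a
    threshold. Threshold and logic here are illustrative only.
    """
    if len(action_intervals) < 10:
        return False  # too little data to judge
    mean = statistics.mean(action_intervals)
    stdev = statistics.stdev(action_intervals)
    return (stdev / mean) < cv_threshold

# A macro firing every ~2 seconds vs. a human with natural jitter:
macro = [2.00, 2.01, 2.00, 1.99, 2.00, 2.01, 2.00, 2.00, 1.99, 2.00]
human = [1.3, 2.7, 0.9, 3.1, 1.8, 2.2, 0.7, 2.9, 1.5, 2.4]
print(looks_macro_like(macro))  # True
print(looks_macro_like(human))  # False
```

A real ML system would learn many such features from labelled ban data rather than hand-tuning one threshold, but this is the general flavour of "characteristics and patterns" detection.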

Those are two real measures of success: the cheat sellers are finding it harder to produce these accounts, reflected in the price rises, and some vendors seem to have moved on from PUBG entirely.

Krafton's also built a machine learning model capable of detecting hijacked accounts and, while this doesn't seem to have been in use as long as the first, it has reached a stage "where we can ascertain the scale and attributes of these accounts and leverage this information effectively." It says it's currently working on enhancing this model's accuracy and applying it to "diverse anti-cheat measures."

The post ends by re-emphasising these measures are "targeting the source" of cheating, and telling players to ensure their accounts are secure by for example installing the Steam Guard Mobile Authenticator. 

60,000 to 100,000 accounts a week is wild and shows why companies are looking at AI models in an effort to help stem the flow. Krafton's not the only big publisher with ideas here and some of the ideas developers have come up with in recent times really are funny. Call of Duty's latest anti-cheat tech gets into psychological warfare by making hackers have "hallucinations", while Ubisoft is smashing hackers in their thousands with something called QB but, perhaps wisely, no-one quite knows what it is.

Rich Stanton

Rich is a games journalist with 15 years' experience, beginning his career on Edge magazine before working for a wide range of outlets, including Ars Technica, Eurogamer, GamesRadar+, Gamespot, the Guardian, IGN, the New Statesman, Polygon, and Vice. He was the editor of Kotaku UK, the UK arm of Kotaku, for three years before joining PC Gamer. He is the author of A Brief History of Video Games, a full history of the medium, which the Midwest Book Review described as "[a] must-read for serious minded game historians and curious video game connoisseurs alike."