Keeping Xbox safe: Microsoft acted on 7 million community violations in the last six months
Microsoft spills the beans in a new transparency report on how it tackles bad actors online
➡️ The Shortcut Skinny: Xbox crackdown
⛔ A new report details Microsoft’s moderation policy over the last six months
🔨 The company has taken over 7 million actions to tackle poor community behavior
😶 The majority of those were taken against cheaters and spam accounts
👇 The number of player reports has decreased significantly since last year
Microsoft has revealed that it took action against Xbox users who violated its community guidelines 7.3 million times in just the first six months of this year.
It included the figure in its inaugural Digital Transparency Report, which provides an overview of the company’s moderation policies and breaks down what action Microsoft has taken to combat toxic behavior in its gaming community. It’s a little dry but shows the steps being taken to make the best Xbox Series X games that little bit more enjoyable when playing online.
According to the report, Microsoft received a total of 33 million user reports between the beginning of January and the end of June of this year. In the same period, it made a total of 7.3 million enforcements, a term covering any action “taken against a player, usually in the form of a temporary suspension which prevents the player from using certain features of the Xbox service”.
Microsoft said 4.33 million of those enforcements were issued against “inauthentic accounts” responsible for spamming and cheating, many of them suspected to be bots or throwaway secondary accounts. In comparison, only 1.05 million enforcements were issued for profanity, 814,000 for adult sexual content, 759,000 for harassment or bullying, and smaller numbers still for other kinds of behavior.
Of those enforcements, 4.78 million (roughly two-thirds) were taken proactively, meaning they were made before a player brought the bad behavior to Microsoft’s attention. The remaining 2.53 million were issued reactively – that is, after a player raised the issue with Microsoft.
The company says proactive moderation increased ninefold compared with the previous year, and that 100% of actions relating to account tampering, piracy and phishing came through proactive enforcement. That suggests either that Microsoft’s systems are particularly adept at spotting and striking down that kind of behavior, or that players are unlikely to report it.
Interestingly, the number of player reports has decreased significantly from the previous six-month period, continuing a trend that extends back into the second half of 2020. The 33 million player reports submitted in the first half of this year are nearly nine million fewer than the 41.83 million received from July to December 2021, which was itself down from 52.05 million in January to June 2021.
When a player receives an enforcement notice, they can in some instances dispute the decision and appeal the case. Only 6% of the appeals made in the last six months resulted in the reinstatement of the player’s account; in the remaining cases, the original enforcement action was upheld.
So, the big takeaway: don’t treat the Xbox community guidelines lightly because there’s little chance of coming back after Microsoft catches you. And, judging by this report, you will be caught.