November 15 (Portaltic/EP) –
Xbox suspended about 4.78 million accounts that violated its community guidelines during the first six months of 2022, Microsoft recently announced.
The company has published the console's first transparency report, with which it seeks to make users aware of the measures it has taken to moderate content on its platform.
In presenting this study, which covers actions taken between January 1 and June 30, 2022, Xbox said its objective is to make players aware of what its safety practices consist of, and it has committed to publishing the report twice a year.
First of all, the company highlighted that players themselves also act as community moderators and that, thanks to their alerts, it has been able to maintain safety across the Xbox ecosystem. So much so that users submitted over 33 million reports of misconduct or suspicious behavior during this period.
In addition, Xbox noted that players remain in control of their experience across the brand's environment, on console, on PC, and in the beta version of Xbox Cloud Gaming, thanks to tools such as message filters and the parental controls in the Xbox Family Settings app.
Along these lines, the company pointed out that users can create a personalized experience through the various settings it offers, whether with automated filtering of text, media, or web links, or with tools such as muting and blocking other players.
The company also recalled that it offers players help in areas beyond privacy through the nonprofit organization Crisis Text Line, with which it partnered in 2018. Users can turn to this service when reporting concerning conduct by other players, for example, if they post content related to suicidal ideation or self-harm.
Regarding the actions taken on user accounts in the period analyzed, the company said it proactively suspended up to 4.78 million accounts in just six months, 4.33 million of them for being fraudulent or fake accounts.
According to the company, these accounts harm users in multiple ways, from sending spam to introducing cheats that disrupt games or artificially inflating follower lists, which "subtracts value from experiences" of those who use the ecosystem properly.
In addition to these deceptive accounts, Microsoft said it suspended 199,000 profiles that distributed sexual content, 87,000 that committed fraud, 54,000 that engaged in harassment, 46,000 that included insults, and 26,000 that involved identity theft.
The company added that, from January to June 2022, it managed to remove all of the manipulated accounts, as well as those that were hacked or involved in identity theft.
PROACTIVE AND TARGETED MODERATION
Xbox explained that it currently works with two types of moderation. The first, proactive moderation, applies measures before harmful content reaches players.
Reactive moderation, by contrast, consists of applying corrective measures once content has already reached users and they report it to Xbox, whether it is content inappropriate for the ecosystem or an account whose activity does not meet the community guidelines.
Regarding the latter, the company noted that it sorts player-reported content into three categories: conduct (cheating, unsportsmanlike competitive behavior, etc.), user-generated content, and communications with other users.
Based on the volume of guideline-violation reports received, Xbox stated that players filed 36 percent fewer reports in this period than in the same half of the previous year: 33.08 million, compared to 52.05 million incidents reported between January and June 2021.
XBOX'S ACTIONS
Once it receives information about a possible violation of community guidelines, the company studies the case to determine whether one has occurred and takes corrective action: removing the infringing content, suspending the player's account, or both.
According to the table of actions taken by Xbox from January to June 2022, 63 percent of its enforcement actions were account suspensions, 3 percent were content removals, and 34 percent were both account suspensions and content removals.
When it identifies content related to child sexual abuse or exploitation, it reports it to the National Center for Missing and Exploited Children (NCMEC) for in-depth analysis.
Likewise, the company said it notifies users of what happened to their accounts and/or content so they can request clarification and a review of the case. They will then have to explain their actions to the moderation agents, who will determine whether or not those actions were justified.