Meta admits that moderation error rates on its platforms "are still too high"

Dec. 3 (Portaltic/EP) –

Meta has admitted that the company's moderation error rates for content on its various platforms, such as Instagram and Threads, "are still too high," and says it will keep working to "improve the precision and accuracy" with which it enforces its content moderation rules.

The company led by Mark Zuckerberg operates a set of moderation systems across its social networks, with which it reviews content posted by users to ensure it complies with its community standards and usage policies. These tools are overseen by internal Meta teams and by external partners around the world.

As detailed in its Transparency Center, these teams review and analyze posts to determine whether they are false, contain misinformation, or promote actions harmful to users, such as harassing behavior, violence, or nudity, as outlined in Instagram's community standards. They also seek to protect underage users with measures that limit content to what is age-appropriate.

However, these content moderation guidelines are going beyond merely limiting posts on Meta's platforms, since moderation error rates on Instagram and Threads "are too high," as the company itself has admitted.

As Meta's president of global affairs, Nick Clegg, explained in statements reported by The Verge, the company is aware that, when enforcing its moderation policies, "error rates are still too high." As a consequence, he said, this is getting in the way of the "freedom of expression" the company set out to enable on its platforms.

"Too often, harmless content gets taken down or restricted, and too many people get penalized unfairly," Clegg said.

These statements follow recent content moderation failures that led to accounts being deleted and posts being blocked after they were misclassified as fake or controversial, as reported by some affected users.

At the time, Meta said it was investigating issues with its Threads and Instagram moderation systems. The company later resolved the problems, attributing them to a tool that "broke" and failed to show moderation teams enough context.

Clegg also noted that the company operated under "very strict rules" during the pandemic, which led it to remove "large volumes of content" from Meta's social networks. "We feel we overdid it a bit. We are acutely aware, because users raised their voices and rightly complained that we sometimes over-enforce and make mistakes," removing or restricting "innocent content," he added.

Likewise, during the recent United States presidential election, Meta's Oversight Board also highlighted the "over-enforcement" of moderation policies and the dangers this practice can pose, such as "leading to the excessive removal of political speech."

Despite all this, the Meta executive said the company intends to keep working on its moderation guidelines to "improve the precision and accuracy" with which it enforces its rules.
