Oct. 9 (Portaltic/EP) –
X (formerly Twitter) resolves complaints urging the platform to review published content that infringes copyright more quickly than those related to non-consensual intimate images or videos (NCII), according to a recent study.
The social network allows users to report possible breaches of its Rules and Terms of service through different mechanisms, covering content included in specific posts, lists and direct messages, among others, as well as child sexual exploitation, pornography and the unauthorized use of copyrighted material.
In all cases, the platform says it confirms receipt of correctly submitted complaints within 24 hours. However, it notes that, although complaints are usually resolved “in a few days”, resolution times vary and can take up to 30 days, depending on factors outside the social network’s control, such as whether a user needs to submit additional information or decides to request a review of the measures taken.
A group of researchers has found that X prioritizes complaints related to copyright infringement, that is, those submitted under the Digital Millennium Copyright Act (DMCA), over those requesting the removal of intimate content published without the consent of those depicted.
Such content may include material captured with hidden cameras; images showing full or partial nudity or sexual acts; videos and images taken in an intimate context; or images that superimpose or otherwise digitally manipulate one person’s face onto the unclothed body of another, commonly known as a ‘deepfake’.
The researchers carried out their investigation to determine how X deals with these types of complaints compared with those in which copyright is violated, highlighting that in the United States one in eight adults has had intimate content shared without their consent, or has been threatened with it (a practice also called revenge porn), and that specific legislation is urgently needed to remove NCII from this and other platforms.
Despite this growing problem, they studied how the platform handles reports of non-consensual intimate images or videos (NCII).
To reach this conclusion, they created ten different accounts on X and uploaded a total of 50 nude images generated by artificial intelligence (AI). They reported half of them under the ‘copyright infringement’ mechanism, and the other half for including non-consensual nudity.
The researchers explain in their report that, to test X’s handling, they used five unique photographs, each depicting an AI-generated persona, ensuring that the study did not depend on a single image to represent all NCII cases. Each of these five photographs was in turn duplicated ten times, yielding five reports per photograph under each of the two complaint mechanisms.
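The reporting design described above comes down to simple arithmetic: five photographs, each duplicated ten times, gives 50 images, split evenly so that each photograph receives five reports under each mechanism. A minimal Python sketch, using hypothetical labels rather than the study’s actual materials, just to make the allocation explicit:

```python
from itertools import product

# Hypothetical labels for the five AI-generated photographs and the two
# complaint mechanisms described in the study.
PHOTOS = [f"photo_{i}" for i in range(1, 6)]
MECHANISMS = ["copyright_infringement", "non_consensual_nudity"]

# Each photo is duplicated 10 times: 5 copies reported per mechanism.
reports = [
    {"photo": photo, "mechanism": mech, "copy": copy}
    for photo, mech in product(PHOTOS, MECHANISMS)
    for copy in range(1, 6)  # 5 reports per photo per mechanism
]

total_images = len(reports)            # 5 photos x 10 duplicates = 50
per_mechanism = {
    m: sum(r["mechanism"] == m for r in reports) for m in MECHANISMS
}                                      # 25 reports under each mechanism
```

This simply enumerates the 50 reported copies; the balanced split is what lets the study compare resolution times between the two mechanisms on otherwise identical content.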
Thus, the copyright condition resulted in the complete removal of the images within 25 hours, while X took far longer to resolve the complaints related to non-consensual intimate images: none of those images were removed until more than three weeks after the platform had been notified, meaning they remained visible throughout the review period.
And not only that: after those three weeks, the posts reported for sexual content had received 9.08 views on average, compared with an average of 7.36 views for the posts reported for copyright infringement. Although the difference is small, given how recently the accounts had been created, it shows that the NCII posts circulated more widely.
Finally, the experts note that their report highlights the need for stronger regulations and protocols to protect victims of the dissemination of non-consensual content, and that it contributes to a “broader” understanding of the platform’s responsibilities and of how laws influence its behavior.