Flutterby™! : Reporting Non-Consensual Intimate Media


Reporting Non-Consensual Intimate Media

2024-10-10 16:28:44.473559+02 by Dan Lyke 0 comments

Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes

Abstract: Non-consensual intimate media (NCIM) inflicts significant harm. Currently, victim-survivors can use two mechanisms to report NCIM—as a non-consensual nudity violation or as copyright infringement. We conducted an audit study of the takedown speed for NCIM reported to X (formerly Twitter) under both mechanisms. We uploaded 50 AI-generated nude images and reported half under X's "non-consensual nudity" reporting mechanism and half under its "copyright infringement" mechanism. The copyright condition resulted in successful image removal within 25 hours for all images (100% removal rate), while non-consensual nudity reports resulted in no image removal for over three weeks (0% removal rate). We stress the need for targeted legislation to regulate NCIM removal online. We also discuss ethical considerations for auditing NCIM on social platforms.

[ related topics: Erotic Sexual Culture Ethics Nudity Journalism and Media Artificial Intelligence Copyright/Trademark Model Building ]
