Flutterby™! : Accidental Child Porn


Accidental Child Porn

2018-02-27 19:54:09.167431+01 by Dan Lyke 0 comments

Fake Porn Makers Are Worried About Accidentally Making Child Porn — Images of celebrities as minors are showing up in datasets used in making AI-generated fake porn.

This collection of images, or faceset, is used to train a machine learning algorithm to make a deepfake: a fake porn video that swaps Watson’s face onto a porn performer’s body, to make it look like she’s having sex on video. If someone uses the faceset that contains images of Watson as a child to make a deepfake, that means that a face of a minor was in part used to create a nonconsensual porn video.

(Leaving aside all of the IP questions inherent in using someone's likeness for an application they didn't license it for, and how protectable that likeness is, and...)

[ related topics: Erotic Sexual Culture Artificial Intelligence ]
