
Accidental Child Porn

2018-02-27 19:54:09.167431+01 by Dan Lyke 0 comments

Fake Porn Makers Are Worried About Accidentally Making Child Porn — Images of celebrities as minors are showing up in datasets used in making AI-generated fake porn.

This collection of images, or faceset, is used to train a machine learning algorithm to make a deepfake: a fake porn video that swaps Watson’s face onto a porn performer’s body, to make it look like she’s having sex on video. If someone uses the faceset that contains images of Watson as a child to make a deepfake, that means that a face of a minor was in part used to create a nonconsensual porn video.

(Leaving aside all of the IP questions inherent in using someone's likeness for an application they didn't license it for, and how protectable that likeness is, and...)

[ related topics: Erotic Sexual Culture Artificial Intelligence ]
