

Apple CSAM hash collisions

2021-08-18 22:55:58.783442+02 by Dan Lyke 1 comments

Apple announced that they're going to root through your iPhone photos looking for "child sexual abuse material" (CSAM). Everyone with any knowledge at all about how this shit goes down rolled their eyes and said "oh hell no". Apple said "no, really, this will use perceptual hashes for the image identification, it's not like we're actually looking through your photos!"

Everyone with any knowledge at all about how this shit goes down rolled their eyes and said "that's even worse."

Sure enough, there are now working hash collisions.
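To see why collisions were so predictable: a perceptual hash deliberately throws away almost all of an image's detail so that near-duplicates match, which also means wildly different images can land on the same digest. Below is a minimal sketch of a classic "average hash" (aHash) in Python, purely to illustrate the principle. This is not Apple's NeuralHash (which uses a neural network), and the filenames in the usage comment are hypothetical.

```python
# Sketch of an average hash (aHash), a simple perceptual hash.
# It keeps only coarse brightness structure, so visually different
# images can collide -- the same weakness exploited against NeuralHash.
from PIL import Image

def average_hash(path, hash_size=8):
    # Shrink to hash_size x hash_size grayscale; fine detail is discarded.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: brighter than the mean or not.
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming(a, b):
    # Perceptual matchers compare by Hamming distance, not exact equality,
    # so "close enough" hashes count as a match.
    return bin(a ^ b).count("1")

# Usage (hypothetical filenames): any two images that reduce to the same
# coarse brightness pattern collide, even if they look nothing alike.
# print(hamming(average_hash("cat.jpg"), average_hash("noise.png")))
```

The attacks on NeuralHash work the same way in spirit: because the hash function is differentiable, an attacker can nudge an innocuous image's pixels until its hash matches a target, producing a harmless-looking photo that the scanner flags.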

[ related topics: Apple Computer Photography Erotic Sexual Culture Work, productivity and environment Databases iPhone ]

comments in ascending chronological order:

#Comment Re: Apple CSAM hash collisions made: 2021-08-19 17:15:03.233037+02 by: Dan Lyke

Another hash collision: https://mobile.twitter.com/nulllzero/status/1428103864037875718