Apple Reportedly Working on Problematic iOS Tool to Scan for Child Abuse Photos on iPhones

Apple is reportedly poised to announce a new tool intended to help identify child abuse imagery in photos on a user’s iPhone. The tool would reportedly use a “neural matching function” to detect whether images on a user’s device match known child sexual abuse material (CSAM) fingerprints. While it appears that Apple has taken user…

Read more…

via Gizmodo
