Apple Officially Cancels Its Plans to Scan iCloud Photos for Child Abuse Material

Apple has officially abandoned one of its most controversial proposals: a plan to scan iCloud images for child sexual abuse material (CSAM).