After a year in limbo, Apple quietly kills its controversial CSAM photo-scanning feature
Jan 25, 2023
1 minute

Last year, Apple announced plans to help combat child sexual abuse in several areas. As part of that effort, the company implemented a new parental control feature for Messages to help prevent children from seeing sexually explicit images. A second feature, which would have scanned photos to see if they matched known child sexual abuse material (CSAM), was delayed.