Apple delays controversial plan to scan iPhones for child abuse images

Silhouette of a mobile user next to the on-screen projection of the Apple logo in this portrait illustration taken on March 28, 2018.
Dado Ruvic | Reuters

Apple said on Friday it would delay a controversial plan to scan user photo libraries for child abuse images.

“Last month, we announced plans to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple stock fell slightly on Friday morning.

Apple sparked immediate controversy after announcing its system for checking users’ devices for illegal child sexual abuse material, or CSAM. Critics pointed out that the system, which can check images stored in an iCloud account against a database of known CSAM images, is at odds with Apple’s messaging about customer privacy.

The system does not scan the content of a user’s photos directly; instead, it looks for known digital “fingerprints” that match entries in the CSAM database. If the system detects enough matching images in a user’s account, the matches are reported to a human reviewer, who can verify the images and pass the information to law enforcement if necessary.
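To illustrate the general idea of fingerprint matching with a review threshold (this is a minimal sketch, not Apple’s actual NeuralHash or private-set-intersection design), the example below compares image fingerprints against a set of known hashes and flags an account for human review only after a threshold number of matches. The hash function, threshold value, and fingerprint set here are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known CSAM images, as supplied by a
# clearinghouse. Real systems use perceptual hashes that tolerate resizing
# and re-encoding, not a cryptographic hash of the raw bytes as shown here.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

MATCH_THRESHOLD = 30  # illustrative only; the real threshold is not public


def fingerprint(image_path: Path) -> str:
    """Return a fingerprint for an image file (here, SHA-256 of its bytes)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def count_matches(image_paths: list[Path]) -> int:
    """Count how many images in an account match the known-fingerprint set."""
    return sum(1 for path in image_paths if fingerprint(path) in KNOWN_FINGERPRINTS)


def needs_human_review(image_paths: list[Path]) -> bool:
    """Flag the account for human review only once enough matches accumulate."""
    return count_matches(image_paths) >= MATCH_THRESHOLD
```

In Apple’s published design, the comparison reportedly happens on the device against a blinded hash database, with cryptographic thresholding so that individual matches cannot be read until the threshold is crossed; the plain set lookup above is only a simplification of that flow.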

Apple’s CSAM detection system was due to go live for customers later this year. It’s unclear how long Apple will delay its release after Friday’s announcement.

Despite the controversy surrounding Apple’s move, this kind of scanning is common practice among tech companies. Facebook, Dropbox, Google and many others have systems that can automatically detect CSAM uploaded to their respective services.
