Apple delays controversial plan to scan iPhones for child exploitation images

FAN Editor

Silhouette of a mobile user seen next to a screen projection of the Apple logo in this picture illustration taken March 28, 2018. (Dado Ruvic | Reuters)

Apple on Friday said it would delay a controversial plan to scan users’ photo libraries for images of child exploitation.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple immediately stirred controversy after announcing its system for checking users’ devices for illegal child sex abuse material, or CSAM. Critics pointed out that the system, which can check images stored in an iCloud account against a database of known CSAM imagery, was at odds with Apple’s messaging around its customers’ privacy.

The system does not inspect the content of a user’s photos directly. Instead, it derives digital “fingerprints” from images and compares them against a database of fingerprints of known CSAM. If enough matching images accumulate on a user’s account, the account is flagged to a human reviewer, who can confirm the imagery and pass the information along to law enforcement if necessary.
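To illustrate the general idea of threshold-based fingerprint matching, here is a minimal sketch. It is not Apple’s implementation, which uses a perceptual hash (NeuralHash) and on-device cryptographic protocols; the hash function, threshold value, and database contents below are stand-ins chosen only to keep the example self-contained.

```python
import hashlib

# Stand-in database of known fingerprints. A real system would hold
# perceptual hashes supplied by child-safety organizations.
KNOWN_FINGERPRINTS = {"a1b2c3", "d4e5f6"}  # placeholder values

# Illustrative threshold: no single match triggers review; an account is
# only flagged once enough matches accumulate.
MATCH_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Derive a fingerprint for an image.

    A real system would use a perceptual hash that tolerates resizing and
    recompression; SHA-256 is used here only to keep the sketch runnable.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photo_library: list[bytes]) -> int:
    """Count how many photos match the known-fingerprint database."""
    return sum(1 for img in photo_library if fingerprint(img) in KNOWN_FINGERPRINTS)


def should_flag_for_review(photo_library: list[bytes]) -> bool:
    """Flag the account for human review only once the match count reaches the threshold."""
    return count_matches(photo_library) >= MATCH_THRESHOLD


if __name__ == "__main__":
    # Toy usage: an empty library produces zero matches and is not flagged.
    print(should_flag_for_review([]))  # False
```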

This is breaking news. Please check back for updates.
