
Apple delays plans to roll out CSAM detection in iOS 15

Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology, which it chaotically announced last month, citing feedback from customers and policy groups.

That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers opposing the plans. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.

In a statement on Friday morning, Apple told TechCrunch:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

More soon…

Read more:


  • Apple confirms it will begin scanning iCloud Photos for child abuse images
  • Apple’s CSAM detection tech is under fire — again
  • Apple details child abuse detection and Messages safety features
