Apple to postpone rollout of controversial child protection features

The proposed changes will be reworked to address user privacy and security concerns

Apple is backpedaling on its plan to roll out a set of controversial new child protection features to its devices, following significant backlash from customers and experts alike.

The new features, which would only affect U.S. users and were never slated to be introduced in Canada, included a tool that would automatically scan images saved to a user’s iCloud Photos, identify child sexual abuse material (CSAM), and report it to Apple moderators, who could then contact the National Center for Missing and Exploited Children (NCMEC).

A parental control feature announced as part of the same update also came under fire; it notifies parents if their kids send or receive sexually explicit photos, and automatically blurs those images.

The features, while well intentioned, have been criticized as poorly designed and as an intrusion on users’ on-device privacy.

For context, other tech companies like Google, Microsoft and Facebook already scan their servers — but not user devices — for child abuse materials.

However, in the original announcement from Apple, the company stated that “instead of scanning images in the cloud” its new feature “performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.”
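
For a sense of what “on-device matching” against a hash database means in practice, here is a deliberately simplified sketch in Python. According to Apple’s technical summary, the real system uses a perceptual “NeuralHash” and a cryptographically blinded copy of the NCMEC database rather than plain file hashes, so the SHA-256 comparison, directory path, and function names below are hypothetical illustrations only.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes. In Apple's design the
# database holds blinded perceptual hashes supplied by NCMEC, not plain
# cryptographic digests of file bytes.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_photos(photo_dir: Path) -> list[Path]:
    """Flag any photo whose hash appears in the known-hash database."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for match in scan_photos(Path("~/Pictures").expanduser()):
        print(f"match: {match}")  # a real system would report, not print
```

The privacy debate hinges on where a check like this executes: on a company’s servers, as with Google and Microsoft, or on the user’s own device, as Apple proposed.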

Apple told The Verge on September 3rd that, “based on feedback from customers, advocacy groups, researchers and others,” it will delay the release of its child protection features in the U.S. until later this year while it conducts additional research and makes improvements.

Source: The Verge
