Apple abandons controversial plan to check iOS devices and iCloud photos for child abuse imagery
By CNN
Key Facts
- CNN — Apple is abandoning its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM) following backlash from critics who decried the feature’s potential privacy implications.
- “Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired.
- Apple was first criticized in 2021 when it announced the plan to check iOS devices and iCloud photos for child abuse imagery.
- Apple also plans to expand end-to-end encryption of iCloud data to cover backups, photos, notes, chat histories and other services, a move that could further protect user data but may also add to tensions with law enforcement officials around the world.