
Apple abandons plans to scan iCloud for CSAM images
Apple has abandoned plans to implement an image-scanning system in iCloud that would detect child sexual abuse material (CSAM), Wired reports.
While announcing new security features for its cloud storage, the company said it would not scan photos. According to an Apple spokesperson, it will pursue a different approach to combating illegal content.
“Children can be protected without corporations scanning personal data. We will continue to work with governments, civil rights advocates and other companies to help protect young people, preserve their right to privacy and make the Internet a safe place for everyone,” the spokesperson said.
Experts say the distribution of CSAM is a serious problem that grows worse every year. However, they argue that Apple's proposed technology was ill-suited to the task because of the privacy risks it posed.
In August 2021, the tech giant announced a feature that would scan users’ devices for illegal material related to sexual offences against children. Security experts criticized the plan, warning of possible abuse by authorities.
Less than four days later, Apple published a detailed explanation of how the feature would work, promising not to turn the CSAM-detection algorithm into a mass-surveillance tool.
Nevertheless, public pressure on the company to abandon the feature kept mounting; an open letter criticizing it was signed by more than 90 civil rights groups.
A month after the announcement, Apple postponed the feature’s rollout indefinitely.
As a reminder, in July 2022 UK authorities backed a plan to scan citizens’ smartphones for CSAM.
That same month, Meta announced it had begun developing a tool to detect nudity in photos sent via Instagram direct messages.