Apple may drop controversial CSAM image scanning program



Apple raised a lot of eyebrows earlier this year when it announced a multi-pronged plan to tackle child sexual abuse through several new technologies that would be implemented in iOS 15. The most controversial was a program that would scan users’ iCloud photo libraries for CSAM, which stands for Child Sexual Abuse Material. With the rollout of iOS 15.2 this week, Apple implemented one of those expected features, the ability to detect nude photos in the children’s version of Messages, but the aforementioned scanning technology was notably absent. All references to the image scanning portion of Apple’s plan now appear to have been removed from its website, leading people to question whether the backlash has led Apple to scuttle the technology for good.

Previously, Apple had said it was simply delaying the launch of the technology in response to criticism, explaining that it needed time to listen to feedback and revise its implementation, according to MacRumors. In September, it issued the following statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

However, rather than saying those plans are still in place, MacRumors writes, the company has simply wiped every mention from its child safety website. That page now discusses only the newly launched nudity detection feature for Messages, which is not enabled by default and arrived in iOS 15.2. As we noted in our coverage, “…when implemented on a device with a Family Sharing account, it will check for nudity in images sent and received by the Messages app. If nudity is detected, Messages scrambles the image and displays an onscreen warning, which explains the dangers of sharing explicit photos and asks if the viewer wishes to continue.” So far, this technology seems to have launched without much fuss, but the week is not yet over.

Apple rolled out nudity detection for children using its Messages app this week, apparently with little reaction.
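To make the described flow concrete, here is a minimal sketch in Swift of how an on-device check like this could be structured. Apple has not published the API behind this feature, so every type and function name below (NudityClassifier, handleIncomingImage, and so on) is hypothetical; this only illustrates the sequence the coverage describes: classify on-device, scramble the image, warn, and ask whether to continue.

```swift
import Foundation

// Hypothetical sketch only; none of these types are Apple's actual
// (unpublished) API. Everything here runs on-device, matching Apple's
// description: nothing is sent to Apple or flagged externally.

/// Stand-in for an on-device ML classifier (assumed, not a real API).
protocol NudityClassifier {
    func containsNudity(_ imageData: Data) -> Bool
}

enum MessageImageAction {
    case displayNormally
    case blurWithWarning(String)
}

/// Decide how to present an incoming image on a child's device.
func handleIncomingImage(_ imageData: Data,
                         isChildAccount: Bool,
                         classifier: NudityClassifier) -> MessageImageAction {
    // Per the article, the feature is off by default and only applies
    // to child accounts in a Family Sharing group.
    guard isChildAccount, classifier.containsNudity(imageData) else {
        return .displayNormally
    }
    // The image is scrambled and the child sees a warning explaining
    // the risks, with the choice to view the photo anyway.
    return .blurWithWarning(
        "This photo may be sensitive. Are you sure you want to view it?"
    )
}
```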

Interestingly, critics of Apple’s iCloud scanning technology have put forward what essentially boils down to a “slippery slope” argument: if Apple can design an algorithm that searches for X, what prevents it from searching for Y and Z in the future? As the Electronic Frontier Foundation put it, “all it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” There were also concerns that governments would co-opt Apple’s technology to monitor their citizens, something the company vehemently promised it would never allow.

Finally, although Apple removed mentions of CSAM from its Child Safety portal, we were still able to find the original text Apple posted when it announced the initiative, so perhaps the company simply forgot about that PDF. What’s notable, however, is that the recently updated child safety pages mention only nudity detection for images in Messages, not CSAM. Despite removing references to the controversial technology from its website, an Apple spokesperson told The Verge that the company’s plans have not changed and the feature is still merely delayed.
