A lawsuit seeks to force Apple to take stronger action against the distribution of abusive content in iCloud. According to the complaint filed in a US federal court, the company knew it had a “devastating problem” with child sexual abuse material (CSAM) yet remained inactive. The plaintiff is a 9-year-old girl who, according to the filing, is herself a victim of child abuse: strangers contacted her via iMessage and asked her to create CSAM of herself and share it via iCloud (Jane Doe v. Apple, United States District Court, Northern District of California, Docket No. 5:24-cv-510).
Demand for scanning of iCloud content
The case relies primarily on an iMessage exchange involving a high-ranking Apple executive, who wrote to a colleague that the company’s data security efforts made it “the best platform to distribute child pornography, etc.” This internal conversation from 2020 became public through Epic Games’ major lawsuit against Apple.
In particular, Apple is accused of no longer scanning iCloud content, even though technology such as PhotoDNA exists and other cloud giants rely on it. The complaint argues that most iCloud content is not protected by end-to-end encryption and could therefore still be scanned, and that the manufacturer could integrate a reporting mechanism for encrypted content. It also criticizes that iCloud makes it too easy to share photo albums and, allegedly, to erase traces afterwards.
The lawyers write that the lawsuit is not directed against end-to-end encryption or data protection in general. They are seeking certification as a class action, millions in damages, and a set of requirements obliging Apple to scan iCloud for abusive content.
Apple cancels planned CSAM scan
Three years ago, Apple announced CSAM detection technology that would scan iCloud photos locally on the iPhone. The plan was shelved, however, after heavy criticism from customers, security researchers, civil rights activists, and data protection advocates. Such a hybrid approach was ultimately “impossible to implement without endangering user security and privacy,” Apple’s privacy chief later conceded, responding to criticism from a US child protection initiative that continued to call for the technology to be deployed. The new lawsuit likewise faults Apple for abandoning the highly controversial project.
Recently there has also been criticism from Great Britain: according to a child protection organization, Apple, unlike competitors such as Google and Meta, reports only a very small number of CSAM cases found on its servers and invests very little in protective features. Apple itself has recently pointed to the nudity filter integrated into its operating systems as a protective function; it automatically blurs nude images and only displays them after a tap.
(lbs)