Reporting of images of abuse: NGO criticizes Apple for “underreporting”



In the UK, Apple has been accused of failing to report abuse content, or of drastically underreporting it. As the Guardian reported this week, the National Society for the Prevention of Cruelty to Children (NSPCC) is criticizing the iPhone maker for allegedly engaging in massive "underreporting."



According to data obtained by the NSPCC under the UK's Freedom of Information Act, Apple was implicated in a total of 337 recorded offences involving child abuse images in England and Wales between April 2022 and March 2023. Yet according to the NCMEC annual report, Apple reported only 267 cases of abuse material across all its platforms to the American National Center for Missing and Exploited Children (NCMEC) in the whole of 2023. That is far fewer than other large tech companies are said to have reported: Google with 1.47 million cases and Meta with 30.6 million.

Richard Collard, head of child safety online policy at the NSPCC, said: "There is a worrying discrepancy between the number of child abuse image crimes taking place on Apple's services in the UK and the almost negligible number of global reports of abuse content they make to authorities." Apple is "clearly behind" other tech companies in this area, he said.

Apple declined to answer the Guardian's questions, referring instead to its guidelines. After widespread criticism that it could undermine encryption or have other negative privacy implications, the company decided not to launch a program that would have directly scanned iCloud photos for CSAM. The Los Angeles-based child protection organization Heat Initiative said Apple "does not detect CSAM at all" across the majority of its environments and "has not invested enough in trust and safety teams."


The NSPCC now fears that Apple's rollout of AI features under the Apple Intelligence banner could become a problem by enabling computer-generated abuse images. However, the company is likely to use filters here and is in any case more cautious than its competitors: as Apple has announced, its tools cannot generate photorealistic images. Still, according to the Heat Initiative, Apple remains "a black hole" when it comes to tracking CSAM.



