Encrypted messenger iMessage: Users will soon be able to report nude photos to Apple

Apple is integrating a reporting function for nude content into its encrypted messenger iMessage. As the Guardian reports, starting with iOS 18.2, users will for the first time be able to report photos and videos that the operating system has locally identified as nude content directly to Apple. The function is part of the "Communication Safety" feature built into all Apple operating systems, which is now enabled by default for children under 13. Parents can also set it up on teenagers' devices and optionally activate it for themselves – or turn it off.

If the system detects a nude photo received via iMessage, the image is automatically blurred and the recipient is warned about the sensitive content. A new option at this point will allow the received material to be reported to Apple. Alongside the photo or video in question, "limited surrounding text messages" and the name or account of the person reporting are also transmitted to the company. According to an iPhone notification dialog published by the Guardian, Apple will review the material and may take action. This includes blocking iMessage accounts and possibly even notifying law enforcement authorities.

According to the Guardian, the new feature will launch first in Australia, where new regulatory requirements for messaging and cloud services will soon come into effect. A worldwide rollout of the new reporting function is planned, however.

Apple originally opposed the bill in Australia (as it has similar legislation in other countries), arguing that it threatened the confidentiality of communication provided by end-to-end encryption. The law as passed gives providers more leeway in how they report illegal content – without mandating a backdoor into the encryption.

To better combat child sexual abuse material (CSAM), Apple considered several years ago scanning iCloud Photos locally on the iPhone and automatically reporting any CSAM found to Apple in the background. After heavy criticism from customers, security researchers, and civil rights activists, the company abandoned the project.

Child protection functions planned in parallel, such as the nudity filter, were modified and eventually integrated into the operating systems. Users can still choose to view such obscured images by tapping through. Since iOS 18, the Screen Time passcode – ideally known only to the parents – must additionally be entered on devices of children under 13.

Apple has recently been accused from various quarters of not doing enough against CSAM, particularly in iCloud. According to the allegations, abuse material is also distributed there via shared albums. A US class action lawsuit accuses the company of ignoring such material.


(lb)
