Apple wants to finally put to rest a Siri data protection scandal dating back several years. In a statement to Mac & i on Monday, the company stressed: "The audio recordings and data collected by the voice assistant are never used to create marketing profiles, nor are they sold to third parties for any purpose."
Apple is responding to renewed speculation about the misuse of Siri data as the company seeks to settle a US class-action lawsuit out of court. The plaintiffs accused the company of, among other things, passing audio recordings collected by Siri to third parties for personalized advertising.
Apple wants to end the Siri data protection scandal
The company said: "To avoid further legal disputes, Apple decided to reach a settlement in this case so that we can put behind us the concerns about the evaluation of Siri audio recordings by third parties that we already addressed in 2019." Apple also pointed out that the now-settled lawsuit produced no evidence that Siri recordings were used for targeted advertising. Siri data is used solely to improve the service, and Apple is working on making the voice assistant "even more private."
In 2019, employees of subcontractors working for Apple went public about their day-to-day work, which reportedly included evaluating Siri audio recordings under lax data protection practices. Unintentional Siri recordings sometimes contained intimate details, often involving personal information. A whistleblower later called for an EU investigation and an end to such voice assistance systems.
Unintentional Siri recordings are to be deleted
In response to the reports, Apple temporarily halted human analysis of audio recordings. Since then, Siri audio recordings have only been evaluated if the user has opted in. Even without this opt-in, Apple generally stores transcripts of Siri requests for a certain period of time. According to Apple, Siri data is tied to a random identifier rather than to the user's account. The company says it deletes recordings captured when Siri was activated inadvertently.
(lb)