TikTok’s algorithm could lead to liability for dangerous videos

TikTok can generally be held responsible for decisions made by its algorithms, the US Court of Appeals for the Third Circuit has ruled, citing the US Supreme Court's censorship ruling of July 1. The gist: if an algorithm compiles third-party content in such a way that the compilation becomes a statement in its own right, that statement is attributable to the operator of the algorithm, even if the content itself does not come from the operator. Another US federal appeals court, for the Ninth Circuit, has recently found a different route to holding operators of online services liable for third-party content.

The occasion for the Third Circuit case is tragic: the death of a ten-year-old child. Although TikTok requires a minimum age of thirteen, the child used the Chinese video app. Its algorithm recommended a video on the "For You Page" containing a life-threatening challenge (the "Blackout Challenge"): users were asked to film themselves choking themselves until they lost consciousness. The child followed the instructions and did not survive. The child's estate and mother now want to sue TikTok and its parent company ByteDance in US federal district court. The district court dismissed the suit, but the federal appeals court interpreted the law differently and sent the case back to the lower court.

Once again, the key point is the famous Section 230, part of the 1996 US federal Telecommunications Act. It grants operators of interactive computer services immunity for content that they do not provide themselves but that is posted by third parties (barring exceptions that are irrelevant here). The textbook example is a web host, which should not be held responsible for whatever nonsense its customers spread on the websites it hosts.

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

US federal law imposes no obligation to distribute third-party content. Moderation of such content was in fact the trigger for Section 230: a forum operator routinely removed postings unsuitable for minors, and a judge took this as grounds to hold the operator liable for all the postings it had not removed. Lawmakers responded with Section 230 to keep hosting services available and affordable, and to avoid forcing operators into the role of censorship police.

However, the boundary between distributing third-party content and making statements of one's own is not clear-cut; for one's own statements, of course, one has to answer. The US states of Texas and Florida want to force large online services by law to distribute content they do not wish to distribute: removing postings would be just as illegal as reducing their reach. Operators would even be barred from taking child-protection measures on their own initiative, and rewarding or favoring certain postings would likewise be prohibited.

In the view of the US Supreme Court, these state laws against censorship are themselves censorship. Operators of online services accordingly have the right to decide what they display and what they do not, even when the contributions come from third parties, because these selection decisions are themselves an expression of opinion. Even where only certain posts are blocked, the operator thereby expresses which content it disapproves of, the Supreme Court explained on July 1. And the First Amendment to the US Constitution enshrines the right to free expression, with which state laws may not interfere.

The US Court of Appeals for the Third Circuit now draws on this: if an operator uses algorithms that say something of their own ("expressive algorithms"), the operator is liable for these decisions; Section 230 only protects against liability for statements made by others. The situation is different for algorithms that make selection decisions based on user input or past usage behavior, the classic example being a search function into which the user enters search terms of their own choosing. For the resulting output, in the court's view, Section 230 does indeed provide protection.

Since the plaintiffs claim that TikTok's algorithms are of the former, expressive type, the federal appeals court holds that the federal district court could not simply dismiss the lawsuit, and has sent the case back. The district court must now clarify whether the recommendation of dangerous videos on the For You Page resulted from the child's own inputs or whether it arose from an evaluative algorithm for which TikTok may be liable; only then can it decide whether Section 230 actually bars the suit. The appeals court acknowledges that many other US courts have interpreted Section 230 more broadly, in favor of liability protection for online services.
