David French makes an emotional appeal to hold TikTok liable for the tragic asphyxiation death of a 10-year-old girl, Nylah Anderson, who took the so-called "blackout challenge." It is, without a doubt, tragic and horrible, as the facts make plain.
In 2021, a 10-year-old girl named Nylah Anderson was viewing videos on TikTok, as millions of people do every day, when the app’s algorithm served up a video of the so-called blackout challenge on its “For You Page.” The page suggests videos for users to watch. The blackout challenge encourages users to record themselves as they engage in self-asphyxiation, sometimes to the point of unconsciousness. Nylah saw the challenge, tried it herself and died. She accidentally hanged herself.
Nylah's mother sued, and the Third Circuit held that the suit could proceed. At Techdirt, Mike Masnick explains where the circuit went terribly wrong in its decision.
We’ve already hit (and not for the last time) the key problem with the Third Circuit’s analysis. “Given … that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms,” the court declared, “it follows that doing so amounts to first-party speech under [Section] 230, too.” No, it does not. Assuming a lack of overlap between First Amendment protection and Section 230 protection is a basic mistake.
Section 230(c)(1) says that a website shall not be “treated as the publisher” of most third-party content it hosts and spreads. Under the ordinary meaning of the word, a “publisher” prepares information for distribution and disseminates it to the public. Under Section 230, therefore, a website is protected from liability for posting, removing, arranging, and otherwise organizing third-party content. In other words, Section 230 protects a website as it fulfills a publisher’s traditional role. And one of Section 230’s stated purposes is to “promote the continued development of the Internet”—so the statute plainly envisions the protection of new, technology-driven publishing tools as well.
But David focuses not on the scope of Section 230's protection as a whole, but on the need to prevent tragedy.
But does TikTok have any responsibility? After all, it not only hosted the video. According to the claims in the legal complaint Nylah’s mother filed, TikTok’s algorithm repeatedly put dangerous challenges on Nylah’s For You Page. To continue with the offline analogy, imagine if an adult walked up to Nylah after school and said, “I know you, and I know you’ll like this video,” and then showed her a blackout challenge performed by somebody else.
In that circumstance, wouldn’t we hold the adult who presented the video to Nylah even more responsible than the person who actually made the video? The very fact that the recommendation came from an adult may well make Nylah more susceptible to the video’s message.
Algos aren't people, much less adults making adult decisions for children. Algos don't think. They don't feel. They don't care. They are merely a set of coded instructions to give a user other content that, based upon what the user watches, would be of interest to the user. Granted, their purpose can be nefarious, to keep a user using, and using, and using, but that's not the complaint.
In the offline world, the adult who presented the video to Nylah could well be liable for wrongful death, and no amount of objections that he just showed the child a video made by someone else would save him from liability. After all, he approached the child of his own volition and offered her the video unsolicited. That was the adult’s own speech, and adults are responsible for what they say.
In the offline world, an adult would have an appreciation of why certain content is inappropriate, even dangerous, for a particular user, such as a 10-year-old girl. If an adult pushed a dangerous TikTok on a child, the adult would certainly be responsible for the conduct. But algos make no determination of appropriateness. How could code make such a subjective determination? It’s just code.
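To see just how mechanical that is, consider a minimal sketch of what such "coded instructions" look like, assuming a toy tag-matching recommender; the function, data structures and tags here are hypothetical illustrations, not TikTok's actual system. The code ranks candidate videos by predicted interest, and nowhere in it is there any notion of who the user is or whether the content might harm her.

```python
from collections import Counter

def recommend(watch_history, candidates, k=3):
    """Hypothetical engagement-based recommender: rank candidate videos
    by tag overlap with what the user already watched. Nothing here
    models age, safety, or appropriateness."""
    # Count how often each tag appears in the user's watch history.
    tag_counts = Counter(tag for video in watch_history for tag in video["tags"])

    # Score each candidate purely by predicted interest (tag overlap).
    def score(video):
        return sum(tag_counts[tag] for tag in video["tags"])

    return sorted(candidates, key=score, reverse=True)[:k]

# A toy run: the "recommendation" is arithmetic over tags, nothing more.
history = [{"tags": ["challenge", "stunts"]}, {"tags": ["challenge", "dance"]}]
candidates = [
    {"id": "a", "tags": ["cooking"]},
    {"id": "b", "tags": ["challenge", "stunts"]},  # scores highest: matches history
    {"id": "c", "tags": ["dance"]},
]
print([v["id"] for v in recommend(history, candidates)])  # ['b', 'c', 'a']
```

The sketch surfaces more of whatever the user already watched. That is the whole of its "judgment."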
As David notes, the Supreme Court's 2024 Moody decision includes dicta suggesting that algorithms are expressive speech protected by the First Amendment.
But with legal rights come legal responsibilities. The First Amendment doesn’t permit anyone to say anything they’d like. If I slander someone, I can be held liable. If I traffic in child sex abuse material, I can be put in jail. If I harass someone, I can face legal penalties. Should the same rules apply to social media companies’ speech, including to their algorithms?
The Third Circuit said yes. One Obama appointee and two Trump appointees held that TikTok could be held potentially liable for promoting the blackout challenge, unsolicited, on Nylah’s page. It couldn’t be held liable for merely hosting blackout challenge content — that’s clearly protected by Section 230 — nor could it be held liable for providing blackout challenge content in response to a specific search.
It's understandable that David employs analogies to construct an argument for liability for a heartbreaking tragedy, as there is no legal argument that could sustain the position. Invoking the traditional and extremely limited exceptions to the First Amendment is a common gambit for those who seek to extend the parameters of prohibited speech to new realms and new technologies, because not doing so can result in terrible consequences. "Do it for the children" is, perhaps, the most effective emotional appeal around, and has served as the basis for many limitations on freedom lest a single child be harmed.
Could the internet survive without algos? Sure, even though they alert users to content of interest they might otherwise never find. But that does not make algos into sentient adults with the capacity to make relative judgments as to what will cause any individual harm. It's horrific that Nylah Anderson died over something as utterly idiotic as the "blackout challenge." The maker of that TikTok bears responsibility. Nylah's parents bear responsibility. The algo, however, is not to blame for the mechanical act of giving a user what the user wants.