An appeals court has reinstated a lawsuit against TikTok, overturning a lower court's ruling that Section 230 immunity shielded the short-video app from liability after a child died while participating in a dangerous “blackout challenge.”
Several children died while participating in the "Blackout Challenge," in which, as Third Circuit Judge Patty Shwartz described in her opinion, participants were encouraged to "strangle themselves with belts, purse strings or the like until they blacked out."
Because TikTok promoted the challenge in children's feeds, Tawainna Anderson was among the grieving parents who sued TikTok in 2022. She was ultimately told that TikTok was not liable for recommending the video that caused the death of her daughter, Nylah.
In her opinion, Shwartz wrote that Section 230 does not prevent Anderson from arguing that TikTok's algorithm aggregates videos from third parties, "resulting in an expressive product" that "communicates to users [that a] curated stream of videos will be interesting to them."
Shwartz cited a recent Supreme Court ruling that said "a platform's algorithm that reflects 'editorial judgments' about how to assemble the third-party speech it wants, in the manner it wants, is the platform's own 'expressive product' and therefore protected by the First Amendment."
Because TikTok's For You Page (FYP) algorithm decides which third-party speech to include or exclude and how to organize it, the algorithm counts as TikTok's own "expressive activity." That activity is not shielded by Section 230, which protects platforms from liability only for third-party speech, not for the platforms' own speech, Shwartz wrote.
The appeals court has now remanded the case to the district court to decide Anderson's remaining claims.
Section 230 does not permit "indifference" to the death of a child
According to Shwartz, the platform would not have been liable had Nylah discovered the "Blackout Challenge" video by searching on TikTok; but because the video surfaced on her FYP, TikTok became a "positive promoter of such content."
TikTok must now face Anderson's claims that are "based on TikTok's algorithm," Shwartz said, as well as any other claims Anderson may be able to raise anew. The district court must determine which of her claims remain barred by Section 230, "consistent" with the Third Circuit's decision.
Circuit Judge Paul Matey, concurring in part, found that TikTok knew of the dangers at the time Nylah took part in the Blackout Challenge and "took no and/or wholly inadequate measures to contain the Blackout Challenge and prevent its spread and, in particular, to prevent the Blackout Challenge from being shown to children on their FYPs."
Matey wrote that Section 230 does not protect companies “from virtually all claims that are only loosely related to content posted by third parties,” as TikTok apparently believes. He argued for a “far narrower” interpretation of Section 230 to prevent companies like TikTok from reading the Communications Decency Act as if it permitted “casual indifference to the death of a 10-year-old girl.”
“Anderson's estate could seek damages for TikTok's knowing distribution and targeted recommendation of videos that it knew could be harmful,” Matey wrote. That includes pursuing “lawsuits seeking to hold TikTok liable for continuing to host the Blackout Challenge videos despite knowing they were causing the deaths of children” and “lawsuits seeking to hold TikTok liable for its targeted recommendations of videos that it knew were harmful.”
“The company may choose to curate the content it presents to children in a way that highlights the lowest virtues and the most sordid tastes,” Matey wrote. “But it cannot invoke immunity that Congress has not granted.”
Anderson's lawyers at Saltz Mongeluzzi & Bendesky PC, including Jeffrey Goodman, had provided Ars with a statement following the lower court's 2022 ruling, indicating that the parents were not prepared to give up the fight.
“The federal Communications Decency Act was never intended to allow social media companies to send dangerous content to children, and the Andersons will continue to fight to protect our children from an industry that exploits youth in the name of profit,” the attorneys said.
TikTok did not immediately respond to Ars' request for comment, but previously vowed to “stay true to our commitment to user safety” and “promptly remove Blackout Challenge content if found.”