
British parents sue TikTok over children's deaths linked to 'blackout challenge'


Parents demand accountability from TikTok after tragic deaths

Ellen Roome, one of several UK parents taking legal action against TikTok, has called for greater transparency and responsibility from the social media platform following the deaths of five children allegedly connected to a dangerous online trend.

Lawsuit details and allegations

The lawsuit, filed in Delaware's Superior Court by the Social Media Victims Law Centre, accuses TikTok's parent company, ByteDance, of engineering addictive features that prioritise engagement over safety. The legal action claims the deaths of Julian "Jools" Sweeney, Isaac Kenevan, Archie Battersbee, Noah Gibson, and Maia Walsh were a "foreseeable result" of these design choices.

Roome, whose 14-year-old son Jools died in Cheltenham in 2022, is in the U.S. for the first hearing, a motion to dismiss, at which TikTok is attempting to have the case thrown out. If the lawsuit proceeds, the discovery phase would require TikTok to release the children's data, provided it has not been deleted.

Families seek answers and legislative change

Roome has been campaigning for Jools' Law, which would grant parents access to their deceased children's social media accounts. She believes TikTok holds critical data that could explain her son's death, which a coroner ruled was not a suicide but potentially linked to an online challenge.

Other families share similar suspicions. Isaac Kenevan, 13, from Basildon, died after reportedly attempting a "choke challenge," while Maia Walsh, also 13, was found dead in Hertfordshire in 2022. An inquest into Maia's death will examine her TikTok use. Louise Gibson, whose 11-year-old son Noah died in Worcestershire, has since joined the lawsuit.

Archie Battersbee's case differs slightly; a coroner concluded he died accidentally after a "prank or experiment" went wrong, with no evidence linking it to an online challenge.

TikTok's response and legal arguments

A TikTok spokesperson expressed sympathy for the families but emphasised the platform's policies against dangerous content. "We strictly prohibit content that promotes or encourages harmful behaviour," the statement said, noting that 99% of violating content is removed proactively before being reported.

TikTok is seeking to dismiss the case, arguing the Delaware court lacks jurisdiction over UK-based defendants and that U.S. law, including the First Amendment, shields platforms from liability for third-party content.

Calls for systemic change

Matthew Bergman, the lawyer representing the families, told BBC Radio 4's Today programme that online harm is a rare unifying issue across political divides. "Whether liberal or conservative, we all love our kids," he said, advocating for a combination of legislation, financial accountability, and public pressure to drive change.

"Social media companies are feeding our children harmful material. They make their products addictive by design."

Ellen Roome

Roome added that the lawsuit is "not about money" but about forcing tech companies to take responsibility for their role in children's safety.
