Five British families are suing TikTok over the deaths of their children in a landmark US case, according to The Independent.
The parents, who will attend the hearing in Delaware on Friday, are the first families from the UK to pursue legal action against the company in an American court over the deaths of their children.
Ellen Roome, Lisa Kenevan and Liam Walsh are attending on behalf of the families, with Ms Roome saying that “parents should not have to cross continents to fight multinational technology companies just to find out what happened to their child”.
The lawsuit alleges that TikTok’s algorithms promoted and amplified dangerous content to children, including material linked to the so-called “Blackout Challenge”. The families claim that this content contributed to their children’s deaths and that the company has repeatedly refused to release the data needed to understand what their children were exposed to in the critical period before they died.
Ms Roome, who believes her 14-year-old son Jools died after taking part in an online challenge in April 2022, previously told The Independent: “In light of what’s happened, I’ve learned an awful lot about online activity that I was very naive about before.
“I thought Jools was merrily watching silly dance videos, or harmless challenges like standing on your hands and pulling your t-shirt upside down. I now know there’s masses of harmful and illegal content. [Online safety law changes] can’t come soon enough. I don’t want any other family going through what we will have to for the rest of our lives.”
Friday’s hearing concerns a motion to dismiss, a crucial procedural stage in the case: if the motion fails, the lawsuit will proceed to discovery, where TikTok could be legally compelled to disclose internal records and the children’s account data.
The families say TikTok has not yet provided this information, despite their multiple requests.
A TikTok spokesperson said: “Our deepest sympathies remain with these families.
“We strictly prohibit content that promotes or encourages dangerous behaviour. Using robust detection systems and dedicated enforcement teams to proactively identify and remove this content, we remove 99 per cent of content found to break these rules before it is reported to us.
“As a company, we comply with the UK’s strict data protection laws.”
The company added that the “Blackout Challenge” has been blocked on TikTok since 2020, that it has never found evidence of the content trending on the platform, and that the challenge pre-dates TikTok itself.
The families say this hearing marks a “significant moment” for bereaved families seeking truth, accountability, and systemic change, and could set an important precedent for how social media companies are held responsible for harm to children.
The case comes amid growing global concern about the impact of algorithm-driven social media on children and the role of recommender systems in promoting harmful content.
With calls growing for improved safeguarding for young people online, the lawsuit will be closely watched by policymakers in the UK and internationally, as governments consider stronger regulation and accountability for technology companies operating at scale.