Seven French Families Sue TikTok for Content Shown to Teens

Written by: Sarah Hardacre
Reviewed by: Alexandros Melidoniotis
19 November 2024
Seven families in France have come together to file a lawsuit against TikTok, alleging that the content shown to their adolescent children on the platform was harmful and led to the deaths of two of the children.

The lawsuit claims that TikTok’s algorithm displayed content related to suicide, self-harm, and eating disorders, causing “direct damage.”

The families’ lawyer, Laure Boutron-Marmion, said, “The parents want TikTok’s legal liability to be recognized in court.” She added, “This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product’s shortcomings.”

TikTok has not commented directly on the case but has said the platform has more than 630 French-speaking moderators. It also reminded parents that they can control their teens’ accounts, a feature that has been available since 2020.

TikTok also stated that its community guidelines prohibit content promoting or sharing plans for suicide or self-harm and that it uses both technology and teams of moderators to enforce these policies.

Of the seven families involved, two had daughters who died by suicide, four had daughters who attempted to take their own lives, and at least one of the girls had an eating disorder.

This is not TikTok’s only legal challenge, nor is it the only social media platform facing legal issues over the safety of young users.

A judge recently ruled that Meta must face a lawsuit in Massachusetts alleging that the company failed to protect young users from the addictive design of its platform.

Earlier this year, the European Commission opened a formal investigation into Meta to determine whether its platforms comply with the Digital Services Act (DSA) rules on the protection of minors.

TikTok is also facing a lawsuit in the US alleging that it was responsible for the death of a 10-year-old girl who copied a dangerous online challenge from her TikTok feed.
