Lawsuits Abound Against Social Media Giants
Social Media Being Sued for Addiction?
All of the major social media companies rely on addictive algorithms and promote harmful content to increase engagement and keep users online as long as possible. The following companies are named in the social media addiction lawsuits:
Meta Platforms
Meta is the owner of Facebook, Instagram, and WhatsApp. Through addictive algorithms and the promotion of harmful content, Facebook and Instagram lure young users into staying on their platforms for hours and continually checking their social media throughout the day.
Our Facebook addiction lawyers have filed multiple lawsuits against Meta for causing mental health harm in children and teens.
TikTok
TikTok is an incredibly addictive platform that serves users numerous types of harmful content, including the dangerous “blackout challenge.” This practice of encouraging users to strangle themselves until they lose consciousness continues to cause the preventable deaths of young children.
The Social Media Victims Law Center represents more than 10 families of children who died while attempting the blackout challenge in TikTok lawsuits. Plaintiffs include a 12-year-old girl who developed an eating disorder that led to hospitalization and permanent damage to her reproductive organs after TikTok detected her interest in exercise and repeatedly served her videos promoting anorexia.
Snapchat
Snapchat is a popular social media platform among youth that is well-known for its disappearing messages, a feature that makes the platform appealing to sexual predators.
Our law firm represents parents whose children have suffered the following harm:
- Sexual abuse
- Suicide, including cases posted live on Snapchat
- Exposure to sexually explicit material through disappearing messages
- Connections with drugs that were laced with fentanyl, resulting in overdose deaths
- Development of a negative body image and harmful eating behaviors
Learn more about our current lawsuits against Snapchat.
Discord
We are currently involved in a case against Discord in which a sexual predator contacted a young girl on Roblox, a children’s gaming app, and encouraged the girl to move to Discord, where she was exploited by several men.
Why Are Social Media Platforms Liable for Addiction?
Plaintiffs allege that the social media companies owed a heightened duty of care because the complaints involve minors. According to the complaints, the social media companies knew or should have known that their products could cause harm, yet they failed to mitigate the risk of harm or warn users about it.
The complaints in the current lawsuits seek to hold social media companies liable on the basis of strict liability and negligence for the following:
- Algorithms that promote compulsive use
- Never-ending feeds
- Lack of warnings when users are signing up
- Lack of any method to monitor and self-restrict length and frequency of use
- Barriers to voluntarily deleting or deactivating accounts
- Lack of meaningful age verification processes
- Lack of effective parental controls or monitoring mechanisms
- Lack of labels on filtered images and videos
- Intrusive notification timing designed to lure users back to the platforms