Despite the many benefits of social media, there is a dark side to these online platforms. Over the past several months, national lawsuits have been filed against social media giants like Meta (Facebook/Instagram), TikTok, and others. These lawsuits claim that the product designs of these platforms utilize psychologically addictive properties to target children and young people.
Some of the perils of social media, such as cyberbullying, have been evident for some time. These lawsuits, however, target the underlying dangers of social networks' addictive design and negligent influence.
We’ll cover the most recent updates in the multidistrict litigation against social titans, other cases involving the pitfalls of social media, and some concerns being raised.
Social Media Dangers: The Alarming Mental Health Epidemic
The perils of social media are nothing new. On the surface, social media platforms tout the importance of connection. However, what was once an exciting prospect has become a severe mental health hazard. Social media has not only provided an opportunity for negative influences; it has also fueled addictive, dangerous behavior.
Some concerns raised against Meta and other social media platforms could have serious consequences. A few of the concerns and claims against social media companies include:
- Adolescent addiction: Recent lawsuits claim that social media is designed to create addictive stimuli targeted toward youths. Claimants assert that constant use of social media platforms triggers the release of dopamine, which correlates with addictive behavior. Several lawsuits claim that this addictive property creates an imbalance in adolescents' undeveloped, vulnerable minds.
- Cyberbullying: The ease of access social media provides can create an avenue for cyberbullying. Because social media allows anonymity, cruel and inhumane bullying can occur seemingly without consequence. The ramifications, however, can be mentally devastating for victims.
- Negative personal image: Social media encourages users to present an edited portrayal of their best selves. When Instagram and TikTok influencers present a "perfect image" of themselves, they can damage other users' self-esteem and self-image.
- Negligent influence: In 2022, parents sued TikTok over the deaths of two young girls who viewed the "Blackout Challenge." The lawsuit claims the girls, ages 8 and 9, were found dead after repeatedly watching videos of the infamous challenge, and that TikTok intentionally curated its algorithm to serve the dangerous content to their feeds, contributing to their deaths.
- Predatorial targeting: A growing concern is that social networks target adolescents and youth to exploit them for profit. With content geared toward younger audiences, social media platforms could be preying on young people to drive engagement. Because these companies primarily generate revenue through hyper-targeted advertising, the more users they have engaging on their platforms, the higher the premium they can charge advertisers.
- Information privacy: Because social media platforms hold massive amounts of information on their users, privacy is a prominent concern among active participants. Facebook's parent company, Meta, has agreed to pay $725 million to settle a class action lawsuit over the mishandling of private information, a case that has left many users questioning the social media giants' trustworthiness with their data. More information on this litigation is below.
The negligence alleged against social media companies has prompted multiple lawsuits. As these cases unfold, social media users must weigh the benefits of online connection against its risks.
Meta, TikTok, and ByteDance Litigation
Multi-District Litigation Claims Deliberate Youth Manipulation
Meta, the parent company of Instagram and Facebook, and other social media companies are facing litigation once more. This multi-district litigation (MDL) claims that repeated use of Meta’s products results in addiction and puts adolescent users at risk. Furthermore, this lawsuit claims that Meta knowingly and willfully utilizes addictive properties in its programming to target and acquire more users who are susceptible to social media influence.
The lawsuits filed against the social networking companies claim that they willfully manipulated younger users with features and tools on their platforms designed to be addictive. These addictive properties trigger the release of dopamine to create a gratifying feeling. Lawsuits claim that Meta is culpable in the creation of social media addiction in youth.
The allegations against Meta come amid a wave of other lawsuits against social media giants. TikTok and ByteDance are currently under fire for failing to filter content considered mature or dangerous, specifically content related to suicide and self-harm. These lawsuits claim that the companies neglected to filter harmful content, which could influence young people to harm themselves.
Further claims allege that TikTok, owned by China-based ByteDance, provides misleading disclaimers regarding user privacy. This follows pressure from Washington on the short-form content application's parent company. The claim alleges that user data is not secure and could be accessed by the Chinese government. Combined with the recent data concerns surrounding Facebook and Instagram, the potential lack of security for social media users' data is alarming.
Alabama School Districts File Lawsuits Against Social Media Companies
Three Alabama school districts are filing lawsuits against social media companies Meta, TikTok, YouTube, and Snapchat. The lawsuit claims that these online platforms have contributed to the growing mental health crisis within the school system by implementing strategies that make their platforms more addictive.
According to a district representative, the growing mental health concern has forced schools to implement mental health programs and resources to combat the effects of social media. The lawsuit claims that these social media companies lack proper safeguards to protect young people from harmful content.
Tuscaloosa, Montgomery, and Baldwin County School Districts Spearhead Legal Action Against Social Media
Tuscaloosa, Montgomery, and Baldwin County school systems have filed lawsuits against the social media giants. These lawsuits claim that Meta, TikTok, YouTube, and Snapchat exploit young people for profit and blatantly disregard social media's mental health effects on adolescents, which appear in the form of depression, low self-esteem, dangerous influence, and even suicide.
Tuscaloosa City Schools Increase Spending to Help Mental Health Crisis
Tuscaloosa City Schools Superintendent Dr. Mike Daria says, "The Tuscaloosa City Schools, much like other school systems across the U.S., have witnessed the mental health crisis that is occurring in our youth." Daria goes on to say that Tuscaloosa City Schools have had to increase spending on mental health programs due to the influence of social media.
The school districts are among some of the first in the nation to file such claims against the social media companies.
Parents Sue TikTok Claiming the App Influenced Their Son’s Suicide
In a truly horrific case, Long Island parents are suing the short-form video company over the wrongful death of their 16-year-old son. The couple claims that the platform prompted over 1,000 videos on suicide, depression, and self-harm to appear in their son's personal feed. Although their son never searched for these terms, the couple claims, the application's content influenced him to take his own life in 2022.
“TikTok Ignored Our Son’s Search for Uplifting Content”
Michelle and Dean Nasca claim that the TikTok algorithm directed depressing videos about suicide to their son's curated feed, which influenced his suicide in 2022. The Nasca family claims their son never searched for terms related to suicide, yet the TikTok algorithm supplied dark content to his personalized "For You" page. The plaintiffs further say that TikTok "ignored" their son's many searches for uplifting content in the weeks leading up to his death. After sending a Snapchat to a friend saying, "I can't do this anymore," the Nascas' son stepped in front of a train, taking his own life.
The Long Island couple seeks to hold TikTok accountable for promoting dangerous images and glorifying suicidal content. The Nasca family hopes to help prevent TikTok from potentially influencing others to take their own lives through the content that the platform allegedly directed to their son’s feed.
The Mounting Disenchantment with Social Media
The Nasca lawsuit adds to the looming questions about TikTok's data security. Mounting concern in Washington suggests that China might use the short-form media application to gather information on the United States, and dozens of states have banned or restricted the application as a result. The death of the Long Island teenager may add momentum to calls for a nationwide ban on the platform.
Meta’s Mishandling of Private Information
Social media titan Meta (Facebook/Instagram) has agreed to a $725 million payout to resolve a class action lawsuit over data mishandling. The social media company allowed third-party businesses access to private user data and personal information. The lawsuit claims that upwards of 87 million users may have been affected by Meta's mishandling of their personal data.
The $725 million payout is potentially the largest achieved to date in a data privacy class action lawsuit. While Meta did not admit to any fault in the settlement, the massive payout does set a precedent in this historic case.
TikTok’s Failure To Keep Children’s Data Safe on Platform
TikTok was recently fined $15.9 million in the United Kingdom for allegedly mishandling the data of children under the age of 13. In the United Kingdom, the Information Commissioner's Office (ICO) is responsible for upholding public information rights and regulating data use. The ICO claims that more than 1 million UK children under 13 had their data compromised or improperly used through TikTok. The agency contends that personal data was used without parents' knowledge or consent and that the social media platform neglected to install appropriate protections for young users.
Social Media Neglects Child Data Protection
The ICO Children's Code in the UK requires websites to obtain parents' or caregivers' permission before offering online services to children under 13 years old. The ICO claims that TikTok failed to adhere to data protection laws and violated users' privacy.
According to the ICO investigation, concerns about the data of users under 13 were raised internally with senior employees. However, those concerns were not adequately acted upon.
Talk About Your Legal Options
If you or someone you love has been a victim of social media negligence, you deserve to have your voice heard. Our team of experienced attorneys can help you understand your rights and legal options. Every day, we take on corporations that abuse their influence and authority, and we have the resources and skills to handle your case. Explore your legal options and speak with one of our attorneys today.