A Tennessee school district became the latest education system to sue several social media companies over growing mental health concerns among students, joining more than 40 districts nationwide demanding accountability from Big Tech.
The Clarksville-Montgomery County School System (CMCSS) reportedly filed a lawsuit against Meta, Facebook, Instagram, TikTok, Snapchat, Google, WhatsApp, and YouTube over the “damages and growing mental health crisis among students.”
“Over the past few years, we have observed and experienced a rise in mental health issues, threats of school violence, cyberbullying, inappropriate content, and other challenges, damages and disruptions linked to students’ use of social media – and the lack of protections and controls thereof,” CMCSS Director Dr. Jean Luna-Vedder said in a news release reported by local media.
“Without cooperation and support from social media companies, CMCSS has been fighting an uphill battle. We need to protect our children, our schools and our society,” Luna-Vedder added.
Attorneys at the Lewis Thomason law firm, which is representing the district, said issues stemming from the platforms had caused disruptions, increased costs, and safety concerns in the school district.
“The Clarksville-Montgomery County School System is taking a brave and proactive step to seek accountability and marked changes in the way social media giants interact with children,” Chris McCarty of Lewis Thomason told a local Fox affiliate.
According to a Daily Mail analysis, more than 40 school districts across the U.S. have filed lawsuits against social media giants, including Facebook, Snapchat, and TikTok, alleging the platforms knowingly cause harm to children with “malicious” algorithms.
Children have been exposed to dangerous social media trends, including the “Blackout Challenge,” which encourages minors to choke themselves until they lose consciousness. Other trends promote content glorifying self-harm, including suicide, self-injury, and eating disorders.
Earlier this year, Seattle Public Schools officials filed a 91-page lawsuit in U.S. District Court against Facebook, TikTok, Google, Snapchat, and YouTube, arguing that the platforms have created a public nuisance affecting the district.
“Defendants have successfully exploited the vulnerable brains of youth, hooking tens of millions of students across the country into positive feedback loops of excessive use and abuse of Defendants’ social media platforms,” the lawsuit reads. “Worse, the content Defendants curate and direct to youth is too often harmful and exploitive.”
The lawsuit accuses the companies of promoting content such as a “corpse bride” diet and eating only 300 calories a day, as well as content encouraging self-harm. Other mental health harms the suit accuses Big Tech of cultivating among young people include anxiety, cyberbullying, and suicide.
U.S. Surgeon General Vivek Murthy issued an advisory in May citing a link between time spent on social media and the nation’s youth mental health crisis.
“I’m issuing this advisory because we’re in the middle of a youth mental health crisis, and I’m concerned that social media is contributing to the harms that kids are experiencing,” Murthy told The Hill.
According to a 2019 study, adolescents between the ages of 12 and 15 who spent more than three hours a day on social media faced double the risk of developing symptoms of depression and anxiety. Murthy said the blame should not fall entirely on parents who attempt to manage a healthy amount of social media for their children.
“It’s an unreasonable expectation because prior generations never had to experience and manage the rapidly evolving technology that fundamentally changed how kids thought about themselves, how they thought about their friendships and how they saw the world,” said Murthy.
Antigone Davis, head of safety at Meta, told the Daily Mail that the platform aims to reassure every parent that it has developed more than 30 tools to support teens and their families with safe and supportive experiences online.
Those tools include letting parents decide when and for how long their teens use Instagram, age-verification technology, automatically setting accounts of users under 16 to private when they join Instagram, and notifications encouraging teens to take regular breaks.
The company also says it has invested in technology that finds and removes content related to suicide, self-injury, or eating disorders before users report it on the app.
“These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families,” Davis said.
A Snapchat spokesperson said the company curates content from known creators and publishers and uses human moderation to review user-generated content before it can reach a large audience, which the company claims reduces the spread and discovery of harmful content.
“Nothing is more important to us than the well-being of our community,” the spokesperson told the Daily Mail.