Massive new big tech lawsuit demands restitution for cost of ‘youth health crisis’ social media offloaded onto schools, teachers


A complaint filed Tuesday by the Charleston County School District accuses TikTok, Instagram, Facebook, YouTube, and Snapchat of creating platforms that facilitate social media addiction by encouraging “compulsive use,” putting underage users in danger of exploitation, harming adolescent users’ self-image and general mental health, and intentionally making it difficult or impossible for parents to control usage or consumption.

The complaint says that school districts have had to shoulder increased costs and responsibilities for mental health services for their students, as well as expanded disciplinary efforts to address problems that begin and play out on the social media platforms run by the companies named in the complaint.

The lawsuit, filed in federal district court in South Carolina, also alleges that the defendants “design their apps to attract and addict youth,” and that the platforms are intentionally defective in a way which “facilitate[s] and contribute[s] to the sexual exploitation and sextortion of children.”

“Facebook and Instagram owe their success to their defective design, including their underlying computer code and algorithms,” the complaint reads. The lawsuit alleges that the business model of Meta, the platforms’ parent company, is to deliver as many ads as it can to its users. As a result, it has “every incentive to—and knowingly does—addict users to Facebook and Instagram.”

The complaint cites one feature, Messenger Kids, as evidence for its argument that Meta intends to hook kids on its platforms from as young as age six. According to a 2018 Meta meeting summary, the complaint alleges, Meta’s goal is to have users “move through” the company’s platforms as they grow.

“Every detail” of Meta’s interface, the lawsuit alleges, “is designed to increase the frequency and length of use sessions.” Because so many of Meta’s algorithms are secret, the complaint alleges, the details about the “defective, addictive, and harmful design of Meta’s platforms” will be revealed during discovery.

Meta didn’t respond to questions about whether its design was intentionally addictive.

The complaint also cites Snapchat’s “Snap Streak” feature among other “defective features” that promote compulsive and excessive use. It also claims that the “Snaps” and “My Eyes Only” features encourage destructive behavior among teen users.

“Disappearing Snaps do not operate as advertised,” the complaint alleges. “This is particularly harmful to adolescents,” the complaint says, because they “rely on Snap’s representations when taking and sending photos” that the photos will disappear.

“Snap could, but does not, warn users, including children and teenagers, that Snaps may not necessarily disappear,” the complaint alleges.

The complaint also says that the “My Eyes Only” feature “unreasonably increases the risk that Snapchat’s adolescent users, many under age 13, will be targeted and sexually exploited and/or trafficked by child predators.” That feature lets users store and hide content “from their parents” in a tab that requires a passcode, according to the complaint. If a user accesses the folder with the wrong code, the content self-destructs.

“‘My Eyes Only’ has no practical purpose or use other than to hide potentially dangerous content from parents and/or legal owners of the devices used to access Snapchat,” the complaint alleges.

“We aren’t an app that encourages perfection or popularity, and we vet all content before it can reach a large audience, which helps protect against the promotion and discovery of potentially harmful material. While we will always have more work to do, we feel good about the role Snapchat plays in helping friends feel connected, informed, happy, and prepared as they face the many challenges of adolescence,” Snapchat told the Daily Dot.

The lawsuit also alleges that TikTok’s “Post-in-Private” feature facilitates the spread of child sexual abuse material [CSAM], citing press reports detailing how predators share logins for these accounts and pass them around.

“Although TikTok’s defective features enable predators, TikTok does not have any feature to allow users to specifically report CSAM,” the complaint alleges.

The complaint goes on to accuse Google and YouTube of designing their interface to “inundate users” with content and features that “maximize ‘watch time.’” This drive to maximize engagement is the common charge the school district levels against all of the companies. The result of this alleged behavior by the platforms, it says, is a “profound disruption to the educational environment of schools.”

TikTok didn’t respond to a request about whether its platform was designed to be intentionally addictive or to limit parental controls over usage.

“Protecting kids across our platforms has always been core to our work,” Google said in a statement provided to the Daily Dot. “In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls. The allegations in these complaints are simply not true.”

The conduct of the companies, the complaint alleges, “has created a public health crisis” in the Charleston County School District. Problems include a “surge” in anxiety and depression, suicide attempts, and pervasive online bullying facilitated by the social media platforms.

“The funding needed to address these harms should not fall to the taxpayers of South Carolina,” the complaint concludes. Instead, it says, the companies named in the complaint should bear the cost of remedying the wrongs their platforms created.


Source: https://www.dailydot.com/debug/social-media-addiction-charleston/