Can Social Media Companies Evade Accountability?
Social media — apps like Facebook, Instagram, Snapchat, Twitter, YouTube, and TikTok — has become an integral part of our daily lives and transformed the way we communicate, connect with others, and consume information.
And while social media can provide many benefits, it also has potential negative impacts on mental health: it can erode a user’s self-esteem, foster a sense of isolation, and contribute to anxiety and depression. PBS NewsHour addressed these issues in a February 14, 2023, segment called “Social Media Companies Face Legal Scrutiny Over Deteriorating Mental Health Among Teens.”
A 2022 Pew Research Center survey of nearly 750 13- to 17-year-olds found that 97% of teens use the internet daily and 35% use at least one social media app almost constantly. Although social media gives teens a way to connect with friends, learn, explore their interests, and find entertainment, these activities often come at a cost to their sleep, school performance, participation in extracurricular activities, and opportunities for face-to-face social interaction. And because it is difficult to monitor a teenager’s smartphone use around the clock, parents may not realize that 24/7 connectivity with peers frequently means 24/7 exposure to bullying, which can lead to depression, self-harm, eating disorders, or even suicide.
Content posted on these apps is currently considered “third-party” content, which means that under Section 230 of the 1996 United States Communications Decency Act (CDA), social media companies are shielded from liability for it. As a result, they are not responsible for harm caused by the content on their apps. To illustrate, if a teen dies by suicide as the result of cyberbullying on Instagram, Instagram cannot be held liable.
Several new lawsuits, however — including one being considered by the U.S. Supreme Court — may begin holding social media companies accountable for the harm they knowingly cause their users.
The Supreme Court is set to hear arguments on February 21, 2023, in Gonzalez v. Google, a case involving ISIS recruitment videos that will test how the CDA applies to social media companies. The decision could change the entire landscape of online content going forward.
Additionally, a California court could soon decide whether the algorithms that promote and recommend content on social media sites can be considered defective products. If the court allows In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation to proceed, the plaintiffs will be permitted to test a novel legal theory: that these algorithms are defectively designed because they are built to be addictive, and that the social media companies that develop them can be sued under product liability laws.
Should the plaintiffs win, the implications would be widespread, impacting the development and regulation of software, as well as the ways in which the upcoming generation of users engages with social media.