Facebook has had a rough month—and deservedly so. The company has earned a special place of distrust in the hearts of many: A CNN poll published last week found that 3 out of 4 U.S. adults say Facebook is making American society worse.

In an October Senate hearing, former Facebook employee Frances Haugen made explosive allegations that the company’s own research documented the harms its site inflicts upon users. In other words, Facebook itself allegedly knew that its business harmed others in concrete and preventable ways, like promoting photo sharing that damages the mental health of young people, especially girls. How has Facebook gotten away with it?

Part of the answer lies with Section 230 of the Communications Decency Act, the controversial federal law that essentially gives websites broad protection against liability for content posted by others. The law shields Facebook from the responsibility and liability of a traditional publisher.

Though a newspaper might be sued for libel over a defamatory article, Section 230 protects online platforms from liability for the content they distribute as long as they did not create it. In effect, Facebook has received a federal subsidy in the form of Section 230, which largely protects it from an important form of societal regulation: lawsuits.

Social media companies have escaped these lawsuits mostly unscathed. For example, Facebook was sued by a victim of sex trafficking who had connected with her abuser through the site. In June the Texas Supreme Court dismissed most of her claims based on Section 230 immunity. In a different case, family members of victims killed by terrorist attacks sued Twitter, Facebook and Google, alleging that these companies provided material support to terrorist organizations. The 9th Circuit ruled, also in June, that most of the claims were barred by Section 230.

But there are grounds for civil liability lawsuits against Facebook outside the scope of Section 230. While 230 lets social media companies off the hook for harmful content posted by users, Facebook’s internal documents and Haugen’s Senate testimony suggest its business model and products are themselves harmful and addictive.

The “like” button and the endless scrolling feature might have negative consequences for mental and physical health by keeping users glued to their screens, as noted by tech insiders such as Tristan Harris and former Facebook executive Chamath Palihapitiya. The company’s product design also rewards misinformation. When Facebook overhauled its algorithm to increase user engagement, it boosted amplification of divisive and provocative content.

Facebook’s products and what the company says about them should be fair game for product liability lawsuits. People who suffer physical or emotional harm from those products—especially teenagers and young adults who are particularly vulnerable to the site’s features—should be able to sue the company without getting bogged down by Section 230.

Certainly Section 230 needs to be modified. Courts currently interpret it too broadly, treating it as blanket immunity even when the claims against a company are not based on publisher or speaker liability. The law should be updated to clarify that companies are responsible for their own business practices and products, a line that could be drawn without upending the important protections for free speech and content moderation that 230 provides.

But legislative reform won’t happen fast, and accountability for Facebook shouldn’t have to wait. Lawsuits serve a purpose beyond compensating injured victims: they compel the famously evasive company to disclose more of what it knows about its own products.

Nancy Kim is a law professor at Chicago-Kent College of Law, Illinois Institute of Technology.
