In Everything, Moderation, Including Moderation: the Balance of Section 230
Since 1996, Section 230 of the federal Communications Decency Act has put in its 10,000 hours (and weathered its share of legal challenges). Section 230 is a mere 26 words: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It protects platforms from lawsuits over speech they did not produce and allows them to moderate content as they see fit for their business model.
Efforts to weaken the law have been unsuccessful thus far, but there is a renewed push in Congress to overturn the protections of Section 230. Without the balance of the statute, free speech and the digital free market would be imperiled, and platforms would be motivated to moderate content by fear of reprisal rather than in the best interest of their users.
Understanding what Section 230 is matters—but so does understanding what it is not. The law is not a free pass for sites to knowingly host illegal content. Critics of the law paint in broad strokes and argue that it allows child exploitation, illegal drug sales, and all manner of unlawfulness to thrive, when this is simply not the case. Should a platform knowingly promote or facilitate unlawful content, it can still be held liable.
The internet has opened up new ways for people to exercise their First Amendment rights, connect, and express themselves. It has also come with great risks and new challenges to privacy and safety. Section 230 is a prudent acknowledgement of the reality of the digital age, and seeks to create an environment where moderation is available as one tool of many for creating a positive online experience.
Those in favor of repealing Section 230 rely on mischaracterizations of the law in practice, and should consider the real-world implications of its absence. Free speech would fall first. Platforms, fearful without the protections of 230, would over-moderate in order to avoid the ceaseless litigation that responsibility for all user-generated content brings. A free exchange of ideas would be replaced with a carefully curated and timid discussion—or none at all.
Smaller platforms would be particularly impacted. Unable to keep pace with the requirements and cost of compliance, many would simply cease to operate. Few innovators would be motivated to enter such a litigious market. The online world would be left with a few large companies, and a lot of censorship. Zooming out, the absence of a flourishing and diverse internet ecosystem would inevitably have global ramifications. Long considered a leader of the internet economy, the United States would grapple with the market impacts of a stagnating tech sector.
Section 230 creates a landscape where consumers can choose the platform that most aligns with their desired online experience, while platforms have the confidence to continue improving their models and moderating as they see fit without fear of excessive litigation. The laws of the land still apply: no site can knowingly host or aid in violations of the law. So too do the laws of the free market. If consumers find that a site is not delivering the goods they desire, they can take their business and speech elsewhere. Protecting Section 230 is vital to a flourishing and balanced internet.
Links to Learn More:
Section 230 is Good, Actually | Electronic Frontier Foundation