There’s been a lot of discussion about Section 230 lately. President Trump wants it repealed, and other politicians have expressed similar concerns from differing ideological perspectives. But what does Section 230 do, what would be the effects of repealing it, and how does all of this affect webmasters?
What is Section 230?
In simple terms, Section 230 is a section of federal law that protects web publishers from liability for the things that third-party users post to their websites. Basically, this means that I’m not held liable for things other people put in the comment section. There are some limits and exceptions to this, of course, but web publishers aren’t immediately liable for something someone else happens to post. Court orders can still force a takedown of anything from defamatory posts to copyrighted material – but the website’s owner isn’t liable for the period between when the content was first posted and when a legal order is delivered to them.
History of internet comment regulations
To say that 230 has shaped the legal environment that the internet has grown up in would be an understatement. Today, we all take comments and user-generated content for granted, but when the internet was new, courts and politicians weren’t sure what to do with it.
Early internet laws evolved directly from related laws that governed newspapers and other media outlets. One important case was Smith v. California, which established a distinction between publishers and distributors: publishers could be held legally liable because they were expected to be familiar with the content, but distributors who had no role in the content’s creation were not.
Early websites successfully defended themselves from lawsuits over comments by claiming they were merely distributors, not publishers. That made some sense, too, because the owner of a website isn’t directly involved in creating the comment content.

This defense held up until Stratton Oakmont v. Prodigy. The difference, Stratton Oakmont’s lawyers argued, was that Prodigy actively moderated its message boards. Because Prodigy had already deleted comments and exercised editorial discretion over the discussion, the court found it liable as a publisher, not a distributor.
The immediate response from web publishers was to stop moderating altogether. As long as they let users post whatever they wanted, there was no risk of liability for the illegal things those users would inevitably post. Any attempt to delete illegal content would only create liability for everything that slipped past the moderators.
Naturally, it was a bit of a mess. Spam became an overwhelming problem, and discussion boards often veered wildly off topic. It wasn’t long before tech companies were begging Congress for better legislation. Congress answered in 1996 with Section 230 of the Communications Decency Act, which let websites moderate in good faith without taking on publisher liability.
Proposed alternatives to Section 230
While there are people across the political spectrum who take issue with the effects of Section 230, they tend to have completely different problems with it. Just because you can find Democrats and Republicans who want to get rid of 230 doesn’t mean they agree on why… or what it should be replaced with:
Trump calls for free speech
While Trump has been a vocal critic of Section 230, his words on the matter suggest that he might like to go back to the legal framework created by Stratton Oakmont v. Prodigy. His complaints began early in his presidency, but they became louder and more urgent when Twitter slapped a warning label on his false and misleading tweets.
If 230 were to be repealed with nothing in its place, this might make it impossible for sites like Twitter and Facebook to mark misleading information and “fake news.”
Trump and his allies seem to prefer this strategy as they focus on how Section 230 “enables corporate censorship.” As a webmaster, I just need an opportunity to delete spam and hateful messages, thank you very much.
Others call for increased liability
A more “moderate,” bipartisan effort is also under way in Congress. This approach likewise calls for repealing or modifying Section 230, but with completely different goals: the talks have focused on increasing the liability of web publishers who host illegal or harmful content.
Most of these proposals would have a similar effect: dramatically increasing the cost of moderation and the risk webmasters face when opening a discussion or submission form of any kind.
Section 230: The bottom line for webmasters
As it exists, Section 230 is probably the best regulatory framework for web companies, big and small. While there are a few problems with bad actors who intentionally harbor illegal user-generated content, there are also legal remedies to hold them accountable.
Even under 230, webmasters are still required to participate in and assist with legal investigations. We also still have to comply with any lawfully issued takedown orders. Those who refuse to cooperate can ultimately be found liable, but Section 230 gives us the time and the chance to cooperate without being dragged straight to court first.
It also gives us the option to moderate user-generated content at all, which is something many websites simply stopped doing after Stratton Oakmont v. Prodigy.
Considering that the loudest critics of Section 230 are offering diametrically opposed solutions, it may just be that Section 230 is already the best compromise for everyone involved.