05/30/2020

The Section 230 Illusion.

Another day, another act of corporate censorship. Steven Crowder was just demonetized by YouTube, after Vox’s Carlos Maza launched a broad-based campaign to get him taken off the platform, with the help of his fellow blue-checked aristocrats.

In a video response filmed with Tim Pool, Crowder urged that the government force social media companies to admit that they are, in fact, “platforms,” and therefore are not protected by Section 230 of the Communications Decency Act. Crowder is not the only one to suggest this; this talking point has spread far and wide, even among wartime conservatives who are inclined to take the fight to Big Tech.

But what if everyone’s focused on the wrong thing? What if the publisher/platform distinction is a red herring?


Congress passed the Communications Decency Act in 1996. Famously, subsection (c)(1) of Section 230 states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” under state and federal law.


Subsection (c)(1) makes defamation lawsuits against online platforms nearly impossible to win, if the suit is based on content uploaded by “users.” Defamation cases are an uphill battle under normal circumstances; if you are a “public figure,” you have to prove that the defendant acted with “actual malice.” “Actual malice” in defamation law has nothing to do with feelings of anger or hate. Rather, it is defined as publishing a statement while either knowing it is false or with reckless disregard for whether it’s true or false. 

The additional protection provided by Section 230, which mandates that ISPs are not the *publishers* of defamatory content on their platforms, makes those ISPs nearly invulnerable to defamation claims. And that is where the action is, because suing the individuals who post defamatory content is seldom worthwhile, financially.

It is understandable that influencers like Crowder want the government to affirmatively declare that Twitter and Facebook are publishers, and not platforms.

But this won’t solve the problem.



Sure, making Facebook and Twitter “publishers” would expose them to defamation liability based on user-created content. But again, winning defamation lawsuits is still almost impossible! 


If a “public figure” sues a social media company for defamation based on a user’s post, they would still have to prove that the platform itself bore “actual malice” towards the plaintiff in permitting the post to be made and stay online. That’s basically impossible; how would Facebook know a user’s statement was a lie, when they weren’t the one making the statement? Congress is unlikely to make online platforms vet every piece of user content for truth or falsity. 

Moreover, removing this liability protection doesn’t actually solve the problem of censorship, which is separate from the issue of liability for third-party content. If Facebook decides to ban you because it doesn’t like your politics, tweaking subsection (c)(1) wouldn’t give you, the banned user, any way to get your account back.

The point of the “Platform Access is a Civil Right” initiative is to identify an approach that would solve the deplatforming problem. Exposing platforms to liability for third-party content is a band-aid on a bullet wound. The goal should be getting to a world where wrongfully banned users have the right to walk into court and get a court order forcing Facebook or Twitter to immediately restore their account. Failing that, platforms should be subject to damages for unjustified banning and shadowbanning. 

There is, however, another part of Section 230 that doesn’t get much attention – and it’s that provision that conservatives should focus on.



As we have discussed, subsection (c)(1) shields platforms from liability for the acts of third persons – typically, for “publishing” third party content that is defamatory or otherwise tortious. 


But there’s a more important liability protection in Section 230. This provision protects platforms from liability for the acts of the platforms themselves.

That’s subsection (c)(2). It provides that:

“No provider or user of an interactive computer service shall be held liable on account of… any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

Subsection (c)(2) should guide our attempts to fix social media censorship. This provision permits social media companies to “restrict access” to “obscene…or otherwise objectionable” material.

But it also requires platforms to act “in good faith” – and right now, they are not. Thus, there is nothing in Section 230 that should prevent individual states from passing legislation to protect their citizens from “bad faith” deplatforming. State-based consumer protection legislation already exists for any number of different industries; it should apply to social media companies, too. There’s a strong argument, in fact, that it already does.

If there is any question as to whether subsection (c)(2)’s “good faith” requirement permits individual states to protect their citizens from arbitrary bans, Congress should clarify the point by amending the statute. Congress should also make it clear that, bad faith or not, nothing in subsection (c)(2) permits de-platforming people based on political ideology or offline conduct.

The debate about whether social media companies are “publishers” or “platforms” is beside the point. Legislators and influencers should focus instead on writing new law that protects users from bad faith deplatforming.

Let’s take the fight to the book burners of our age. 

Let’s make platform access a civil right.

Ron Coleman is a partner at Mandelbaum Salsburg, P.C., where he practices commercial litigation. He has established an international reputation relating to the use and abuse of intellectual property as a tool of competition and free speech. Ron is best known as the lead lawyer for band leader Simon Tam in his successful appeal, on First Amendment grounds, of the U.S. Patent and Trademark Office’s refusal to register the trademark THE SLANTS.

Will Chamberlain is a lawyer and the publisher of Human Events.

Source: Human Events

