Not Only Are Social Media Companies Not Liable for Spreading Defamatory Material, but They Can Actually Profit From It
Our main headline asks: Can the State of Texas Force Facebook to Post Lies About Joe Biden?
That is effectively what the states of Texas and Florida are arguing before the Supreme Court this week. The argument is framed as a question of whether states can prohibit social media companies from banning material based on politics.
However, since much of Republican politics these days involves promulgating lies, like the “Biden family” Ukraine bribery story, the Texas and Florida laws could arguably mean that these states could require Facebook and other social media sites to spread lies.
There is no obvious reason why CNN can be sued for carrying an ad falsely claiming that a prominent person is a pedophile, but Twitter and Facebook get to pocket the cash with impunity.
There is a lot of tortured reasoning around the major social media platforms these days. The Texas and Florida laws are justified by saying these platforms are essentially common carriers, like a traditional landline telephone company.
That one seems pretty hard to justify. In principle, at least, a phone company would have no control over, or even knowledge of, the content of phone calls. This would make it absurd for a state government to dictate what sort of calls could or could not be made over a telephone network.
Social media platforms, however, do know the material posted on their sites. In fact, they make conscious decisions, or at least have algorithms that decide whether to leave up a post, boost it so that a broad audience sees it, or remove it altogether.
The social media companies arguing against the Florida and Texas laws say that they have a First Amendment right to decide what material they want to promote and what they want to exclude, just like a print or broadcast outlet. However, there is an essential difference between the factors that print and broadcast outlets must consider in transmitting and promoting content and what social media sites need to consider.
Modifying Section 230
People can sue print and broadcast outlets for defamation for transmitting false and harmful material to individuals or organizations. Social media sites do not have this concern because Section 230 of the Communications Decency Act protects them.
This means that not only are social media companies not liable for spreading defamatory material, but they can actually profit from it. This is not only the case for big political lies. If some racist decides to buy ads on Facebook or Twitter falsely saying that they got food poisoning at a Black-owned restaurant, Mark Zuckerberg or Elon Musk gets to pocket the cash.
The restaurant owner could sue the person who took out the ad if they can find them (ads can be posted under phony names), but Facebook and Twitter would just hold up their Section 230 immunity and walk away with the cash.
The same story applies to posts on these websites. These can also generate profits for Mr. Zuckerberg or Mr. Musk since defamatory material may increase views and make advertising more valuable.
The ostensible rationale for Section 230 was that we want social media companies to be able to moderate their sites for pornographic material or posts that seek to incite violence without fear of being sued. There is also the argument that a major social media platform can’t possibly monitor all the hundreds of millions, or even billions, of posts that go up daily.
Removing protections against defamation suits does not interfere with the first goal. Facebook or Twitter should not have to worry about being sued for defamation because they remove child pornography.
As far as the second point, it is true that these sites cannot monitor every post as it goes up. However, they could respond to takedown notices from individuals or organizations claiming they were defamed.
A Model
There is an obvious model here. The Digital Millennium Copyright Act requires that companies remove material that infringes on copyright once notified by the copyright holder or their agent. If they remove the material promptly, they are protected against a lawsuit. Alternatively, they may determine that the material is not infringing and leave it up, risking litigation.
We can have a similar process with allegedly defamatory material, where the person or organization claiming defamation must spell out exactly how the material is defamatory. The site then has the option to take the material down or leave it posted and risk a defamation suit.
This would effectively be treating social media sites like print or broadcast media. These outlets must take responsibility for the material they transmit to their audience, even if it comes from third parties. (Fox paid $787 million to Dominion to settle a lawsuit primarily over statements by guests on Fox shows.)
We can also structure this sort of change in Section 230 to favor smaller sites. We can leave the current rules for Section 230 in place for sites that don’t sell ads or personal information. Sites that support themselves by subscriptions or donations could continue to operate as they do now.
Social Media Network Effects
This would help to counteract the network effects that tend to push people towards the most prominent sites. After all, if Facebook and Twitter were each just one of a hundred social media sites that people used to post their thoughts, no one would especially care what posts they choose to amplify or remove. Users who don’t like the editorial choices may opt for a different website, just as they do now with newspapers and television stations.
It is because these social media sites have such a massive market share that their decisions take on so much importance.
If we can persuade Congress to restructure Section 230 in a way that downsizes these giants, it will go far toward ending the problem. Modifying Section 230 won’t fix all the problems with social media, but it does remove an obvious asymmetry in the law.
As it now stands, print and broadcast outlets can get sued for carrying defamatory material, but social media sites cannot. This situation does not make sense. It should be changed.
This article from Dean Baker was originally published on the blog of the Center for Economic and Policy Research.