How Big Tech’s Bullying of Parler and Non-Compliance with Section 230 Could Lead to Its Demise

UKRAINE - 2021/01/10: In this photo illustration, the Parler logo is seen displayed on a smartphone with the Google, Amazon and Apple logos displayed on a PC screen in the background. Google, Apple and Amazon have suspended the social networking app Parler. Parler became unavailable in the App Store, Google Play and Amazon Web Services, reportedly because of insufficient control over user posts that encouraged violence.
Pavlo Gonchar/SOPA Images/LightRocket via Getty Images

In 2017, Google was sued by the family of Nohemi Gonzalez, a 26-year-old American student studying abroad, who was tragically murdered in the 2015 ISIS Paris attacks that killed 130 innocent people. The terrorists used Google’s YouTube service to actively recruit, plan, incite, and even give instructions for the terror attacks. Incredibly, Google conceded all of these facts but argued — successfully — that it should not be held liable because of protections provided to social media companies under the Communications Decency Act (47 U.S.C. § 230, or “Section 230”).

The court agreed with Google, finding that it was protected under Section 230, as it was not the “publisher” and did not create or contribute to the harmful material disseminated. While controversial, Section 230 is clear: an “interactive computer service,” such as a social media company, is not required to moderate content and cannot be held liable for third parties who post legally problematic or even dangerous material. In other words, social media companies are permitted to be simply loudspeakers, telephones, headphones, and public squares. They may exist solely to deliver other people’s content without interference and without fear of legal action.

Section 230 has not changed since 2017, but Google’s mercurial positions certainly have. Most recently, President Trump was banned from Twitter for “incitement of violence” related to the riots and siege of the US Capitol building. Shortly thereafter, Google removed the social media app Parler from its Google Play app store. In a stunning and irreconcilable shift from its court position of 2017, Google explained that the company “require[s] that apps implement robust moderation for egregious content.” Indeed, it is clear that Google has modified its content policies and has banned many terror-affiliated accounts since the 2017 lawsuit, but it nevertheless maintains its legal position that it committed no wrongdoing in simply allowing third parties to post incitement of violence on its platform. Google has not paid a dime nor admitted any liability to the Gonzalez family for Nohemi’s death.

Now, after effectively using Section 230 as a shield against liability, Google wields it as a sword against Parler, arguing that Parler somehow does not enjoy the same statutory protections of Section 230 that Google relied and still relies upon. In other words, Google appears to now be taking the position that its internal policies related to community standards and content moderation somehow supersede the statute that permits Parler to allow third-party posts without modification or censorship and without fear of legal action. Google now says that Section 230’s protections, allowing internet services like Parler simply to act as a public square, can be retroactively “repealed” by that all-powerful private company’s policies. Congressional legislation be damned.

Google cannot have it both ways. Either Google is not liable for ISIS’ incitement on YouTube and Parler is not liable for incitement by third parties on its site, or both are liable.

But if something does not feel right about Google now saying that it can require Parler to take editorial positions and act as a newspaper, you are not alone. First Amendment principles protect the right to speak, the right not to speak, and the right not to be forced to speak. All of these rights necessarily work together, and all are equally important. If Parler does not wish to take positions, how can Google force it to do so?

Federal courts have explained that Congress’ enactment of Section 230 in 1996 was an effort “to encourage the unfettered and unregulated development of free speech on the Internet, and to promote the development of e-commerce.” Section 230, therefore, was meant to promote and expand speech on the internet, not to restrict and suppress it. In fact, Section 230 specifically identifies diversity of political discourse as desirable, expressly advocating for “interactive computer services [to] offer a forum for a true diversity of political discourse.”
But restriction and suppression of speech is Google’s very goal here, an accusation they do not even deny.

In taking actions against Parler and others that are so directly antithetical to the express language of Section 230, it seems clear that Google, and possibly other social media companies, are themselves in direct violation of Section 230.

Google is far from alone in its exploitation and abusive use of Section 230. In 2019, Facebook argued before the Second Circuit that it was immune to suit under Section 230, after Hamas, a U.S.-recognized terrorist group, incited and created a plan on Facebook that resulted in the murder of five Americans. Hamas was then able to celebrate these murders on the site. As with Nohemi Gonzalez and Google, the court found that even if promotion of the murders occurred on Facebook’s site, Facebook was immune from liability, since it did not create or develop the content of the messages.

The shocking (and arguably illegal) 180-degree reversal by Google and the other social media giants related to content moderation raises serious questions as to the motivations of these companies. Do their positions truly keep reversing, or is there something else here at play?

Given the dubious timing, stunning across-the-board social media coordination, and the seeming obsession with weakening and even blocking right-leaning services and platforms completely, it is worth considering whether anti-competitive collusion between the social media giants exists. If the exclusive syndicate of social media giants that controls access to nearly all of the ideas and content exchanges on the internet is conspiring to limit which other companies may, too, enjoy that unique privilege, then the free exchange of ideas on the internet is already dead.

There is hope. The full, free, and fair exchange of ideas, even if damaged and tampered with, can always be returned to our society. But that process is an arduous and noble one. It begins with an unbiased and truly independent investigation into the behavior of these tight-knit social media companies. The Justice Department is already suing Google for monopolistic practices, and Google’s recent behavior surely will not help its cause. But at this point, it seems necessary for the DOJ to broaden the investigation into all of the social media giants and to ascertain whether, indeed, there has been an anti-competitive conspiracy at play. The First Amendment and the survival of free speech itself depend on it.

Jeffrey Lax is an attorney, professor and department chair at the City University of New York. He has hosted talk radio programs on 770 WABC and 970 WNYM in New York and appears on Newsmax TV as a legal analyst.

The views expressed in this opinion piece are the author’s own and do not necessarily represent those of The Daily Wire.

The Daily Wire is one of America’s fastest-growing conservative media companies and counter-cultural outlets for news, opinion, and entertainment.