Amending 230 for Public Safety. By Anokhy Desai from the University of Pittsburgh School of Law, and the overall winner of the vLex International Writing Competition 2021.

A 2020 Pew study[1] found that about 20% of American adults get their news from social media, with almost half preferring Facebook. While not a majority, that means roughly 46 million adults[2] rely on social media for current events, so it is crucial that the news they read on sites like Facebook is factual. Facebook, as a “provider…of an interactive computer service,”[3] is currently shielded by Section 230 of the Communications Decency Act: it is neither liable for the consumption and spread of misinformation, colloquially known as “fake news,” nor responsible for its correction. With its international reach, dominant market share, and captive audience of 69% of U.S. adults,[4] Facebook has a responsibility to mitigate the rampant misinformation on its platform for the sake of public safety. To promote accurate information online and protect domestic security, Congress should amend §230 to require platforms like Facebook to fact-check potentially dangerous misinformation.

In 2016, 62% of U.S. adults got their news from social media, with the “most popular fake news” articles on Facebook heavily favoring Donald Trump and thus influencing the outcome of the election.[5] These statistics are more ominous given America’s history of falling for conspiracy theories. In 1994, 5% of Americans believed that the Holocaust never happened. In 2007, 33% of Americans believed that the U.S. government permitted or even assisted the 9/11 attacks. And in 2010, about 15% of Americans believed that then-President Obama was not a citizen. Gallup polls found a sharp drop in “trust and confidence” in fair and accurate media reporting among Republicans in 2016, one linked to the rise of misinformation on social media.[6] These findings suggest that the ability to distinguish accurate information is not innate; the deficit can be attributed to two issues. First, the growing partisan divide has encouraged the consumption of one-sided media, especially on the Republican side.[7] As of 2019, 55% of Republicans agreed that Democrats are “more immoral” than non-Democrats, and 47% of Democrats believed the same of their counterparts.[8] Second, determining legitimacy was not a critical thinking skill developed in many of today’s older adults.[9] The Baby Boomers lived most of their lives with media governed by the Fairness Doctrine, and thus had a reasonable expectation that most of the media they consumed was accurate to some degree.[10] It follows, then, that younger Americans are twice as good as Baby Boomers at distinguishing between fact and opinion in the news.[11] Younger generations were, after all, told not to believe everything they see online, and primary educators have made a real effort to introduce critical thinking and fact-checking into their curricula.[12]

In addition to the political impacts, online misinformation creates real public safety concerns. In 2016, Edgar Welch read a conspiracy theory online, dubbed “Pizzagate,” claiming that Hillary Clinton led a sex slavery ring out of a Washington D.C. pizzeria.[13] He drove there from North Carolina and fired an assault rifle inside the restaurant, believing he was on a crusade to save children. He found no such organization and quickly realized he had acted on misinformation. While Welch miraculously did not injure anyone, he admitted that “the intel on this wasn’t 100 percent.” Despite this, Welch was declared a hero in certain circles that remained convinced of Pizzagate out of sheer animosity towards Clinton.[14] In 2018, Robert Bowers, radicalized by anti-Semitic online forums and social media,[15] attacked the Tree of Life synagogue, taking 11 lives and wounding 6. Having posted that he could not “sit by and watch[] [his] people get slaughtered [by Jews],” he faced 63 federal and 36 state charges for his crimes.[16]

Congress enacted §230 of the Communications Decency Act to shield new and developing technologies from liability for third-party content on their platforms by providing that “interactive computer service [providers]” cannot be treated as the publisher of information posted by a user.[17] While intended to support innovation, §230 now serves as a shield for many of the largest and most powerful companies in the world, such as Google, Twitter, and Facebook. Because Facebook is not liable for any of the content posted on its platform, false information often spreads like wildfire before the site takes any action against it, if it ever does. Partially because of this, there has been bipartisan support for amending §230.[18] One of the Justice Department’s proposed amendments to the statute would expressly permit good faith moderation of user content, including fact-checking.

To those unaccustomed to it, fact-checking can seem tedious at best and like censorship at worst. However, the public safety issues at play must be balanced against these free speech concerns. Given this, Facebook needs to update its current definition of misinformation, “news [that] is harmful to [the] community, makes the world less informed, and erodes trust,” to explicitly include incorrect and misleading information in both the headlines and content of articles. Addressing headlines is crucial because most users do not read articles before sharing them.[19] Once Facebook updates its definition, it must then consider how best to moderate its platform. While Artificial Intelligence (AI) has shown troubling biases when applied outside of purely academic hypotheticals, Facebook can potentially avoid bias by feeding the AI live examples of content recently approved or rejected by its human team instead of historical examples.[20] Finally, Congress must consider whether to go further and require fact-checking, or to allow users more autonomy in sourcing their own accurate information and accept the attendant public safety risks. Congressmembers should not forget that Facebook was designed to facilitate the indiscriminate spread of information, and that it has now played a role in two national elections and countless conspiracy theory-based crimes. Its role as a facilitator necessitates a Congressional amendment to §230 requiring social media companies to take stronger steps to mitigate the spread of misinformation for the sake of public safety.

[1] Mitchell, A., Jurkowitz, M., Oliphant, J., & Shearer, E. (July 30, 2020). Americans Who Mainly Get Their News on Social Media Are Less Engaged, Less Knowledgeable. Pew Research Center. Retrieved from https://www.journalism.org/2020/07/30/americans-who-mainly-get-their-news-on-social-media-are-less-engaged-less-knowledgeable/.
[2] Extrapolated from total adult population in US, US Census Bureau (2019). ACS 1-Year Estimates Subject Tables. US Census Bureau. Retrieved from https://data.census.gov/cedsci/table?q=US%20adult%20population&tid=ACSST1Y2019.S0101&hidePreview=false.
[3] Legal Information Institute (Apr. 11, 2018). 47 U.S. Code § 230 — Protection for private blocking and screening of offensive material. Legal Information Institute. Retrieved from https://www.law.cornell.edu/uscode/text/47/230.
[4] Gramlich, J. (May 16, 2019). 10 facts about Americans and Facebook. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2019/05/16/facts-about-americans-and-facebook/.
[5] Allcott, H. and Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives. Retrieved from https://web.stanford.edu/~gentzkow/research/fakenews.pdf.
[6] Swift, A. (Sept. 14, 2016). Americans’ Trust in Mass Media Sinks to New Low. Gallup. Retrieved from https://news.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx.
[7] Mitchell, A., Gottfried, J., Barthel, M., & Sumida, N. (June 18, 2018). Distinguishing Between Factual and Opinion Statements in the News. Pew Research Center. Retrieved from https://www.journalism.org/2018/06/18/distinguishing-between-factual-and-opinion-statements-in-the-news/.
[8] Pew Research Center. (Oct. 10, 2019). Partisan Antipathy: More Intense, More Personal. Pew Research Center. Retrieved from https://www.pewresearch.org/politics/wp-content/uploads/sites/4/2019/10/10-10-19-Parties-report.pdf.
[9] Gottfried, J. and Grieco, E. (Oct. 23, 2018). Younger Americans Are Better Than Older Americans at Telling Factual News Statements From Opinions. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2018/10/23/younger-americans-are-better-than-older-americans-at-telling-factual-news-statements-from-opinions/.
[10] Ferullo, J. (Oct. 13, 2019). Good Riddance: The Last Gasp Of Baby Boomer Politics. The Hill. Retrieved from https://thehill.com/opinion/campaign/465558-good-riddance-the-last-gasp-of-baby-boomer-politics.
[11] Supra note 9.
[12] Bedley, S. (Mar. 29, 2017). I Taught My 5th-Graders How to Spot Fake News. Now They Won’t Stop Fact-Checking Me. Vox. Retrieved from https://www.vox.com/first-person/2017/3/29/15042692/fake-news-education-election.
[13] Haag, M. and Salam, M. (June 22, 2017). Gunman in ‘Pizzagate’ Shooting Is Sentenced to 4 Years in Prison. The New York Times. Retrieved from https://www.nytimes.com/2017/06/22/us/pizzagate-attack-sentence.html.
[14] Taylor, D. (Dec. 7, 2016). Trump Supporters Cheer Alleged PizzaGate Gunman. Patch. Retrieved from https://patch.com/district-columbia/washingtondc/trump-supporters-cheer-alleged-pizzagate-gunman.
[15] Lord, R. (Nov. 10, 2018). How Robert Bowers went from conservative to white nationalist. The Pittsburgh Post-Gazette. Retrieved from https://www.post-gazette.com/news/crime-courts/2018/11/10/Robert-Bowers-extremism-Tree-of-Life-massacre-shooting-pittsburgh-Gab-Warroom/stories/201811080165.
[16] Levenson, E. and Sanchez, R. (Oct. 28, 2018). Mass Shooting at Pittsburgh Synagogue. CNN. Retrieved from https://www.cnn.com/us/live-news/pittsburgh-synagogue-shooting/index.html.
[17] Supra note 3.
[18] Reardon, M. (June 21, 2020). Democrats and Republicans agree that Section 230 is flawed. CNET. Retrieved from https://www.cnet.com/news/democrats-and-republicans-agree-that-section-230-is-flawed/.
[19] Facebook (2020). Working to Stop Misinformation and False News. Facebook. Retrieved from https://www.facebook.com/formedia/blog/working-to-stop-misinformation-and-false-news; Dewey, C. (Jun. 18, 2016). 6 in 10 of You Will Share This Link Without Reading It, a New, Depressing Study Says. The Washington Post. Retrieved from https://www.chicagotribune.com/business/blue-sky/ct-share-this-link-without-reading-it-ap-bsi-20160618-story.html.
[20] Manyika, J., Silberg, J., & Presten, B. (Oct. 25, 2019). What Do We Do About the Biases in AI? Harvard Business Review. Retrieved from https://hbr.org/2019/10/what-do-we-do-about-the-biases-in-ai.


Amending 230 for Public Safety was originally published in vLex News and Updates on Medium.