The Threat of Social Media to Society and National Security: A Call for Social Media Policy and Legislation
Liberty University
Helms School of Government
Reconciling Constitutionalism & Federalism in a Time of Crisis
March 5-6, 2021
Conference Papers
Frank Hernandez
- Introduction
- Discussing the Issues
- Responding to the Issues
- Recommendations for Analysis and Policy Direction
- Conclusion
Introduction
Social media is a growing existential threat. In the Constitution, We the People commit ourselves to insuring domestic tranquility, providing for the common defense and general welfare, and securing the blessings of liberty. 1 As such, statesmen are charged with developing policy to ensure social media is safe, accessible and usable by all. Currently, social media companies and the use of their platforms pose a threat to national security and to the societal fabric of our nation.
No longer just a communication tool to bring communities and people together, social media is now leveraged to disrupt and diminish faith in our democratic republic, its people, systems and institutions. Citizens, public and private organizations, elected officials, alt-leaning media, and others are able to write, share, and spread misleading or false information through various social media platforms without consequence. Social media companies, shielded by various policies and liability protections from the intended or unintended effects of information found or shared on their platforms, remain the sole determinant of content and user reach.
Discussing the Issues
History and Growth of Social Media
The “social media experience” was driven by people’s need for connectedness. These internet-based applications created an online environment and space to nurture connections, build communities, and create and exchange user-generated content in a participatory manner. 2 For example, Facebook’s original mission, which evolved slightly in its early years, stated it was a utility to connect people, create social networks, share with people in your life, build communities, and make the world more open. 3
Yet, what were once viewed as tools to create connections, bridge networks, and build a more informed world have become digital societies that
1US Constitution, Preamble. “We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defense, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.”
2 Jose van Dijck, The Culture of Connectivity: A Critical History of Social Media (Oxford: Oxford University Press, 2013), 3-17.
3 Gillian Reagan, “The Evolution of Facebook’s Mission Statement,” Observer, July 13, 2009, https://observer.com/2009/07/the-evolution-of-facebooks-mission-statement/.
have supplanted mass media as the primary provider of information. 4 In this digitized social world, facts become a matter of opinion, like-minded ideologies find an echo chamber that fuels political and social polarization, misinformation and disinformation stories spread six times faster than real ones, and society becomes increasingly fractured. 5
Additionally, the largest of these companies are now publicly traded, meaning these platforms are, or can become, capitalist enterprises with a singular goal: make money off users to increase profits for shareholders. 6 In fact, as use becomes easier and growth and user access expand, more and more Americans receive their news from social media. At the same time, “social media shatters unity and divides people, set[ting] them at loggerheads with one another… mak[ing] direct engagement (and therefore confrontation) between opposing camps far easier.” 7
Platform algorithms are programmed to cluster users based on user data, profiles, likes, shares and reactions.8 This creates the space to target users with advertisements, “stories,” or “trending topics” that keep them engaged on the site. 9 Revenue growth is directly tied to increased usage, which generates advertising revenue. “[Social media companies],” as Sarah Kreps, Professor of Government at Cornell University, writes, “have few incentives to moderate content since doing so is at odds with a business model that favors sensational content that attracts and keeps users online.” 10
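To make this engagement loop concrete, the sketch below illustrates, in highly simplified form, how interaction data might be used to group users and rank content for them. It is a hypothetical illustration only: the data, the topic-based grouping rule, and names such as user_engagement and rank_feed are assumptions made for demonstration and do not represent any platform’s actual algorithms or code.

```python
# Hypothetical sketch of engagement-driven clustering and content targeting.
# Illustrative only: the data, features, and grouping rule are invented for
# demonstration and do not reflect any real platform's systems.

from collections import defaultdict

# Each user's engagement profile: counts of likes/shares/reactions per topic.
user_engagement = {
    "user_a": {"politics": 14, "sports": 1, "cooking": 0},
    "user_b": {"politics": 12, "sports": 0, "cooking": 2},
    "user_c": {"politics": 0, "sports": 9, "cooking": 5},
}

# Candidate items, each tagged with a topic and an "engagement score"
# (e.g., how strongly similar users have reacted to it).
candidate_items = [
    {"id": 101, "topic": "politics", "engagement_score": 0.92},
    {"id": 102, "topic": "politics", "engagement_score": 0.75},
    {"id": 103, "topic": "sports", "engagement_score": 0.81},
    {"id": 104, "topic": "cooking", "engagement_score": 0.40},
]

def dominant_topic(profile):
    """Simplified clustering rule: assign a user to the topic they engage with most."""
    return max(profile, key=profile.get)

def cluster_users(engagement):
    """Group users whose engagement profiles share the same dominant topic."""
    clusters = defaultdict(list)
    for user, profile in engagement.items():
        clusters[dominant_topic(profile)].append(user)
    return clusters

def rank_feed(user, engagement, items, top_n=2):
    """Rank items for a user: same-topic, high-engagement content floats to the top."""
    topic = dominant_topic(engagement[user])
    return sorted(
        items,
        key=lambda item: (item["topic"] == topic, item["engagement_score"]),
        reverse=True,
    )[:top_n]

if __name__ == "__main__":
    print(cluster_users(user_engagement))
    # Users clustered by dominant topic are shown more of that topic,
    # reinforcing the engagement loop described above.
    for user in user_engagement:
        print(user, [item["id"] for item in rank_feed(user, user_engagement, candidate_items)])
```

Even in this toy version the design choice is visible: content matching a user’s dominant interests is ranked highest, so the more a user engages with one kind of content, the more of it they are shown, which is the dynamic the echo-chamber and polarization critiques above describe.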
Current Social Media Policies
Current social media policies are designed to promote usage and growth. Many companies permit usage that violates their own terms of service as a means
4 James P. Walsh, “Social media and the moral panics: Assessing the effects of technological change on societal reaction,” International Journal of Cultural Studies 23, no. 6 (2020): 840-859.
5 P. W. Singer and Emerson T. Brooking, Like War: The Weaponization of Social Media (Boston: Houghton Mifflin Harcourt, 2018), 130.
6 Amir Hassan Zadeh and Anand Jeyaraj, “Alignment of business and social media strategies: insights from a text mining analysis,” Journal of Business Analytics 1, no. 2 (2018): 117-118.
7 David Patrikarakos, War in 140 Characters: How Social Media is Reshaping Conflict in the Twenty-First Century (New York: Basic Books, 2017), 12.
8 Philip M. Napoli, Social Media and the Public Interest: Media Regulation in the Disinformation Age (New York: Columbia University Press, 2019), 34-36.
9 Jeff Orlowski, director, The Social Dilemma (Exposure Labs, 2020), 1 hr., 34 min., https://www.netflix.com.
10 Kate Blackwood, “Social media helping to undermine democracy,” Cornell Chronicle, August 20, 2020, https://news.cornell.edu/stories/2020/08/kreps-social-media-helping-undermine-democracy.
of keeping or attracting users to their platforms. Facebook’s community standards policy reads:
The goal of our Community Standards has always been to create a place for expression and give people a voice… We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable. In some cases, we allow content for public awareness which would otherwise go against our Community Standards. 11
Similarly, Twitter rules and policies state:
Violence, harassment and other similar types of behavior discourage people from expressing themselves, diminishing the value of public conversation. Our rules ensure all people can participate in public conversation freely and safely… However, we recognize that sometimes it may be in the public interest to allow people to view Tweets that would otherwise be taken down. 12
Although a number of companies work to remove some users who abuse their terms of service, their policies are written to maximize engagement and keep users on the platform, subject to advertisements.13 Additionally, the number of content moderators these sites employ is far too small relative to their user bases to stop dangerous content from reaching the masses. 14 By implementing circumventable safety policies, companies ensure an ever-present gateway to dangerous content.
In discussing the policies of social media companies, former diplomat, CIA analyst and national security expert, Yael Eisenstat states, “the modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories,” bombarding us with information, “confirming our views, and reinforcing our biases,” which is similar to the tactics used by terrorist organizations as they work to recruit new members. 15 Social media policies are encroaching upon the ability to think freely and critically.
11 Facebook, “Community Standards Policy,” current as of January 23, 2021, https://www.facebook.com/communitystandards/introduction.
12 Twitter, “Twitter Rules and Policies,” current as of January 23, 2021, https://help.twitter.com/en/rules-and-policies#twitter-rules.
13 Napoli, Social Media and the Public Interest, 32.
14 Paul Domer, “De Facto State Action: Social Media Networks and the First Amendment,” Notre Dame Law Review 95, no. 1 (2019): 893+.
15 Yael Eisenstat, “Dear Facebook, this is how you’re breaking democracy,” TED.com, August 2020, https://www.ted.com/talks/yael_eisenstat.
The Dangers of Social Media Misuse
There are several examples of social media platforms being used for ill intent and purpose. Consequently, there are growing instances of users and actors, empowered by social media and policies permitting their use, causing harm, injury or damage to individuals, communities and institutions.
- Foreign Adversaries
The 2016 U.S. Election showed the U.S. is vulnerable to foreign interference and attack. Studies show that U.S. adversaries are fully capable of conducting information or irregular warfare and achieving defined objectives. 16 In the weeks leading up to the 2016 election, Twitter found approximately 400,000 bot accounts and concluded that Russian-generated propaganda to influence the outcome of the election was delivered 452.7 million times to millions of users, including 2.2 million Tweets from Russia’s Internet Research Agency. 17
Russian disinformation operations against the U.S. are not new; however, with the reach and speed of dissemination social media provides, Russia’s ability to weaponize platforms and create digital battlegrounds degraded public trust and confidence in democracy, its processes and institutions. 18 Former national security advisor H. R. McMaster shared, “Russian manipulation was effective because of social media companies’ business models, and narrow focus on functionality without consideration for how their platforms could be used for nefarious purposes,” adding, “the companies’ algorithms do not prioritize truth or accuracy, but instead help disseminate fake news and disinformation.” 19
Following the 2016 election, studies showed social media users were generally unable to identify online disinformation, which is dangerous because the purpose of state-sponsored disinformation was to degrade democratic society, polarize groups, and skew one’s ability to distinguish between fact and fiction. 20 It can then be deduced that foreign actors such as Russia, leveraging the
16 Patrikarakos, War in 140 Characters, 255-267.
17 Singer and Brooking, Like War, 141-146.
18 Christina Nemr and William Gangware, “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age,” Park Advisors (March 2020): 2-28.
19 H. R. McMaster, Battlegrounds: The Fight to Defend the Free World (New York: HarperCollins, 2020), 47.
20 Edda Humprecht, Frank Esser and Peter Van Aelst, “Resilience to Online Disinformation: A Framework for Cross-National Comparative Research,” International Journal of Press/Politics (2020): 2-8.
“nonintervention” policies of social media companies, harnessed the power of social media to inflict significant damage on American society. 21 That damage is still being felt today.
- Domestic Extremists
The Center for Strategic and International Studies published that although global terrorism dropped 50% between 2014 and 2019, terrorist attacks in the U.S. increased 141% during the same period and are likely to increase as Americans become more polarized. 22 The study also highlights that “most domestic extremist[s] use the internet and social media platforms to release propaganda, coordinate training, raise funds, recruit members, and communicate with others,” placing the nation at great risk as U.S.-based extremist groups continue to expand their reach through the use of digital platforms.23
During the first seven months of 2020, right-wing and left-wing groups were responsible for 87% of the terrorist plots or attacks in the U.S.24 A recent Department of Homeland Security (DHS) Bulletin finds that although domestic, or ideologically motivated, extremists primarily targeted individuals with opposing views through First Amendment-protected and non-violent protest activities, it details many cases where such protests led to violence and harm against individuals and structures. 25 As a result, DHS issued a nationwide terrorism alert on January 27, 2021, warning of a heightened threat of domestic attacks.26
In the Federal Bureau of Investigation’s analysis of these attacks and activities, the Bureau shares that social media platforms are facilitating the ability of domestic extremists and similar hate groups to reach, radicalize and recruit individuals receptive to polarizing messaging, which can be leveraged to mobilize individuals for violent responses to meet group objectives.27 Like foreign adversaries, domestic extremists are not using these platforms to build better communities. They use them to create division and distrust and to fracture society.
21 Adam Conner and Erin Simpson, “Results Not Found: Addressing Social Media’s Threat to Democratic Legitimacy and Public Safety After Election Day,” Center for American Progress (October 23, 2020): 2-14.
22 Seth G. Jones, Catrina Doxsee and Nicholas Harrington, “The Tactics and Targets of Domestic Terrorists,” Center for Strategic & International Studies (CSIS Brief, July 2020): 7-9.
23 Ibid., 9.
24 Seth G. Jones, Catrina Doxsee, Nicholas Harrington, Grace Hwang and James Suber, “The War Comes Home: The Evolution of Domestic Terrorism in the United States,” Center for Strategic & International Studies (CSIS Brief, October 2020): 2.
25 US Department of Homeland Security, National Terrorism Advisory System Bulletin due to heightened threat environment across the United States, January 27, 2021, https://www.dhs.gov/advisories.
26 Ibid.
27 “Terrorism,” Federal Bureau of Investigations, accessed January 29, 2020, https://www.fbi.gov/investigate/terrorism.
- Individual Anxieties and Distrust
In addition to foreign adversaries and domestic extremists being able to reach and negatively influence the U.S. population, domestic political discourse has proven equally vulnerable to disinformation campaigns. A protest-turned-riot in Washington, D.C. on January 6, 2021, provides the most recent example of how social media can be used to disinform and polarize individuals, proving detrimental to the safety and wellbeing of Americans and the nation.
On January 6, 2021, during a joint session of Congress to certify the Electoral College vote for President of the United States, a non-violent protest quickly escalated into a violent breach of the Capitol Building, leading to five deaths, millions of dollars in damages, more than 150 arrests and over 400 suspects in various crimes.28 Arguably worse, this protest demonstrated that social media plays a very active role in determining whether or not such actions can occur.
Social media platforms permitted influential personalities to share and to spread misinformation about voter fraud, elevating distrust in the electoral process for a large percentage of Americans.29 As distrust, animosity and false claims spread through these platforms, key figures leveraged social media to mobilize individuals and groups to protest against a legal election process.
In the weeks after the riot, multiple affidavits filed by federal investigators note that social media platforms such as Facebook, Twitter, Parler, etc. were used to openly coordinate the protest, to incite violence and unlawful entry into the Capitol Building, and to obstruct subsequent law enforcement investigations. 30
Research and analysis of whether individual users should be held accountable for events before, during and after the riot at the Capitol Building is a discussion for another time. However, false and misleading information continues to spread through social media, fostering individual anxieties and creating distrust in one another and in U.S. institutions. The issue is not just that American society is being fractured by misinformation and disinformation spread through social media; it is also that social media companies have almost exclusive control of content reach.
28 Pete Williams and Tom Winter, “FBI identifies more than 400 suspects in Capitol riot,” NBC News, January 26, 2021, https://www.nbcnews.com/news/us-news/fbi-identifies-more-400-suspects-capitol-riot-n1255757.
29 “Biden Begins Presidency With Positive Ratings; Trump Departs With Lowest-Ever Job Mark,” Pew Research Center, January 15, 2021, https://www.pewresearch.org.
30 US Department of Justice,* Investigations Regarding Violence at the Capitol,* Office of Public Affairs, January 31, 2021, https://www.justice.gov/opa/investigations.
Responding to the Issues
Responses from within the Industry
Social media companies have taken some action following recent events. Multiple social media platforms banned influential personalities for their alleged roles in the events surrounding the riot at the U.S. Capitol Building. Following the ban, preliminary data suggest that online misinformation about election fraud dropped 73%.31 Twitter launched a “community-driven forum called ‘Birdwatch’ that’s meant to combat misinformation and disinformation on the site,” allowing users to add context to information believed to be misleading.32 Companies that host social media applications, such as Apple, Amazon and Google, removed platforms such as Parler for not doing enough to moderate content inciting violence or threatening safety.33
In 2018 testimony before the Senate, Facebook CEO Mark Zuckerberg stated, “I don’t want anyone to use our tools to undermine democracy; that’s not what we stand for.”34 Regardless of Zuckerberg’s or other industry leaders’ stated positions, many social media companies continue to enact policies and procedures of nonintervention, with some justifying their minimalist approach “on the basis of a narrow interpretation of freedom of expression, while ignoring the numerous harms, such as harassment, hate speech, voter suppression, and violence.”35 Recently, Facebook’s own internal oversight board overturned the company’s decision to remove posts for violations of its Community Standards on Dangerous Individuals and Organizations, thereby re-permitting posts promoting hate speech and harmful misinformation.36
Still, as larger platforms slowly progress in their willingness to moderate content on their sites, studies suggest that alt-leaning individuals, groups and organizations are migrating to smaller social media-based messaging platforms
31 Edward Moyer, “After Twitter banned Trump, misinformation plummeted, says report,” CNET, January 16, 2021, https://www.cnet.com/news/after-twitter-banned-trump-misinformation-plummeted-says-report/.
32 Shelby Brown and Queenie Wong, “Twitter launches ‘Birdwatch’ community forum to combat misinformation,” CNET, January 25, 2021, https://www.cnet.com/news/twitter-launches-birdwatch-community-forum-to-combat-misinformation/.
33 Todd Spangler, “Apple CEO Tim Cook Defends Parler Suspension,” Variety, January 15, 2021, https://variety.com/2021/digital/news/apple-ceo-defends-parler-ban-1234886383/.
34 Hearing before the United States Senate Committee on the Judiciary and the United States Senate Committee on Commerce, Science and Transportation, April 10, 2018, https://www.judiciary.senate.gov/imo/media/doc/04-10-18%20Zuckerberg%20Testimony.pdf.
35 Conner and Simpson, “Results Not Found,” 3-4.
36 Oversight Board, Case Decision 2020-005-FB-UA, January 28, 2021, https://oversightboard.com/decision/FB-2RDRCAVQ/.
such as Telegram, Epik, and Parler, where extreme ideologies can continue to polarize, becoming a greater threat to individuals and society.37
Whether these responses are viewed as necessary or as overreach, social media companies remain ineffective at rooting out users and content detrimental to society. Further, some companies continue to demonstrate an unwillingness to take any action, and instead invite and promote the users and content that create the kind of consternation discussed here.
Recognition of a Growing Threat
The government and other experts are also recognizing the need for a broader response. In responding to questions regarding adversary threats during his confirmation hearing, Secretary of Defense Lloyd Austin stated, “Russia has threatened U.S. democratic processes and exerted malign influence on the world stage,” adding that Russia and other actors will continue to use cyber and information operations to target democratic institutions and exploit our divisions.38 He also noted, “violent extremist organizations continue to pose a threat to U.S. interest; radical ideology across the internet has expanded the reach of fringe groups, threatening the homeland and inciting violence…”39
Recently, the Executive and Legislative Branches attempted to address the polarization of individuals over social media. The Honest Ads Act, designed to prevent foreign interference in elections and improve transparency of online political advertising, stalled in the 115th and 116th Congresses.40 Executive Order 13925 was written to prevent social media companies from censoring online content; however, legal experts suggest the order is largely unenforceable because social media companies remain private enterprises.41 42
Subject matter experts are also speaking out against social media companies as the sole adjudicators determining content and user reach.43
37 Alicia Wanless, “Cut Loose by Tech Giants, Will Far-Right Extremists Be Adrift?,” Carnegie Endowment for International Peace, January 19, 2021, https://carnegieendowment.org.
38 Hearing before the United States Senate Armed Services Committee, January 19, 2021, https://www.armed-services.senate.gov/imo/media/doc/Austin_APQs_01-19-21.pdf.
39 Ibid.
40 Honest Ads Act, S. 1356, 116th Cong. (2019), https://www.govtrack.us/congress/bills/116/s1356.
41 Donald J. Trump, Executive Order 13925, “Preventing Online Censorship,” Federal Register 85, no. 106 (June 2, 2020): 34079, https://www.govinfo.gov.
42 Shirin Ghaffary, “Trump’s executive order on social media is legally unenforceable, legal experts say,” Vox, May 28, 2020, https://www.vox.com/recode/2020/5/28/21273878/trump-executive-order-twitter-social-media-section-230-free-speech-implications.
43 Daniel L. Byman and Aditi Joshi, “Social media companies need better emergency protocols,” The Brookings Institution, January 12, 2021, https://www.brookings.edu/blog/order-from-chaos/2021/01/12/social-media-companies-need-better-emergency-protocols/.
Other researchers write, “[the] exploitation of social media platforms is an important regulatory question for government and the private sector.”44 They add that “while some extremists end up barred at the discretion of hosting platforms,” responding to extremism on social media requires balance among stakeholders and regulation and moderation across platforms, “conscious of rights to free expression and the appropriateness of restrictions on speech.”45
Due to its global reach, impacts and capabilities, social media policy must be further examined. Analysis shows social media legislation is warranted due to the dangers associated with its current use: a tool capable of inflicting harm, damages, and casualties (physical and emotional), and able to bypass, usurp and dismantle the principles upon which the United States was founded.
Recommendations for Analysis and Policy Direction
To address many of the issues and concerns presented throughout this paper, more research is being done to formulate appropriate response options. Some of the notable discussions include:
Section 230 of the Communications Decency Act of 1996 – Review to determine its appropriateness to the current informational environment.46
Mandated self-regulation – Establish stable and coherent norms across social media companies, including rationales for action and intervention.47
Digital Literacy Campaigns – Adult and childhood education programs to develop the critical skills to work, learn and socialize in a digital world.48
Studies will continue to research, refine, and recommend solutions within these frameworks. However, as policy options are further developed, the threat to the nation grows, and social media continues to play an overly active role in it.
Justifying Policy Options
Under the U.S. Constitution, the foremost obligation of government is to preserve and protect the Union. Liberty University’s Professor of Government, Dr. Kahlib Fischer, writes, “God has commissioned governing authorities to
44 Bharath Ganesh and Jonathan Bright, “Countering Extremists on Social Media: Challenges for Strategic Communication and Content Moderation,” Policy & Internet 12, no. 1 (2020): 7.
45 Ganesh and Bright, “Countering Extremists on Social Media,” 7.
46 Napoli, Social Media and the Public Interest, 34.
47 Ibid., 200.
48 Singer and Brooking, Like War, 264-265.
primarily protect the citizens under their care; it is a God-given mandate for civil authorities to fight against unjust aggressors.”49 Additionally, We the People possess inalienable rights to life, liberty, and property, rights which cannot be given nor taken away and which government exists to protect.50
The First Amendment (Counter) Argument
Before policy options can be discussed, the primary challenge to any government involvement, the First Amendment’s guarantee of free speech, must be addressed. Many constitutional scholars, media experts, analysts and field professionals maintain that there can never be enough speech, or that the answer to false speech is more speech, or counterspeech.51 Others maintain that speech alone, no matter how hateful, does not violate the rights of others. 52
Recent studies suggest government has taken a laissez-faire approach to examining free speech in an effort to avoid overreach, thereby neglecting its responsibilities. As all branches of government consider whether to punish or censure information characterized as false, they fear they may inadvertently restrict truth or free speech. 53 Consequently, as in New York Times Co. v. Sullivan, government lends credibility to fake news (which can lead to harm and injury), granting liability protections to false claims and establishing a dangerous precedent.54
On the original intent of free speech, legal scholar Richard Epstein states, “the theory of freedom not only grants rights to individuals, but it also insists that there are correlative duties associated with those rights.”55 He adds that under the concept of freedom, including freedom of speech, as the Founding Fathers envisioned it based on the philosophical influences of the time, a person is permitted to do as they will unless they use force, threaten force, or misrepresent another to achieve their purpose.56 In such cases, when appropriately applying the intended principles of the First Amendment’s Freedom of Speech clause, government may prohibit misrepresentation or restrict speech if its use can lead to harm, injury or action. 57
49 Kahlib J. Fischer, “Biblical Principles of Government and Criminal Justice,” Liberty University Journal of Statesmanship & Public Policy 1, no. 1 (July 2020): 10.
50 Ibid., 4-5.
51 Napoli, Social Media and the Public Interest, 80-88.
52 Fischer, “Biblical Principles of Government and Criminal Justice,” 8.
53 Cass R. Sunstein, “Falsehoods and the First Amendment,” Harvard Journal of Law & Technology (2020): 387+.
54 Sunstein, “Falsehoods and the First Amendment,” 387+.
55 Richard A. Epstein, “Fundamentals of Freedom of Speech, The Symposium,” Harvard Journal of Law & Public Policy 10, no. 1 (1987): 54.
56 Ibid., 55-56.
57 Ibid., 56.
The Right to User Data as Property
However, as discussed, social media companies utilize platform algorithms, fed by individual users’ personal and private data, that are designed to pool, target, and manipulate users into truncated interface loops that limit one’s exposure to “counterspeech” or other content that algorithmic programming determines would likely remove the user from the market.58
As noted, social media platforms are not just communications companies; they are advertising agencies providing user data for targeted engagements by advertisers, adversaries, extremists, or other actors. 59 Generally, users are unaware their data is being collected, let alone used for targeting. Therefore, discussions should not center solely on the topic of free speech, but rather on how social media companies market user data in a manner that allows intelligence-based algorithms, or similar programs, to exploit and manipulate the user experience. This discussion would consider whether users have a right to their digital data, as they do with other personal property, and whether it should be protected as such. Philip Napoli, Professor of Public Policy at Duke University, shares that the notion of “you own your own data” is gaining momentum.60
Although users bear some personal responsibility for limiting the type of data they share with social media platforms, studies suggest that social media companies should be treated like other information fiduciaries, maintaining privacy policies similar to the attorney-client and doctor-patient confidentiality limitations of the legal and medical professions. 61 “Privacy protects us from other harms such as discrimination, public shame and reputational damage. It contributes to autonomy by giving us enough physical and mental space to be ourselves and to develop our views without undue external influence.”62
Given the current state of Congressional affairs, in which there is little traction moving legislative or regulatory measures forward to address these issues, the concept of user data rights is recommended for further research and analysis.63 Foreign adversaries, domestic extremists, and other actors have demonstrated a capability and capacity to divide society, undermine the rule of law, and weaken democracy through the malign usage of user data. Policy
58 Napoli, Social Media and the Public Interest, 107-131.
59 Christian Fuchs, Social Media: a critical introduction, 2nd ed. (Los Angeles: Sage Publications, 2017), 157.
60 Philip M. Napoli, “User Data as a Public Resource: Implications for Social Media Regulation,” Policy and Internet 11, no. 4 (2019): 449.
61 Ibid., 451.
62 Carissa Veliz, “Not the doctor’s business: Privacy, personal responsibility and data rights in the medical settings,” Bioethics 34 (2020): 714.
63 Napoli, “User Data as a Public Resource,” 441.
makers, legislators, and relevant stakeholders need to consider that individuals have a right to their digital data. If government exists to protect inalienable rights, including property rights, a framework for responsible government action could be established, one in which data rights are better protected from nefarious actors.
Mitigating the Dangers of Social Media Technology
In the U.S., other innovations, scientific advancements, and medical breakthroughs are limited by policy or regulation because of the real threats they pose to life and society when left unmitigated. Should similar frameworks be considered when looking at social media?
Social media is arguably the most powerful technology ever created.64 Sadly, it can be, and has been, weaponized by individuals, non-state actors and state actors. Mark Zuckerberg writes, “companies such as Facebook face sophisticated, well-funded adversaries who are getting smarter over time. It’s an arms race; it will take the combined forces of the U.S. private and public sectors to protect America’s democracy from outside interference.”65
During the formative years of nuclear technology, regulations were enacted to encourage the technology’s development for the benefit of civil society, to protect the public’s health and safety from the hazards of its use, and to demonstrate its promise as an innovative technology.66 However, due to the potential for harm when mishandled or weaponized, regulatory actions were taken to correct deficiencies and establish a grand strategy to ensure safe usage.67
Additionally, “vaccines are one of the most significant achievements of science and public health.” 68 The positive benefits of this technology to mankind are unmistakable. Yet, regulations are permissible, as they ensure public safety and help identify misuse by nefarious actors, such as terrorists. 69 As a result, there is a general global consensus supporting regulatory frameworks in the interest of protecting the public.
64 Mark Silverman, “Review of Like War: The Weaponization of Social Media, by P. W. Singer and Emerson T. Brooking,” International Review of the Red Cross 101, no. 1 (2019): 383-387.
65 Mark Zuckerberg, “Opinion: Protecting democracy is an arms race. Here’s how Facebook can help,” Washington Post, September 15, 2018, https://www.washingtonpost.com/opinions/mark-zuckerberg-protecting-democracy-is-an-arms-race-heres-how-facebook-can-help-win-it/2018/09/04/53b3c8ee-b083-11e8-9a6a-565d92a3585d_story.html.
66 J. Samuel Walker and Thomas R. Wellock, “A Short History of Nuclear Regulation, 1946-2009,” United States Nuclear Regulatory Commission (2010): 1-4.
67Ibid., 51-65.
68 Marion F. Gruber and Valerie B. Marshall, “Regulation and Testing of Vaccines,” Plotkin’s Vaccines, 7th ed., edited by Stanley A. Plotkin (2018): 1547-1565.
69 Gruber and Marshall, “Regulation and Testing of Vaccines,” 1564.
Few would argue that social media offers no benefit to society. It has expanded opportunities in the U.S., toppled dictators internationally, and broadcast evil doings globally.70 However, actors are increasingly capable of entering the digital battlefield and using the munitions of weaponized social media platforms to manipulate, corrupt, and grow disdain for one another and for democratic society.71
For this reason, policy makers need to acknowledge the cognitive and physical threats stemming from social media, and also recognize that social media companies might be more responsive to developing solutions if they had normative ethical and social responsibilities for the content they provide, rather than a singular focus on generating growth, users and profits.
Conclusion
Social media is a growing existential threat. Still, many in the U.S. fail to recognize it for the massive challenge it is.72 It is incumbent upon responsible statesmen to advocate for solutions that maximize liberties and rights while maintaining and protecting society and the nation. Examining social media usage and policies provides evidence of government and society being fractured. “Social media companies have become active players in digital war.”73 User data can be leveraged by others with almost no regard for risk. Battlegrounds of aggression are being created at the expense of user wellbeing and societal values. In recent memory, the sharing of opinions over social media was only encouraged, and users sought connections to build networks and bridge communities.
However, as highlighted, shared content is increasingly false, misleading, manipulated, and amplified by platform algorithms, foreign adversaries, domestic extremists and other actors. As a consequence, online content, especially polarizing content, is leading to more and more offline actions, causing harm and injury to people, communities, and democracy. Users, policy makers, and technology companies have a covenantal responsibility to take action, limit the threat, protect the people, and preserve the nation. Be warned, maintaining the status quo may further break the nation. And sadly, “[a] broken democracy is, by definition, debilitated in terms of effectively formulating and implementing the policy solutions necessary to fix itself.”74
70 Patrikarakos, War in 140 Characters, 255-267.
71 Singer and Brooking, Like War, 258-276.
72 Laura Rosenberger and Lindsay Gorman, “How Democracies Can Win the Information Contest,” The Washington Quarterly 43, no. 2 (2020): 76.
73 Sarah Oates, “The easy weaponization of social media: why profit has trumped security for U.S. companies,” Digital War (May 2020): 1.
74 Napoli, Social Media and the Public Interest, 201.
Bibliography
“Biden Begins Presidency With Positive Ratings; Trump Departs With Lowest-Ever Job Mark.” Pew Research Center, January 2021. https://www.pewresearch.org.
Blackwood, Kate. “Social media helping to undermine democracy.” Cornell Chronicle, August 2020. https://news.cornell.edu/stories/2020/08/kreps-social-media-helping-undermine-democracy.
Brown, Shelby, and Queenie Wong. “Twitter launches ‘Birdwatch’ community forum to combat misinformation.” CNET. January 25, 2021. https://www.cnet.com/news/twitter-launches-birdwatch-community-forum-to-combat-misinformation/.
Byman, Daniel L., and Aditi Joshi. “Social media companies need better emergency protocols.” The Brookings Institution. January 12, 2021. https://www.brookings.edu/blog/order-from-chaos/2021/01/12/social-media-companies-need-better-emergency-protocols/.
Conner, Adam, and Erin Simpson. “Results Not Found: Addressing Social Media’s Threat to Democratic Legitimacy and Public Safety After Election Day.” Center for American Progress, October 2020. 1-14.
Dijck, Jose Van. The Culture of Connectivity: A Critical History of Social Media. Oxford: Oxford University Press, 2013.
Domer, Paul. “De Facto State Action: Social Media Networks and the First Amendment.” Notre Dame Law Review 95, no. 1 (2019): 893+.
Eisenstat, Yael. “Dear Facebook, this is how you’re breaking democracy.” TED.com, August 2020. https://ted.com/talks/yael_eisenstat.
Epstein, Richard A. “Fundamentals of Freedom of Speech, The Symposium: The 1986 Federalist Society National Meeting.” Harvard Journal of Law & Public Policy 10, no. 1 (1987): 53-60.
Fischer, Kahlib J. “Biblical Principles of Government and Criminal Justice.” Liberty University Journal of Statesmanship & Public Policy 1, no. 1 (2020): 1-12.
Fuchs, Christian. Social Media: a critical introduction, 2nd ed. Los Angeles: Sage Publications, 2017.
Ganesh, Bharath, and Jonathan Bright. “Countering Extremists on Social Media: Challenges for Strategic Communication and Content Moderation.” Policy & Internet 12, no. 1 (2020): 6-19.
Ghaffary, Shirin. “Trump’s executive order on social media is legally unenforceable, legal experts say.” Vox. May 28, 2020. https://www.vox.com/recode/2020/5/28/21273878/trump-executive-order-twitter-social-media-section-230-free-speech-implications.
Gruber, Marion F., and Valerie B. Marshall. “Regulation and Testing of Vaccines.” In Plotkin’s Vaccines, 7th ed., edited by Stanley A. Plotkin. 2018.
Humprecht, Edda, Frank Esser, and Peter Van Aelst. “Resilience to Online Disinformation: A Framework for Cross-National Comparative Research.” International Journal of Press/Politics, 2020: 493-516.
Jones, Seth G., Catrina Doxsee, and Nicholas Harrington. “The Tactics and Targets of Domestic Terrorists.” Center for Strategic and International Studies, July 2020: 1-11.
Jones, Seth G., Catrina Doxsee, Nicholas Harrington, Grace Hwang, and James Suber. “The War Comes Home: The Evolution of Domestic Terrorism in the United States.” Center for Strategic and International Studies, October 2020: 1-12.
McMaster, H. R. Battlegrounds: The Fight to Defend the Free World. New York: HarperCollins, 2020.
Moyer, Edward. “After Twitter banned Trump, misinformation plummeted, says report.” CNET. January 16, 2021. https://www.cnet.com/news/after-twitter-banned-trump-misinformation-plummeted-says-report/.
Napoli, Philip M. Social Media and the Public Interest: Media Regulation in the Disinformation Age. New York: Columbia University Press, 2019.
Napoli, Philip M. “User Data as a Public Resource: Implications for Social Media Regulation.” Policy and Internet 11, no. 4 (2019): 439-459.
Nemr, Christina, and William Gangware. “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age.” Park Advisors, March 2020: 1-44.
Oates, Sarah. “The easy weaponization of social media: why profit has trumped security for U.S. companies.” Digital War, May 2020: 1-6.
The Social Dilemma. Directed by Jeff Orlowski. Exposure Labs, 2020. 1 hr., 34 min. https://www.netflix.com.
Patrikarakos, David. War in 140 Characters: How Social Media is Reshaping Conflict in the Twenty-First Century. New York: Basic Books, 2017.
Reagan, Gillian. “The Evolution of Facebook’s Mission Statement.” Observer, July 2009. https://observer.com/2009/07/the-evolution-of-facebooks-mission-statement/.
Rosenberger, Laura, and Lindsay Gorman. “How Democracies Can Win the Information Contest.” The Washington Quarterly 43, no. 2 (2020): 75-96.
Silverman, Mark. “Review of Like War: The Weaponization of Social Media, by P. W. Singer and Emerson T. Brooking.” International Review of the Red Cross 101, no. 1 (2019): 383-387.
Singer, P. W., and Emerson T. Brooking. Like War: The Weaponization of Social Media. Boston: Houghton Mifflin Harcourt, 2018.
Spangler, Todd. “Apple CEO Tim Cook Defends Parler Suspension.” Variety. January 15, 2021. https://variety.com/2021/digital/news/apple-ceo-defends-parler-ban-1234886383/.
Sunstein, Cass R. “Falsehoods and the First Amendment.” Harvard Journal of Law & Technology, 2020: 387+.
US Congress, Senate. Hearing before the United States Senate Committee on the Judiciary and the United States Senate Committee on Commerce, Science and Transportation. 115th Cong., 2nd sess., April 10, 2018. https://www.judiciary.senate.gov/imo/media/doc/04-10-18%20Zuckerberg%20Testimony.pdf.
US Congress, Senate. Secretary of Defense Confirmation: Hearing before the United States Senate Armed Services Committee. 117th Cong., 1st sess., January 19, 2021. https://www.armed-services.senate.gov/imo/media/doc/Austin_APQs_01-19-21.pdf.
US Department of Homeland Security. National Terrorism Advisory System Bulletin due to heightened threat environment across the United States. January 27, 2021. https://www.dhs.gov/advisories.
US Department of Justice. Investigations Regarding Violence at the Capitol. Office of Public Affairs, January 31, 2021. https://www.justice.gov/opa/investigations.
US President. Executive Order 13925. “Preventing Online Censorship.” Federal Register 85, no. 106 (June 2, 2020): 34079. https://www.govinfo.gov.
Veliz, Carissa. “Not the doctor’s business: Privacy, personal responsibility and data rights in the medical settings.” Bioethics 34 (2020): 712-718.
Walker, J. Samuel, and Thomas R. Wellock. “A Short History of Nuclear Regulation, 1946-2009.” United States Nuclear Regulatory Commission, 2010.
Walsh, James P. “Social media and the moral panics: Assessing the effects of technological change on societal reaction.” International Journal of Cultural Studies 23, no. 6 (2020): 840-859.
Wanless, Alicia. “Cut Loose by Tech Giants, Will Far-Right Extremists Be Adrift?” Carnegie Endowment for International Peace, January 19, 2021. https://carnegieendowment.org.
Williams, Pete, and Tom Winter. “FBI identifies more than 400 suspects in Capitol riot.” NBC News, January 2021. https://www.nbcnews.com/news/us-news/fbi-identifies-more-400-suspects-capitol-riot-n1255757.
Zadeh, Amir Hassan, and Anand Jeyaraj. “Alignment of business and social media strategies: insights from a text mining analysis.” Journal of Business Analytics 1, no. 2 (2018): 117-134.
Zuckerberg, Mark. “Opinion: Protecting democracy is an arms race. Here’s how Facebook can help.” Washington Post, September 2018. https://www.washingtonpost.com/opinions/mark-zuckerberg-protecting-democracy-is-an-arms-race-heres-how-facebook-can-help-win-it/2018/09/04/53b3c8ee-b083-11e8-9a6a-565d92a3585d_story.html.