Disinformation, Disruption, and Distance: Public Confidence in the U.S. Military in the COVID-19 Era
As COVID-19 spread around the globe—effectively redefining the mainstream understanding of national security—misinformation spread across the Internet. The pandemic created a digital playground for state actors such as Russia, China, and Iran to sow chaos through disinformation. Key leaders, from the United Nations Secretary-General to former President Obama, raised awareness of the threat of misinformation. Despite these warnings, misinformation seems to have taken root: nearly 30% of Americans believe in some COVID-19 conspiracy theory.
Foreign governments increasingly use the U.S. military as an object of conspiracy, and the post-COVID-19 era presents prime targets. Even before the death of George Floyd, China stoked fears of martial law through a direct messaging campaign, and Russian media provocatively questioned the use of military forces on American streets. Since then, Chinese, Russian, and Iranian operations have sought to weave together social division, governance failures, and longstanding hypocrisies, with American militarism as a centerpiece. Societal perceptions of the U.S. military are a soft target for disinformation: the American audience's strong positive affect toward the military rests on thin factual foundations, offering foreign agents and willing U.S. third-party actors an opening to redirect that sentiment into an effective messaging campaign. The COVID-19 pandemic is just the latest manifestation of this phenomenon.
The fears of U.S. military involvement at home come at a time when trust in the military is high and possibly driven by its very absence from everyday American life. The public recognizes that U.S. military members bear a disproportionate burden of sacrifice and exposure to hardship and, in turn, repays that debt with confidence, trust, and positive affect. Yet this sentiment may be superficial, hollow, and susceptible to well-crafted narratives. Amidst the COVID-19 crisis, trust in institutions and leadership faces a unique vulnerability that foreign actors are poised to exploit. This article describes a unique nexus of institutional confidence and societal vulnerability to foreign disinformation, the prevalent tactics used to leverage the American information ecosystem, and ways the U.S. military can better support the society it is charged to defend.
The Audience: America’s Hollow Trust is a Vulnerability
Why would anyone target an obvious strength of U.S. society? The first answer to this question originates from the vulnerability’s source—America’s unjustified trust in its military. This trust is unjustified in the sense that many Americans do not really know what they trust or why. Former Chairman of the Joint Chiefs Admiral Mullen captured this reality in his commencement speech to the United States Military Academy’s graduating class of 2011. He assured the newest batch of officers they held the nation’s trust, yet he offered a caution, saying, “I fear they do not know us.”
American confidence in the military is perhaps best described as a "cordial indifference": an institutional confidence based on respect and national norms, not facts. Perhaps this lack of detailed knowledge is not surprising. Personal connections to the military have decreased in line with the shrinking size of the force and America's growing population. New recruits draw disproportionately from the families of veterans. Many citizens must learn about the military through second- or third-hand sources. Despite this distance, however, the military consistently records high levels of public confidence and trust.
There are multiple examples of how second-hand information breeds misperception. Many hold the military in high regard while also believing negative stereotypes about service members. Americans commonly misattribute emotional or psychological instability to veterans' sacrifice. Contrary to common understanding, a military career is no longer a reliable ticket to the middle class. Nonetheless, Americans consume and distribute this affect because it circulates in a highly networked information environment where support for the U.S. military is the societal norm. Institutional trust based on misperception breeds within U.S. social circles and entrenched norms. Thus, a wide segment of the population's trust is built not just on misperception but on misinformation—incorrect information spread without malign intent. Communal misinformation is not just the stuff of legend—it is also the stuff of myth, conspiracy, and vulnerability. Strong but uninformed positive views of the military present fertile ground for manipulation; where the military marches, supporters blindly follow.
The Agents: Exploiting a Narrative
Taking aim at this vulnerability is a unique nexus of social access, politicization, and disinformation between like-minded actors who seek to repurpose the U.S. military image for their desired ends. The result is a more robust, opportunistic, and malign information ecosystem—especially for manifestations of the U.S. military institution.
Misinformation and misperception quickly attract disinformation when actors distribute misleading content with harmful intent to a vulnerable audience. Societal perceptions like institutional confidence are akin to a wall: when laying siege, a malign actor targets any and all existing cracks. These weaknesses are underlying but known issues that, when combined, paint a hypocritical picture of the larger narrative. Information operations like Russian New Generation Warfare are continuous efforts, not discrete campaigns, that aim to generate permanent domestic unrest in an adversary. A pandemic or election merely serves as an opportunity to exploit vulnerable political candidates or salient issues. This mindset is opportunistic, decentralized, and shaped by local conditions. It is a strategy defined by chaos, harnessing vulnerabilities in the target country to achieve an omnipresent image that resonates with a target audience.
The #FireMcMaster campaign exemplifies this latent nexus. In 2017, reports spread of discord between so-called globalist and isolationist camps within the Trump administration. Soon after, online actors subjected Lieutenant General H.R. McMaster—then the National Security Advisor—to a full-blown information manipulation operation. The campaign had a domestic audience; it grew within the U.S. alt-right. Russia exploited the campaign, redistributing it through U.S. third-party actors aligned with Russian media outlets and amplifying it with online bot activity. The McMaster episode reflects not only the proliferation of common tactics within a polarized society but also a change in norms. The U.S. military image did not shield McMaster; rather, it silhouetted him as a target. High societal confidence did not dissuade readers; it made it all the more salacious to associate a U.S. military member with the so-called Deep State—a term used by some to denigrate career civil servants suspected of malicious and partisan intent. The U.S. military is fair game for disinformation, and that is great news for malign actors.
#FireMcMaster is just one example of an increasingly common pattern of information manipulation—distribution of forged or distorted content by various misleading personas, in multiple languages, and via official media channels. This simple framework generates improved access to local context, dissemination, and credibility. This decentralized volume can generate digital force in an information ecosystem. Individual controversial targets, when consolidated, corrode narratives, even those perceived as strongly held.
These techniques feed on third-party content to better channel local information and context. Individual profiteers frame false or misleading content in pursuit of influence. Electoral aspirations incentivize third-party innovation in pursuit of more emotionally resonant and targeted voter messaging. This also armors elites with the liar's dividend: the ability to propagate falsehoods without reputational cost. The process leverages communication at its most powerful, as a ritual of shared beliefs and not just a means of transmission. So long as a society is willing to participate, third-party amplification makes disinformation not just more likely but also more impactful because of its intimate communal connection.
Success brings not just imitators, but alternate points of social entry. Russia achieves relatively deep penetration of the Christian conservative and ultra-nationalist space. Iran promotes regional political protest to reinforce its soft war narratives. China's Three Warfares extend well into its global diaspora and dominant economic relationships.
The Messages: A Common Target and New Opportunities for Chaos
"2/2 CDC was caught on the spot. When did patient zero begin in US? How many people are infected? What are the names of the hospitals? It might be US army who brought the epidemic to Wuhan. Be transparent! Make public your data! US owe us an explanation! pic.twitter.com/vYNZRFPWo3" — Lijian Zhao 赵立坚 (@zlj517), March 12, 2020
In this environment, the U.S. military is a common target of interest, and COVID-19 represents another opportunity to sow chaos. It is no coincidence that the U.S. military's brand is useful for pushing narratives of U.S. hostility and elite conspiracy. China, Russia, and Iran place the U.S. military at the center of bioweapon conspiracies in Maryland, Kazakhstan, and Wuhan, alleging subversive delivery and ethnic targeting. However, chaos seeks multiple avenues. Russian articles exploit the misgivings of U.S. citizens about troops policing their streets and of Europeans about U.S. troops hoarding medical resources. Chinese articles narrate a loss of U.S. military capability and irresponsible deployments. Iranian reporting presents the U.S. as protecting Israeli troops over its own amid the crisis.
Similar in strategy, target, and tactics, these state actors generate a flattened network: one actor's narrative is both improved fuel and cultural access for another. Content is cheap and easily reused to inspire fear, hatred, and division within an American audience already torn between future options and past outcomes in the midst of an ongoing pandemic. This information ecosystem can generate a narrative synergy strong enough to breach any societal perception. #FireMcMaster shows a sizable segment of the U.S. populace is a willing third party to the manipulation of its military's image for social, political, or financial profit. As the nation fights both a pandemic and an election, it must be prepared for a synergy of online influence challenging its most respected institution.
What could a campaign to exploit the American military perception look like? Four structural weaknesses stand out. First, political polarization matters, whether as a tool of domestic parties or through the perceived political leanings of the institution itself. Second, questionable administration, such as Department of Veterans Affairs failures or a Department of Defense budget consuming a growing share of tightening discretionary spending, creates opportunity. Third, military governance failures such as sexual violence, mental health, and false reporting are a perennial problem. Finally, problematic actions such as drone strikes, operational atrocities, post-9/11 quagmires overseas, and domestic troop deployments are ripe for exploitation.
Institutional Awareness: The U.S. Military Must Understand the Terrain it Occupies
The combined challenges of politicization, flattened information ecosystems, and disinformation make this primarily an awareness problem for the Department of Defense. First, disinformation is more than a cybersecurity problem, which is how the Department largely treats it. Nor is it as clean-cut as echo-chamber theory suggests. Disinformation is about shared truths, and the shared truth about the U.S. military lacks a factual base, leaving a vulnerable foundation.
Second, American society is unlikely to repair itself. America's hollow institutional trust does not exist in a vacuum; it is awash in a domestic ecosystem that both incentivizes and facilitates the manipulation of information. Digital targeting and disinformation remain two sides of the same valuable coin. Rather than retreat, domestic and third-party users will increasingly supercharge information in pursuit of a digital edge.
Third, the U.S. military and its high public confidence are a political resource for numerous actors seeking legitimacy. The institution is inextricably caught in the web that binds together the media, domestic polarization, and adversary disinformation. The U.S. military strives to be apolitical, but it might more realistically seek to avoid partisan entanglements. By failing to correct the improper use of its own image, the U.S. military can inadvertently feed perceptions of partisanship.
Finally, fact-checking is necessary but insufficient. In empirical analysis, fact-checking produces limited results and may be as socially contingent as disinformation itself. These methods, while relatively cheap, are hard to make stick, in part because mere exposure drives perceptions of accuracy and fact-checking fails to address underlying partisanship. Information literacy and user-context campaigns can carry negative consequences depending on the messenger and on polarization, as vaccine-promotion campaigns have demonstrated.
These old problems may find new, more effective means of distribution. Leaked information, obtained through hacking and blackmail of senior figures and associations, will find augmentation in a world of deepfakes and generative adversarial networks. Impersonation and manipulation are already underway within the active-duty and veteran community through targeted disinformation, phishing, and hacking. In the ongoing battle for the American mind, the impenetrable strength of U.S. military confidence is anything but certain.
Recommendations: Removing the Oxygen from Fire
Faced with potential adversaries or political entrepreneurs who seek to exploit chaos and spark fires, the U.S. military is just another form of societal kindling. A lesson from private industry and academia is to combat not the existence of disinformation but its virality—to shape behavior and remove the oxygen. Given these realities, two supporting courses of action present a way forward for the Department of Defense.
First, the Department of Defense should ensure its members do not actively contribute content that generates or supports disinformation. The Department should recognize that the current online environment no longer allows a clean distinction between approved political activities and professional online conduct. This policy gap permits increased social media interface with the public, but it also reduces accountability for messaging that can disproportionately skew popular perceptions, such as anonymous social media accounts and closed social media groups. New policy should recognize the perceptual stakes of online political behavior and hold service members accountable for it as existing policy does for in-person political behavior. This action could bridge potential divisions within the U.S. military and deny content sources to malign actors.
Second, the U.S. military should advise Americans when the military's image is manipulated for malign purposes. One promising technique worth considering is attitudinal inoculation. This approach models the spread and uptake of disinformation as a virus and attempts to pre-empt public attitudes before manipulation takes hold. The technique builds resilience by exposing audiences to weakened examples of false claims that challenge popular viewpoints and then refuting them. Attitudinal inoculation combats gullibility, not the information itself. It has been successfully applied in health, political campaign, and climate change settings, outperforming information literacy techniques.
This effort would highlight how the U.S. military could be manipulated in future contexts. Research suggests the institution can steer the campaign toward impactful demographics and themes. Demographically, the uptake of misinformation is acutely influenced by elites. The U.S. military could use its platform to call balls and strikes on claims Americans might see from elites outside the chain of command, pre-empting misinformed or incorrect contentions about the military with factual information. This concept uses suasion to exact reputational costs and decrease misinformed dissemination. Alternatively, a counter-campaign could focus on building broader themes. Such a campaign could highlight expansive definitions of national service, such as teachers, medical providers, and civil servants; these professions bear the brunt of the COVID-19 response. Cross-cutting exposure of this kind builds empathy between the service member and non-service member communities, reducing polarization.
The Department of Defense should partner with academia and private industry to research, pilot, and roll-out these initiatives. This public-private partnership can jointly examine effective and ethical delivery methods. By improving the means by which it polices its ranks and advises the public, the U.S. military can better fulfill one of its most important duties—to reach out to civilians and build a shared understanding that grounds a healthy civil-military relationship.
Too often, the U.S. defends against disinformation from its heels—reacting, not anticipating; facilitating, not preventing. The U.S. military currently retains both the popularity and the professionalism to responsibly influence this outcome. Many view the public's goodwill toward the U.S. military as an institutional benefit. However, in the all-volunteer force era, the distance between society and its military means individual perceptions do not originate from firsthand knowledge; society relies on a messenger to shape its view. Some argue this is advantageous—the U.S. military benefits from being out of sight and mind of most Americans, since familiarity, the logic goes, breeds both fondness and contempt. Yet foreign actors stand ready and able to shape that view as well. Inaction will increasingly bear costs. Over time, persistent cuts to the narrative of a reliable military threaten to drive a wedge through one of the few universal feelings in American society.
If U.S. military trust declines, two institutional consequences will gradually emerge. From the bottom up, the U.S. military risks the operational effectiveness garnered by its ability to recruit and retain committed talent. From the top down, it risks the strategic influence of best military advice to civilian leadership, an influence enabled by popular legitimacy. If the U.S. military loses the confidence of society or its civilian leaders, the institution is at best reduced to an impediment to progress or, worse yet, an enemy. Were that to happen, the United States would have many more problems than foreign disinformation; it would be in conflict with itself.
This article appeared originally at Strategy Bridge.
 “United Nations Launches Global ‘Pause’ Campaign to Tackle Spread of Misinformation,” United Nations, June 30, 2020, https://www.un.org/sites/un2.un.org/files/pause_pr_final_30jun.pdf; Allyson Chiu, “Speak the Truth: Obama Says ‘Biggest Mistake’ Mayors Can Make in Coronavirus Pandemic is to Misinform the Public,” The Washington Post, April 10, 2020, https://www.washingtonpost.com/nation/2020/04/10/obama-coronavirus-misinformation/.
 Katherine Schaeffer, “Nearly Three-in-Ten Americans Believe COVID-19 Was Made in a Lab,” Pew Research Center, April 8, 2020, https://www.pewresearch.org/fact-tank/2020/04/08/nearly-three-in-ten-americans-believe-covid-19-was-made-in-a-lab/.
 Edward Wong, Matthew Rosenberg, and Julian Barnes, “Chinese Agents Helped Spread Messages That Sowed Virus Panic In U.S., Officials Say,” The New York Times, April 23, 2020, https://www.nytimes.com/2020/04/22/us/politics/coronavirus-china-disinformation.html; “Never Mind the HUMVEES! Maryland National Guard Reassures Residents There’s No Martial Law Threat Amid Virus Panic,” Russia Today, March 21, 2020, https://www.rt.com/usa/483714-national-guard-martial-law-coronavirus/.
 Givi Gigitashvili, “Russia, China, Iran Exploit George Floyd Protests in U.S.,” DFRlab, June 4, 2020, https://medium.com/dfrlab/russia-china-iran-exploit-george-floyd-protests-in-u-s-6d2a5e56c7b9.
 “Americans’ Trust in Military, Scientists Relatively High; Fewer Trust Media, Business Leaders, Elected Officials,” Pew Research Center, March 22, 2019, https://www.pewresearch.org/ft_19-03-21_scienceconfidence_americans-trust-in-military/; Andrew A. Hill, Leonard Wong, and Stephen J. Gerras, “Self-Interest Well Understood: The Origins and Lessons of Public Confidence in the Military,” Daedalus, the Journal of the American Academy of Arts and Sciences 142, no. 2 (Spring 2013): 49-50.
 “War and Sacrifice in the Post 9/11 Era,” Pew Research Center, October 5, 2011, https://www.pewsocialtrends.org/2011/10/05/war-and-sacrifice-in-the-post-911-era/2/.
 James Fallows, “The Tragedy of the American Military,” The Atlantic, January/February 2015, https://www.theatlantic.com/magazine/archive/2015/01/the-tragedy-of-the-american-military/383516/.
 Claes Wallenius, Carina Brandow, Anna Karin Berglund, and Emma Jonsson, “Anchoring Sweden’s Post-Conscript Military: Insights From Elites in the Political and Military Realm,” Armed Forces and Society 45, no. 3: 453.
 Thom Shanker, “At West Point, A Focus on Trust,” The New York Times, May 21, 2011, https://www.nytimes.com/2011/05/22/us/22mullen.html.
 Andrew J. Bacevich, Breach of Trust: How Americans Failed Their Soldiers and Their Country (New York: Metropolitan Books, 2013), 110.
 Dave Phillips and Tim Arango, “Who Signs Up to Fight? Makeup of U.S. Recruits Shows Glaring Disparity,” The New York Times, January 10, 2020, https://www.nytimes.com/2020/01/10/us/military-enlistment.html; Amy Schafer, “The Warrior Caste,” Slate, August 2, 2017, https://www.cnas.org/publications/commentary/the-warrior-caste.
 Lydia Saad, “Military, Small Business, Police Still Stir Most Confidence,” Gallup, June 28, 2018, https://news.gallup.com/poll/236243/military-small-business-police-stir-confidence.aspx.
 Bryant Jordan, “Poll: America Values Vets but Stereotypes Them,” Military.com, June 14, 2012, https://www.military.com/daily-news/2012/06/14/poll-america-values-vets-but-stereotypes-them.html.
 “The American Veteran Experience and the Post 9/11 Generation,” Pew Research Center, September 9, 2019, https://www.pewsocialtrends.org/2019/09/09/deployment-combat-and-their-consequences/.
 “Recruits to America’s Armed Forces Are Not What They Used To Be,” The Economist, April 18, 2020, https://www.economist.com/united-states/2020/04/16/recruits-to-americas-armed-forces-are-not-what-they-used-to-be.
 See Warriors and Citizens: American Views of Our Military, edited by Kori Schake and Jim Mattis (Stanford: Hoover Institution Press, 2016). Particularly Rosa Brooks, “Civil-Military Paradoxes,” 28; Jim Golby, Lindsay P. Cohn, and Peter D. Feaver, “Thanks for Your Service: Civilian and Veteran Attitudes after Fifteen Years of War,” 124; Kori Schake and Jim Mattis, “Ensuring a Civil-Military Connection,” 263-267.
 This article defines misinformation as false information that is shared without intending harm to the interpreter or target. See Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward An Interdisciplinary Framework for Research and Policy Making,” Council of Europe, September 27, 2017, https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c, 5.
 Timothy Thomas, “The Evolving Nature of Russia’s Way of War,” Military Review (July/August 2017): 37-39.
 Laura Rosenberger, “Prepared Statement Concerning: ‘Undermining Democracy: Kremlin Tools of Malign Political Influence’,” United States House Committee on Foreign Affairs, May 21, 2019, https://docs.house.gov/meetings/FA/FA14/20190521/109537/HHRG-116-FA14-Wstate-RosenbergerL-20190521.pdf, 1.
 Mark Galeotti, “Controlling Chaos: How Russia Manages Its Political War in Europe,” European Council On Foreign Affairs, August 2017, https://www.ecfr.eu/page/-/ECFR228_-_CONTROLLING_CHAOS1.pdf, 2.
 Donald Jensen and Peter Doran, “Chaos As a Strategy,” Center for European Policy Analysis, November 2018, https://www.cepa.org/chaos-as-a-strategy, 26.
 Donara Barojan, “#FireMcMaster, Explained,” DFRlab, August 7, 2017, https://medium.com/dfrlab/firemcmaster-explained-9e9018e507c2.
 For a full account, see Peter Bergen, Trump and His Generals: The Cost of Chaos (New York: Penguin Random House, 2019), 151-153.
 Rosie Gray, “The War Against H.R. McMaster,” Politics, The Atlantic, August 4, 2017, https://www.theatlantic.com/politics/archive/2017/08/the-war-against-hr-mcmaster/.
 In a three-day span in 2017, Infowars and Breitbart combined to publish fourteen negative stories on McMaster, which reverberated through alt-right media channels and among American media elites such as Lee Stranahan, a paid Sputnik personality. See Barojan, “#FireMcMaster, Explained.” Days later, hundreds of Russian Internet Research Agency-linked Twitter bots peddled the #FireMcMaster hashtag. See Brett Schafer, “A View From the Digital Trenches: Lessons From One Year of Hamilton 68,” Alliance For Securing Democracy, November 9, 2018, https://securingdemocracy.gmfus.org/a-view-from-the-digital-trenches-lessons-from-year-one-of-hamilton-68/.
 Nika Aleksejeva et al., “Operation Secondary Infektion: A Suspected Russian Intelligence Operation Targeting Europe and the United States,” The Atlantic Council, https://www.atlanticcouncil.org/wp-content/uploads/2019/08/Operation-Secondary-Infektion_English.pdf, 4.
 Ben Nimmo, “Deliberate Online Falsehoods--Methods and Responses,” February 22, 2018, https://www.parliament.gov.sg/docs/default-source/sconlinefalsehoods/written-representation-36.pdf, 3.
 Ian Sherr, “Facebook, Cambridge Analytica and Data Mining: What You Need To Know,” CNET, August 18, 2018, https://www.cnet.com/news/facebook-cambridge-analytica-data-mining-and-trump-what-you-need-to-know/.
 Robert Chesney and Danielle Citron, “Deepfakes and The New Disinformation War,” Foreign Affairs, January/February 2019, https://www.foreignaffairs.com/articles/world/2018-12-11/deepfakes-and-new-disinformation-war.
 Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward An Interdisciplinary Framework for Research and Policy Making,” Council of Europe, September 27, 2017, https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c, 14-15.
 Bethany Allen-Ebrahimian, “China Takes a Page from Russia’s Disinformation Playbook,” Axios, March 25, 2020, https://www.axios.com/coronavirus-china-russia-disinformation-playbook-c49b6f3b-2a9a-47c1-9065-240121c9ceb2.html.
 Frederik Hjorth and Rebecca Alder-Nissen, “Ideological Asymmetry in the Reach of Pro-Russian Digital Disinformation to the United States Audience,” Journal of Communication 69, no. 2 (April 2019): 168; Heather Conley, “Prepared Statement: ‘Undermining Democracy: Kremlin Tools of Malign Political Influence’,” House Foreign Affairs Subcommittee on Europe, Eurasia, Energy, and the Environment, May 21, 2019, https://docs.house.gov/meetings/FA/FA14/20190521/109537/HHRG-116-FA14-Wstate-ConleyH-20190521.pdf, 5.
 “Iran Military Power: Ensuring Regime Survival and Securing Regional Dominance,” Defense Intelligence Agency, 2019, 23; Brandon Wallace and Katherine Lawlor, “Iraq Situation Report: January 14-16, 2020,” Institute for the Study of War, January 17, 2020, http://www.iswresearch.org/2020/01/iraq-situation-report-january-14-16-2020.html.
 Peter Mattis, “China’s Three Warfares in Perspective,” War on the Rocks, January 30, 2018, https://warontherocks.com/2018/01/chinas-three-warfares-perspective/; Timothy Heath, “Beijing’s Influence Operations Target Chinese Diaspora,” War on the Rocks, March 1, 2018, https://warontherocks.com/2018/03/beijings-influence-operations-target-chinese-diaspora/; Caleb Slayton, “Africa: The First U.S. Casualty of the New Information Warfare Against China,” War on the Rocks, February 3, 2020, https://warontherocks.com/2020/02/africa-the-first-u-s-casualty-of-the-new-information-warfare-against-china/.
 “Throwing Coronavirus Disinfo at the Wall to See What Sticks,” EU vs Disinfo, April 2, 2020, https://euvsdisinfo.eu/throwing-coronavirus-disinfo-at-the-wall-to-see-what-sticks/.
 Sarah Jacobs Gamberini and Amada Moodie, “The Virus of Disinformation: Echoes of Past Bioweapons Accusations in Today’s COVID-19 Conspiracy Theories,” War on the Rocks, April 6, 2020, https://warontherocks.com/2020/04/the-virus-of-disinformation-echoes-of-past-bioweapons-accusations-in-todays-covid-19-conspiracy-theories/; Kasra Aarabi, “Iran Knows Who to Blame for the Virus: America and Israel,” Foreign Policy, March 19, 2020, https://foreignpolicy.com/2020/03/19/iran-irgc-coronavirus-propaganda-blames-america-israel/.
 Luka Andriukaitis, “Pro-Kremlin Media Spins Story of U.S. Military Transporting COVID-19 Test Swabs From Italy,” DFRlab, March 28, 2020, https://medium.com/dfrlab/pro-kremlin-media-spins-story-of-u-s-military-transporting-covid-19-test-swabs-from-italy-548b98c0435d.
 Guo Yuandan, “COVID-19 to Have Serious Impact on U.S. Military’s Global Influence: Chinese Experts,” Global Times, March 24, 2020; Liu Xuanzun and Guo Yuandan, “U.S. Sending Destroyer Through Taiwan Straits During a Pandemic a Reckless Move: Experts,” Global Times, March 27, 2020, https://www.globaltimes.cn/content/1183928.shtml.
 “U.S. Military Gives 1 Million Masks to Israel as American Troops Have to Make Their Own Amid COVID-19 Outbreak,” PressTV, April 8, 2020, https://www.presstv.com/Detail/2020/04/08/622584/US-masks-Israel-coronavirus-covid19.
 David Robson, “The Myth of the Online Echo Chamber,” BBC News, April 16, 2018, https://www.bbc.com/future/article/20180416-the-myth-of-the-online-echo-chamber.
 Chris Zappone, “Australia's Bushfires Show Why Democracy Requires Shared Truths,” The Sydney Morning Herald, January 12, 2020, https://www.smh.com.au/world/oceania/australia-s-bushfires-show-why-democracy-requires-shared-truths-20200110-p53qf1.html.
 Anthony Nadler, Matthew Crain, and Joan Donovan, “Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech,” Data and Society, accessed April 20, 2020, https://datasociety.net/wp-content/uploads/2018/10/DS_Digital_Influence_Machine.pdf, 4.
 Maddie McGarvey, “Trump Won the Internet. Democrats Are Scrambling to Take It Back,” The New York Times, March 30, 2020, https://www.nytimes.com/2020/03/30/us/politics/democrats-digital-strategy.html.
 Brian Babcock-Lumish, “Uninformed, not Uniformed? The Apolitical Myth,” Military Review (September-October 2013), https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/MilitaryReview_20131031_art009.pdf, 48-55.
 Risa Brooks, “Paradoxes of Professionalism: Rethinking Civil-Military Relations in the United States,” International Security 44, no. 4 (Spring 2020), https://www.mitpressjournals.org/doi/pdf/10.1162/isec_a_00374, 24.
 Drew Margolin, Aniko Hannak, and Ingmar Weber, “Political Fact-Checking on Twitter: When Do Corrections Have An Effect?” Political Communication 35, no. 2 (September 2017): 196.
 Gordon Pennycook, Tyrone Cannon, and David Rand, “Prior Exposure Increases Perceived Accuracy of Fake News,” Journal of Experimental Psychology 147, no. 12 (December 2018): 1865; Stewart Scott, “Hearts and Minds: Misinformation, Polarization, and Resistance to Fact-Checking,” DFRlab, June 23, 2020, https://medium.com/dfrlab/hearts-and-minds-misinformation-polarization-and-resistance-to-fact-checking-8868c355d1f1.
 For information on the unintended consequences of information campaigns in promoting vaccinations, see Brendan Nyhan, Jason Reifler, Sean Richey, and Gary Freed, “Effective Messages in Vaccine Promotion: A Randomized Trial,” Pediatrics 133, no. 4 (April 2014), https://pediatrics.aappublications.org/content/133/4/e835.
 Karen Yourish, “How Russia Hacked the Democrats,” The New York Times, July 13, 2018, https://www.nytimes.com/interactive/2018/07/13/us/politics/how-russia-hacked-the-2016-presidential-election.html; Christian Caryl, “Beware of the Dark Art of Russian Blackmail,” The Washington Post, January 11, 2017, https://www.washingtonpost.com/news/global-opinions/wp/2017/01/11/beware-the-dark-art-of-russian-blackmail/; Christopher Paul and Marek Posard, “Artificial Intelligence and the Manufacturing of Reality,” The Strategy Bridge, January 20, 2020, https://thestrategybridge.org/the-bridge/2020/1/20/artificial-intelligence-and-the-manufacturing-of-reality.
 Ben Schreckinger, “How Russia Targets the U.S. Military,” Politico, June 12, 2017, https://www.politico.com/magazine/story/2017/06/12/how-russia-targets-the-us-military-215247; Kristofer Goldsmith, “An Investigation Into Foreign Entities Who Are Targeting Servicemembers and Veterans Online,” Vietnam Veterans of America, 6, https://vva.org/wp-content/uploads/2019/09/VVA-Investigation.pdf.
 “DOD Directive 1344.10: Political Activities by Members of the Armed Forces,” Department of Defense, February 19, 2008, https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/134410p.pdf; “ALARACT 058/2018: Professionalization of Online Conduct,” U.S. Army, July 25, 2018, https://www.army.mil/e2/downloads/rv7/socialmedia/ALARACT_058_2018_PROFESSIONALIZATION_OF_ONLINE_CONDUCT.pdf.
 Gerald-Mark Breen and Jonathan Matusitz, “Inoculation Theory: A Theoretical and Practical Framework for Conferring Resistance to Pack Journalism Tendencies,” Global Media Journal 8, no. 14 (2009): 2, http://www.globalmediajournal.com/open-access/inoculation-theory-a-theoretical-and-practical-framework-for-conferring-resistance-to-pack-journalism-tendencies.pdf.
 Van der Linden et al.’s (2017) randomized controlled trial investigates the impact of attitudinal inoculation on misinformation about the scientific consensus on climate change. Their work suggests that the sequencing of exposure to misinformation and counter-messaging is critical in shaping perceptions. Experimental groups received a consensus message (the scientific consensus on climate change), misinformation about climate change, an inoculation message, or different orderings of these treatments. Misinformation eroded agreement with the scientific consensus even after the consensus message was introduced, but pre-empting the misinformation with “inoculating” information improved perceived scientific consensus. See Sander van der Linden, Anthony Leiserowitz, Seth Rosenthal, and Edward Maibach, “Inoculating the Public Against Misinformation About Climate Change,” Global Challenges 1 (2017), https://onlinelibrary.wiley.com/doi/pdf/10.1002/gch2.201600008.
 Brendan Nyhan, “Why Fears of Fake News Are Overhyped,” Medium, February 4, 2019, https://gen.medium.com/why-fears-of-fake-news-are-overhyped-2ed9ca0a52c9.
 Brendan Nyhan and Jason Reifler, “Misinformation and Fact-Checking: Research Findings from Social Science,” New America Foundation, February 2012, 2, https://www.dartmouth.edu/~nyhan/Misinformation_and_Fact-checking.pdf.
 Jim Golby and Heidi Urben, “Thank Them for Their Service,” Army Times, March 29, 2020, https://www.armytimes.com/opinion/commentary/2020/03/29/thank-them-for-their-service/.
 Diana C. Mutz, “The Consequences of Cross-Cutting Networks for Political Participation,” American Journal of Political Science 46, no. 4 (October 2002): 838.
 Martin E. Dempsey, “The Military Needs to Reach Out to Civilians,” The Washington Post, July 3, 2013, https://www.washingtonpost.com/opinions/general-dempsey-the-military-needs-to-reach-out-to-civilians/2013/07/02/b10c3bb0-e267-11e2-aef3-339619eab080_story.html.
 Andrew A. Hill, Leonard Wong, and Stephen J. Gerras, “Self Interest Well Understood: The Origins and Lessons of Public Confidence in the Military,” The Journal of the American Academy of Arts and Sciences 142, no. 2 (Spring 2013): 49–50.
 Michael I. Norton, Jeana H. Frost, and Dan Ariely, “Less Is More: The Lure of Ambiguity, or Why Familiarity Breeds Contempt,” Journal of Personality and Social Psychology 92, no. 1 (2007): 97–105.