Never before have humans been able to communicate and connect to the extent that modern technologies such as the Internet and social media allow today. An estimated 2.77 billion people will use social media in 2019, up from 2.46 billion users in 2017. The increased capacity for human communication that new social networks provide brings expanded potential for civic engagement, promoting political change in local, national, and even global communities. Enhancing the possibility for online engagement requires bridging the “digital divide” between people who do and do not have access to computers and the Internet.
Determining the overall effectiveness of online civic engagement requires reevaluating whether social media platforms lead to meaningful political change, or whether they simply provide an empty form of online “token activism,” where individual contributors act through “likes,” “shares,” or “tweets.” Critics argue such efforts create no real political change: they require minimal real-world effort while still providing the illusory good feeling of having “contributed” to a political cause. Proponents counter that digital engagement leads to real-world change, with processes that begin online mobilizing individuals to action.
Solving these problems is essential for developing strong online engagement between governments and citizens and thereby strengthening democratic institutions. These are not, however, the only threats that advanced information communication technologies pose to such institutions. Recent elections demonstrate the pervasive capability of foreign actors to interfere with elections and directly undermine citizen trust in the legitimacy of electoral institutions.
The average person spends nearly two hours (116 minutes) daily on social media. As increasingly sophisticated algorithms and marketing techniques keep users on these platforms for longer, this number will only grow. Such projections give attackers new opportunities to exploit vulnerabilities by developing highly sophisticated operations that include cyberattacks and disinformation as well as both covert and overt efforts aimed at electoral interference. The 2016 U.S. Presidential Election, the U.K. “Brexit” referendum, and the 2017 French Presidential Election are but a few examples of votes targeted by operations that used social media to influence electoral results in favour of a foreign state’s interests.
An earlier electoral assault occurred in Ukraine in 2014, when the pro-Russian hacker group CyberBerkut made multiple attempts to influence the May 25 Presidential Election.
The attack began four days before the national vote, when intruders penetrated Central Election Commission (CEC) computers and deleted files. On election day, within hours of the results being broadcast live on television, government experts discovered malicious software on CEC computers. Had it not been discovered, the malware would have displayed the ultra-nationalist candidate and leader of the Right Sector Party, Dmytro Yarosh, winning 37 percent of the vote, with Petro Poroshenko appearing to receive only 29 percent. These fake numbers contrasted with the actual results, in which Poroshenko won 54.7 percent of the vote and Yarosh less than 1 percent. When the polls closed, attackers targeted the tallied vote totals as they travelled to the CEC with a distributed denial-of-service (DDoS) attack, which overwhelmed the service with online traffic to make it unavailable and delayed the final vote count.
Interference in the 2016 U.S. Presidential Election attained a higher level of sophistication, with attackers using a “full spectrum of techniques” that included “military, economic, political, and information resources,” targeting social divisions and employing unrestricted measures to advance foreign actors’ interests in the electoral outcome. On February 16, 2018, the U.S. Department of Justice announced that a grand jury had indicted 13 individuals and the Internet Research Agency (IRA), a Russian company located in St. Petersburg that engaged in online political influence operations.
IRA criminal acts in the U.S. included using social media to pose as U.S. citizens, creating fake accounts, pages, and groups that “addressed divisive U.S. political and social issues, falsely claimed to be controlled by U.S. activists,” but were managed by IRA affiliates. Specific examples include impersonating Americans through IRA-associated Twitter handles and purchasing political advertisements on Facebook and Twitter, with the goal of amplifying social divisions, spreading disinformation, and generally sowing chaos in the build-up to the Presidential Election. Russia then used state media outlets such as RT and Sputnik to amplify these divisions. Additional hacks, for example against John Podesta and the Democratic National Committee, aided these operations: material from the breaches was spread through social networks to further damage the Clinton Campaign.
IRA operatives had been active in the U.S. since at least 2014, working “to track and study groups on U.S. social media sites dedicated to U.S. politics and social issues […] to gauge the performance of various groups on social media sites.” Such efforts can be described as Targeted Audience Analysis, which works to “empirically diagnose the exact groups that exist within target populations” in order to build “up a detailed understanding of current behaviors, values, attitudes, beliefs, and norms.” Understanding where social divisions are present can help patch vulnerabilities and build defenses against the points foreign actors are likely to target.
Russia’s end goals are difficult to determine. Speculation is dangerous, especially when it leads to “sculpting the narrative that they [Russia and its proxies] were stronger than they were.” Humans tend to inflate an unknown they do not understand into something greater than it is, and this reaction must be resisted when analyzing Russian interference in U.S. elections. Suggesting that Russian disinformation and cyber capabilities in the U.S. election represent a new kind of warfare risks repeating the mistake made during Russia’s 2014 annexation of the Crimean Peninsula. Describing such operations as novel applications of hybrid warfare plays into Russia’s disinformation campaign, which seeks to advance the perception that Russia has mastered some new kind of warfare, thus amplifying its capabilities beyond their true scope.
Future research must analyze how Soviet-era operational strategies have been tailored to modern technologies in pursuit of Moscow’s goals. Active Measures, for example, is a Soviet doctrine of political warfare and manipulation that encompasses a range of actions, from disinformation to assassination, directed towards specific ends. Reflexive Control is another Soviet-era theory: it involves understanding an opponent’s decision-making so as to create an environment in which the decisions the opponent makes favour Russian interests. Electoral interference in the United States and Ukraine involved both strategies, and any analysis of these cases must incorporate them.
Preventative, reactive, and deterrent measures all require understanding how actors have developed strategic operations and how those operations are applied in light of modern advances in technology. Studying Russian cyber operations, electoral interference, and mass-targeted disinformation campaigns therefore requires understanding how these operations evolved from past strategies to exploit modern technologies.
Featured Image: “Smartphone in Hand” (November 8, 2016) by Markus Spiske, temporausch.com, via Pexels: https://www.pexels.com/photo/blur-bokeh-business-connection-230860/
Disclaimer: Any views or opinions expressed in articles are solely those of the authors and do not necessarily represent the views of the NATO Association of Canada.