Centre for Disinformation Studies | Joseph McQuade and Ben Patterson

Attempts to undermine Canadian election through disinformation are symptoms of a global problem

The Communications Security Establishment (CSE), Canada’s foreign signals intelligence agency, recently released a report predicting election interference in Canada on a greater scale than the country has ever seen. The CSE’s mandate is to covertly collect information from phone, radio, internet, and other signal-based communication networks in order to inform national decision-making. Most recently, the CSE’s Centre for Cyber Security, founded in 2018, has warned of the possibility of foreign-directed election interference originating in cyberspace.

The wide availability of cyber hacking and social media influence tools, combined with the grey areas surrounding proper responses to this kind of interference, makes it no surprise that foreign and domestic actors will seek to interfere in our coming electoral process by spreading disinformation designed to mislead the Canadian public on key issues. Because this type of interference targets average Canadian voters, the question becomes: what can Canadian voters do to protect themselves from disinformation?

Disinformation and Canadian society

The answer becomes clearer when we examine how the CSE predicts this interference will happen. The CSE observes that the goal of foreign attackers is to create divisions within the nation by polarizing the political process. This is particularly concerning for a country like Canada, which is home to the competing national narratives of a robust multiculturalism on the one hand and a long history of exclusionary settler racism on the other.

Despite the old cliché of Canada as a mosaic rather than a melting pot, divisions remain. Whether in the divide between progressives and conservatives, settlers and First Nations, Francophones and Anglophones, immigrants and ‘old stock’ Canadians, or a range of other ideological and identitarian fault lines, Canadian society remains vulnerable to the kind of divisive rhetoric that disinformation campaigns actively foster.

So how can Canadians resist the influence of deliberate disinformation campaigns? The answer is easy to propose but harder to implement – Canadians must resist divisive politics at all costs. In the coming election, Russia – and likely other foreign and domestic actors as well – will seek to amplify extreme perspectives and divide Canadian society against itself.

If electoral interference in the US or UK is any indication, so-called ‘troll farms’ and other sources of online disinformation will exploit existing divisions and amplify the voices of those seeking to scapegoat certain communities for the country’s problems. Canadian voters – whether they support the Liberals, Conservatives, NDP, or another party – should think critically about the media they consume and should demand a high standard of evidence from any source making outrageous or extreme claims. It is worth remembering that the political process is complicated, and any point that seems obviously simple or straightforward usually deserves closer scrutiny.

Canada’s struggle with disinformation in global perspective

In the United States, the Mueller Report’s recent determination that President Donald Trump did not actively collude with Moscow during the election is welcome news – and not just for the president. However, it does not change the fact that the same investigation has indicted 34 individuals – including 26 Russian nationals – and three Russian companies. Whether or not the Trump campaign colluded with Russian attempts to subvert democratic processes during the 2016 US election has no bearing on the well-documented fact that these Russian attempts did in fact take place.

Disinformation, or ‘fake news’, is an essential part of Russia’s foreign policy toolkit and has already been deployed as a strategy of destabilization in scenarios as diverse as the UK Brexit referendum, election tampering in Ukraine, the North American anti-vaccination movement, and the Syrian civil war. Russian use of aktivniye meropriyatiya – so-called ‘active measures’ – dates back to the Cold War, but has taken on new life in the digital age. The central goal of Russian disinformation appears to be the undermining of public confidence in democratic institutions and international organizations. Even public health issues that seem unconnected to foreign affairs, such as vaccination, provide useful entry points for sowing broader discord that undermines political unity and cohesion within NATO and EU member countries.

It would be a mistake to assume that the dangers of ‘fake news’ are limited to foreign-sponsored disinformation campaigns. Alongside the democratizing and emancipatory potential of new digital technologies that have helped combat racial injustice and oppressive regimes, social media also provides enormous potential for non-state actors, political parties, or extreme groups to manipulate information and advance their own interests.

Experts warn that disinformation could play a key role in the upcoming national election in India, the world’s largest democracy. With as many as 879 million eligible voters heading to the polls, false reports, rumours, and deliberately curated disinformation are currently being disseminated through the popular platforms Facebook and WhatsApp at an astonishing rate. At the height of a recent Indo-Pakistani standoff triggered by a terror attack in the Kashmir Valley, former BBC journalist Trushar Barot tweeted, ‘I’ve never seen anything like this before – the scale of fake content circulating on one story.’

Facebook has already removed hundreds of false or misleading posts disseminated by the two main political parties in India – the Bharatiya Janata Party and the Indian National Congress – as well as disinformation originating in Pakistan. Beyond the current election, social media has also been the primary vehicle through which extreme Hindu groups have spread false rumours, instigating the lynching deaths of dozens of innocent people – many of them Muslim or lower caste – in recent years.

Similarly, widespread ethnic killings and the displacement of around 700,000 Rohingya Muslims in Myanmar were fueled by fake news spread through social media channels. As a meticulously researched story in The New York Times demonstrated last year, the Myanmar military deliberately used Facebook to fan ethnic unrest by spreading false rumours about Muslims in general and the Rohingya minority in particular. Half a decade in the making, the military’s anti-Rohingya campaign deployed hundreds of personnel to create fake or celebrity accounts and then flood them with incendiary content at key times of high web traffic.

The efforts being undertaken by the CSE to protect Canadians from foreign-inspired disinformation campaigns are essential to the protection of our democracy. But more work and attention are needed to understand the full range and potential of foreign and domestic information manipulation in the digital age. As artificial intelligence algorithms continue to improve at a stunning pace, disinformation will remain a part of our media and social media landscape for the foreseeable future, and may increase in both scope and sophistication. As such, strengthening the public’s resilience in resisting and countering fake news will be key to preserving democracy in the 21st century.

Featured Image: Smartphone user. Via Pexels.com. 

Disclaimer: Any views or opinions expressed in articles are solely those of the authors and do not necessarily represent the views of the NATO Association of Canada.

Author

  • Joseph McQuade

    Joseph McQuade is Editor-in-Chief at the NATO Association of Canada, where he runs the Centre for Disinformation Studies program. He is also the RCL Postdoctoral Fellow in the University of Toronto’s Asian Institute at the Munk School of Global Affairs and Public Policy and a former SSHRC Postdoctoral Fellowship recipient, as well as a Managing Editor of the Journal of Indian Ocean World Studies. Dr. McQuade completed his Ph.D. at the University of Cambridge as a Gates Scholar, with a dissertation that examined the history of counter-terrorism laws in colonial India from an international perspective. This research forms the basis of his first book, A Genealogy of Terrorism: Colonial Law and the Origins of an Idea (forthcoming from Cambridge University Press in November 2020). Dr. McQuade has also published widely in academic journals such as the Journal of Imperial and Commonwealth History, the Journal of World History, and History Compass, and has commented on current affairs for national news broadcasters such as CBC and CTV. Information on his forthcoming book can be found below: https://www.cambridge.org/core/books/genealogy-of-terrorism/BC74FB48CED9CBA2F2DC8B9C580FE714
