In this interview, Tiffany Kwok, Program Editor for Cyber Security and Information Warfare, discusses disinformation and the upcoming Canadian federal election with three experts.
Would you say that the government’s measures to counter disinformation are more reactive or proactive right now?
Mr. Jānis Sārts, Director at NATO Strategic Communications Centre of Excellence:
In order to implement a proactive strategy against the effects of disinformation and malicious information influence operations, governments need to invest heavily in monitoring, analysis, and early warning of processes in the information environment and, most importantly, to constantly build and cement their own narratives, which leave no fertile ground for malicious activities. Awareness of this requirement is improving, but there is still a long road ahead before we can say that the strategies adopted by Western governments are sufficient and proactive.
Dr. Joseph McQuade, Postdoctoral Fellow at the University of Toronto:
So far, governments around the world have really been reactive in dealing with the new challenges posed by digital technologies. We have seen this not only in the case of disinformation and online hate speech but also with things such as cyber-bullying, sexual harassment, and the doxing of public figures and private citizens – especially women, visible minorities, and LGBTQ+ people.
The problem is that we are still using, for the most part, twentieth-century legal frameworks to deal with twenty-first-century challenges that are evolving too fast for legislation or policy to adequately adapt. At least for the foreseeable future, it seems likely that government handling of disinformation will remain reactive. In some ways this is understandable. Alongside combatting disinformation, democratic governments have a duty to uphold the rights of free speech and political expression. Putting forward a sweeping censorship policy without adequately understanding the nature of the problem, for example, would likely only have a chilling effect on free expression, without necessarily getting at the root of the problem.
The other problem with government-led solutions is that each government will have its own priorities and ideological persuasion. You would think that political debate would hinge around interpretations of mutually agreed-upon facts, but that is not always the case. When you have two political parties down south, for example, that can’t agree on the scientifically verifiable existence of climate change, how do you come up with a mutually agreed-upon standard for what even constitutes disinformation? This is why, while government policies regulating hate speech, addressing privacy concerns, and combatting disinformation are vital, they will never be able to provide the whole picture.
Dr. Michael Morden, Research Director at The Samara Centre for Democracy:
Much of what this Government is doing, and what governments across the West are doing, is a reaction to recent events and the 2016 US presidential election in particular. So there's certainly some danger that we're gearing up to fight the last war. But while it's easy to knock governments for failing to be proactive, circumstances are changing quickly and these issues are inherently complex. Even reacting is hard. So while many of us recognize a problem, there's little clarity anywhere about what Governments can do to police disinformation without running up against speech infringements. We also have to look past Governments and recognize the responsibility of platforms, civil society groups, journalists, and citizens themselves to proactively build up societal resilience to these threats.
During his two-day visit to France earlier this year, Prime Minister Justin Trudeau unveiled a plan for a digital charter to tackle misinformation and violent extremism, saying there would be financial consequences for social media giants that refuse to comply. Is this enough to combat hate speech?
Dr. Michael Morden:
We know too little about what the Digital Charter would ultimately look like, should the prime minister have the opportunity to implement it. This Government initially attempted an approach that was very sympathetic, and even somewhat deferential, to social media platforms. They don't seem to have been rewarded for it. Google, for example, is refusing to run Canadian election ads so that it doesn't have to comply with some light regulation that the Government introduced. The Government is talking tougher now, but the mechanics of what's being proposed have not yet been explained.
Mr. Jānis Sārts:
Penalties can motivate media giants to be more efficient in tackling the spread of harmful information online, but they should be one of many instruments which, taken together, leave no choice but to comply.
Is the switch from automatic to manual screening to make sure there aren’t disinformation ads online enough to combat attacks during the upcoming federal election?
Mr. Jānis Sārts:
Experience shows that manual screening, particularly in combination with automatic monitoring, is the most efficient approach. However, implementing it on platforms as large as those we see today seems almost impossible. In any case, this cannot be the only measure taken.
Are there practical ways for everyday citizens to combat disinformation? Often it is those who may not be as tightly integrated into social circles that struggle with differentiating between valid and invalid information.
Dr. Joseph McQuade:
Yes, and in fact everyday citizens are the most important pieces of the puzzle. Fake news only matters when people believe it. Otherwise, it's nothing but words on a page or lines of code.
It’s certainly true that people who are socially isolated can be vulnerable to online disinformation. This is particularly true when it comes to radicalization. People often think of radicalization in terms of Islamist extremism, but what actually lies at the root of radicalization is community. Twenty-first century societies have experienced the erosion of traditional forms of community membership and in some cases these have been replaced with online communities. Now, online communities can be incredibly empowering. They can connect like-minded individuals across continents and create a sense of belonging for people who may not feel they fit within the community values of their small town or their high school or their urban metropolis. But the danger is when extremist groups such as white supremacists, jihadists, or so-called Men’s Rights Activists use disinformation to attract and radicalize these individuals.
At the same time, it’s not only social outcasts who are susceptible to disinformation. Tight social circles can pose their own problems as we become comfortable with information shared by people we agree with and then become more likely to accept something as true if it comes from a trusted source. If you are a Breitbart reader and someone tells you something they heard on Fox News, you’ll likely believe it. Similarly if you are a liberal person and someone passes along a negative story about Donald Trump, you’ll probably be inclined to accept it as true even without verifying it for yourself. This is the danger of echo chambers.
In both cases, the solution is that average people need to be vigilant about the information they consume and – arguably even more importantly – about the information they share or pass along. Often people who become radicalized don’t begin at the most extreme end – they see more mild versions of these stories shared on social media and they disappear down the rabbit hole.
Not everyone has the time to rigorously fact-check everything they read. But I would challenge readers to consider that if they don't have the five minutes it takes to see where an article got its information, maybe they don't need to share that article right now. If people could take a step back and share two articles they have looked at a little more closely, rather than retweeting six articles whose headlines confirm their existing beliefs, we'd already be doing much better.
Dr. Michael Morden:
Citizen resilience is the only evergreen solution to disinformation. With disinformation, the nature of the specific threat is always changing. We also have important rights protections that limit the Government's ability to decide what is and isn't good information. So much does still come down to citizens. There is a series of practices that citizens can adopt to be more discerning consumers of online information, from closely inspecting sources, to managing emotions, to simply waiting before choosing what to share forward. The challenge is that the people most likely to voluntarily adopt these practices are probably those least in need of them. So there is ultimately a government responsibility, shared with civil society, to push out media literacy more energetically than we have in the past. Ultimately, we have to rely on individual citizens to be critical and thoughtful, but getting there can be a societal project.
Mr. Jānis Sārts:
A responsible attitude towards the information people react to or spread would be a good beginning. Of course, critical thinking and the habit of paying attention to the source of information, and to how it arrived on a person's screen, are also very helpful.
Are there ways to differentiate between different kinds of disinformation? If so, are our pre-existing categories sufficient to understand them all, and should there be different ways of dealing with these different types?
Mr. Jānis Sārts:
Over the last couple of years, disinformation has also been carefully studied from an academic standpoint, distinguishing different types according to their nature, goals, and ways of spreading. Logically, not all of it should be approached with the same instruments, as not all of it is identical in its potentially harmful effects.
How has disinformation evolved over time? Do you personally think that we in the 21st century can hypothesize how disinformation might evolve, or do we need to take it as it comes?
Mr. Jānis Sārts:
The evolution of malicious information activities is closely linked to developments in technology and science, as well as to global power shifts. We can try to imagine the tendencies of the next decade, but beyond that it would be pure speculation.
Dr. Joseph McQuade:
Disinformation is ultimately nothing new. As far back as you want to go in history, there are rumours, superstitions, conspiracy theories. During the spread of the Black Death in mid-fourteenth century Europe, some Christians accused Jewish communities of spreading the disease with poison, leading to horrific anti-Semitic pogroms. A few centuries later, the witch panics of the early modern period resulted in tens of thousands of innocent women being burned at the stake or drowned based on false rumours of practicing dark magic.
The social impact and function of disinformation today is remarkably similar, but what has changed is the technology. The internet has become a breeding ground for all kinds of conspiracy theories, from flat-earthers to anti-vaxxers. It also continues to provide a platform for the targeting of vulnerable communities. In the seventeenth century, insecure men targeted witches. Today they target prominent feminists or LGBTQ+ people. In the fourteenth century, people accused Jews of spreading the bubonic plague. Today, Jews and Muslims across North America and Europe are accused of everything from global banking conspiracies to the clandestine spread of shariah law.
From this standpoint, I don’t think it’s hard to predict some of the forms that disinformation may take in the future. Minorities or groups that challenge the status quo will continue to be met with suspicion by those who feel they have something to lose. Similarly, conspiracy theories will be used to mobilize support for political, religious, or ideological goals by those seeking to impose their version of the world on others.
The main problem is that new technologies have amplified the reach of these conspiracies by an order of magnitude. In seventeenth century Germany someone could tell their neighbours that so-and-so was worshipping the devil and that could get someone killed. But today, you can have a single individual or group create a story that can travel to the other side of the world with the click of a button. A fake story about a Muslim doing something sinister can be shared by secularists in France, white nationalists in America, Hindu nationalists in India, and Buddhist extremists in Myanmar, all in a single day – or even a single hour. For better or worse, we are now connected as part of a single global community and that integration will likely only increase. Similarly, with new advances in Artificial Intelligence and Deepfakes, the conspiracy theories are only going to get more sophisticated and more convincing, ultimately making them harder to debunk.
Dr. Michael Morden:
Disinformation is a permanent fixture in politics, and it's important that we not lose sight of that. And while the nature of the problem is dynamic right now because of changing information technology, that is also not new. It seems that every time we adopt a new medium in a mass way, the potential for disinformation increases, at least for a time, and we experience a bit of a moral panic. This happened with radio, for example. What makes the problem distinct right now is the extremely low cost to produce, and potentially to disseminate, information of any kind online. We will have to adjust to that, and when we do we'll probably be confronted by new problems we can't accurately predict from our current vantage point. We have to be comfortable with muddling along. It's a better approach than overestimating our ability to predict the future.
What is one change you would recommend the Canadian government make in order to prepare both the government and Canadian citizens for the upcoming federal election?
Dr. Joseph McQuade:
Invest in nonpartisan research into the social impact of digital technologies, and promote political transparency as adamantly as possible. Let the public see how decisions are being made and hold all political parties to account when it comes to publishing clear, fact-based platforms so that Canadian voters can make an informed decision. A democratic election should be determined by voters who have weighed their options and who understand what they are voting for – not by vague statements, empty promises, or unclear policy proposals.
All of the major political parties need to hold each other accountable to a certain standard of evidence, and it is the job of the media, universities, NGOs, and other third-party institutions to help ensure accountability and transparency.
But, most importantly, it is the job of Canadian voters to look at the facts, understand their options, and make an informed decision.
Mr. Jānis Sārts:
Raise public awareness by engaging with non-government players (media organizations, NGOs, think-tanks). Establish functioning cross-sectoral monitoring mechanisms based on your threat assessment and analysis of processes in the information environment. Ensure all the relevant cooperation networks (cross-sectoral, government-media, government-NGOs, government-external partners) are in place, updated, and functional.
Dr. Michael Morden:
Political parties are a weak spot in our democratic infrastructure, and a likely target of malicious actors. Federal parties are largely exempt from privacy law in Canada, which means they don’t have to meet obligations about how they obtain, handle, or protect our data. This is a huge oversight for lots of reasons, including that it creates a vulnerability at election time. It’s too late now, but I would’ve strongly encouraged the Government to bring parties under some kind of meaningful privacy regulation.
Disclaimer: Any views or opinions expressed in articles are solely those of the authors and do not necessarily represent the views of the NATO Association of Canada.