Sometime in 2016, reports circulated on social media about three highly trained teams that had slipped into the Donetsk People’s Republic, in eastern Ukraine. While the teams had paperwork linking them to the local health ministry, they were, in fact, commandos with the Canadian Armed Forces, tasked with infiltrating one of the world’s most active military conflicts. The republic had declared its independence from Ukraine in 2014, largely with the help of Russia, which provided the separatists with funds and guns. Thousands, including scores of civilians, had died in the fighting between rebels and Ukrainian forces.

The tactical units, composed of some twenty special-forces operators, had been sent to raid separatist positions, sabotage the republic’s infrastructure, and eliminate its eastern checkpoints. It was a surgical military effort to incapacitate the breakaway region.

Something went wrong. One unit tripped a mine, alerting the Donetsk troops to its presence. Snipers took aim at the Canadians. Another unit was detected around the same time, and it was hit by heavy machine-gun fire. The third retreated soon after. In all, eleven special-forces soldiers were killed in the raid, making it one of the Canadian military’s deadliest incidents since its withdrawal from Afghanistan.

Trouble is, hardly a word of it is true. Yes, Canada has dispatched troops to Ukraine as part of a joint initiative with the United States and United Kingdom to train Ukrainian security services after Moscow annexed Crimea, a large peninsula in the country’s south. And yes, thousands have died in the ongoing conflict between Russia and Ukraine since 2014. But no Canadian personnel have seen combat. In the two years since it first surfaced, the Donetsk story has travelled throughout the internet—an English translation was shared over 3,000 times on Facebook alone. A similar story blew up on pro-Russia websites this past May. The new iteration, which spread even more widely, suggested that three Canadian soldiers were killed after their car hit a land mine while they were being escorted by the Ukrainian military.

These stories are, in the purest sense, fake news: fabricated reportage made to look credible. Specifically, they were lies that advanced many of Russia’s preferred narratives with regard to the North Atlantic Treaty Organization, the military alliance of European and North American democracies founded after the Second World War. The Kremlin, for example, wants to paint NATO as a dangerous and oppressive aggressor, an occupying force locals should be afraid of. It’s a narrative designed, in part, to weaken Ukraine’s will to resist Russia and, by sparking tensions with host countries, discourage Canada and others from continuing their foreign deployments so close to Russia’s border. If the Donetsk story had worked, and Canadians and Ukrainians had believed it, it would have incited allegations of a cover-up—allegations that NATO was lying about the high cost Canadians were paying to protect Ukrainians.

But while the hoaxes fizzled—CTV News published comments from the Department of National Defence and the Canadian Armed Forces debunking the original Donetsk story the day after it appeared online—there are no assurances that the next piece of Russian disinformation won’t go viral. Moscow and the vast network of cybertrolls and hackers who operate at varying degrees of closeness to the state are armed with an arsenal of tactics to boost their version of reality and sow discord around the world. They’re using search-engine-optimization strategies to circulate conspiracy theories, some of them complete nonsense (such as that the US is readying for all-out nuclear war against Russia or that Western firms are using microchip implants to make employees docile). They’re setting up sophisticated social-media campaigns to make their ideas appear popular. They’re spreading the same information across a wide array of sites, seemingly from across the political spectrum, to create the illusion of consensus. They’re setting up fake think tanks to make it seem as though governments and media are on the wrong side of academic research. It is a campaign, and it is well organized. A 2017 summary of the Kremlin disinformation threat by the Canadian Security Intelligence Service sets up the conflict starkly: “There are no front lines—the war is total—and there is no neutrality.”

In the US, cybertrolls tied to Russian disinformation operations carried out an aggressive effort aimed at putting Donald Trump in the White House. In Western Europe, similar troll accounts have promoted antiestablishment parties regardless of whether they’re pro-Moscow: they’ve boosted Spain’s Catalan independence movement and built up Brexit chatter in the UK. In the Baltics, Russian-language, Kremlin-backed media have attempted to turn local populations against NATO. Wherever it can, Russia has tried to pull countries into its sphere of influence. President Vladimir Putin, a KGB veteran, has shown particular skill not only at leveraging Russia’s various intelligence agencies but also at overseeing an arm’s-length state run by oligarchs close to his regime. There are many layers to his work. And that work may be finding a receptive audience. According to a 2015 Pew Research survey, the populations of many NATO countries are divided about using military force against Russia to defend a NATO ally.

European countries including France, Germany, and Sweden have taken direct action to fight fake news. Governments are increasingly supporting, and sometimes subsidizing, legitimate news media while simultaneously drafting tougher regulations to combat hate speech and distortion online. The US and the UK, meanwhile, have tried to counter Putin’s attempts at plausible deniability by taking the unique step of releasing actual intelligence reports describing Russian interference and providing consistent updates on cyberattacks and espionage.

In that context, Canada comes across as surprisingly unconcerned. After all the tough talk from Ottawa about ensuring radical transparency on political advertising and auditing Facebook’s handling of its users’ personal data in the wake of the Cambridge Analytica scandal—in which a company amassed information on tens of millions of people for the purpose of targeting ads to potential voters—the government has introduced no meaningful regulations for social-media companies. Maybe the most noteworthy effort to fight fake news in Canada is Facebook’s decision to partner with media organizations to help it fact-check suspicious content on its platform. It’s a positive step, given Facebook’s status as the favoured disinformation tool, but one that, on its own, could prove inadequate against the sheer volume of propaganda that agents are pumping out. According to statements made to US Congress by the social-media company in 2017, Russian-linked posts reached 126 million users in the US over a two-year period spanning the 2016 presidential election. Faced with that output, the pick-and-choose strategy of news verification is like using one traffic cop to stop accidents on a highway where the speed limit is 200 kilometres per hour.

There are reasons for Canada to be complacent. The country’s political climate is muted compared to the tumult in America and Europe. Our russophone population is small compared to Eastern Europe’s, meaning Moscow would have limited success piping Russian-language propaganda into the country. But we’re mistaken if we think Russia doesn’t have an active interest here. Canada is an Arctic power, a founding member of NATO, a close ally of Ukraine, and the next-door neighbour of the US. Perhaps most importantly, Canada has participated in sanctions against Russia that have cost it billions of rubles. Last June, the Communications Security Establishment—a Canadian spy agency that dates back to the start of the Cold War—published a report on the high likelihood of cyberattacks in the lead-up to the 2019 election. “Cyber threat activity against the democratic process is increasing around the world,” the report warns. “And Canada is not immune.”

One play from Russia here could be to help elect a kind of Manchurian candidate—a sympathetic partner in the vein of Donald Trump or former French presidential candidate Marine Le Pen. A pro-Putin prime minister might pull Ottawa back from NATO, maybe open the door to Russian military expansion in the high Arctic. But once you understand how the Kremlin has operated the world over to advance its political interests, you realize that it doesn’t necessarily need a friend at 24 Sussex Drive. Russian agents can plant fake stories, leak damaging information on candidates, and play up artificial crises or ideological divides without an expansive political apparatus. We know that Moscow has used its embassy to push stories to the Canadian media with the aim of weakening our government. We also know that Moscow has tried to distort our social-media landscape to obfuscate its complicity in war crimes. With the next election less than a year away, the worry isn’t that Canada has barely begun to guard against the threat. The worry is that it may already be too late.

A white, cube-like building behind high walls of the same colour, just across the river from downtown Riga, Latvia, houses a major front in the fight against Russian disinformation. The digs are plain and can make it hard to figure out who, exactly, works here. There are a few military insignias around, and while someone occasionally emerges from a door in army garb, most people walking through the halls wear civilian clothes.

This is the home of the NATO Strategic Communications Centre of Excellence, a group launched in 2014 to look into everything from online neo-Nazi extremism to the Islamic State militants’ digital network. Putin’s annexation of Crimea that same year solidified Russian disinformation as one of StratCom’s priorities, and today, its staff of researchers and analysts devote a great deal of their resources to studying the online tactics, organizations, and outlets linked to Putin’s efforts to destabilize the former Soviet republics. On the third floor, for instance, a digital-forensics lab studies the effects of robotrolling—the use of social-media bots to spread disinformation and propaganda, a tactic Russia uses to feed anti-NATO narratives to local populations. This spring and summer, StratCom found that bots were responsible for 49 percent of all Russian-language Twitter activity about NATO’s presence in the Baltics and Poland. To put it another way: had a Russian speaker searched Twitter for news about NATO in that region, half of all tweets they saw would have been created by automated accounts. These bots typically operate in sync with, and are controlled by, agencies with ties to Russian intelligence. The most notorious of these troll farms is the Saint Petersburg–based Internet Research Agency, a group indicted by the US Justice Department for its role in the 2016 presidential election that brought Trump to power. Bots have become prevalent on social media for a simple reason: they are one of the most efficient ways to spread lies. They are fast and, when paired with blogs and websites, can generate the perception that an idea, or a story, is ubiquitous—which can create pitfalls for NATO troops.

In 2017, Lithuanians were subjected to a horrifying tale: German soldiers had sexually assaulted a fifteen-year-old girl who had been living in foster care in a small town near a NATO base. The fake story started life as an email sent to Lithuanian politicians and media workers before spreading online. Russian operatives had, it appears, tried to adapt a previously successful disinformation campaign from 2016—today referred to as the “Lisa case”—which dominated headlines in Germany for weeks. It featured a missing thirteen-year-old Russian German girl who Russian TV reported had been raped by migrants. The story was ultimately disproved but not before sparking protests and a diplomatic row between Germany and Russia.

The newer NATO rumour never took hold, though it did play into the Kremlin-promoted stereotype of NATO soldiers as sexual deviants. Canada had its own turn with this stereotype when a whisper campaign began online claiming that the country’s deployment in Latvia that same year was headed by former colonel Russell Williams, the convicted murderer and rapist currently in a maximum-security prison in Quebec.

Military officials from Canada have thus far been able to get ahead of such damaging fabrications and to avoid giving Kremlin-friendly outlets fodder. Moscow is hungry to exploit news of NATO soldiers behaving badly, because spreading those stories has a clear outcome: weakening support for the military deployment. “A single incident with a drunk soldier can destroy your strategic communications,” said Jānis Garisons, the state secretary of the Latvian Ministry of Defence. NATO’s Canada-led battle group in Latvia has its own Facebook page that portrays its troops as friendly allies, playing hockey in civilian clothes and venturing out into the community. But the Canadian military’s success in countering misinformation hasn’t stopped Russia from trying to skew and shift the debate around NATO presence. General Petr Pavel, chair of the NATO Military Committee, told me last year that his team has already seen an uptick in activity from Russian agents since NATO arrived in Latvia. “They increased their intelligence collection, they increased their monitoring of the electronic environment, and we observed a number of attempts to hack mobile phones and the networks,” Pavel said.

The fake story about Williams is an example not just of the narratives Moscow favours but of the digital infrastructure it uses to deploy those narratives. In Riga, I sat down with Mārtiņš Kaprāns, a researcher at the University of Latvia who focuses on Russian disinformation. He explains that the lines of attack and innuendo—which can range from very sophisticated to clumsy and ham-fisted—are based on a central question: “Do they have a potential of virality?” If a particular story, conspiracy theory, or idea catches on, then the operation begins firing on all fronts, from the dizzying network of “weird, marginal” blogs to Russian state-backed news sites and broadcasters like Sputnik and Russia Today, more commonly known as RT. Even Russian foreign minister Sergey Lavrov might weigh in. Suddenly, a news story that begins deep in the Russian Federation takes hold right here, in Canada.

Take the White Helmets, a ragtag group of Syrian medics who operate largely behind rebel lines to treat those injured in the war in Syria, including victims of indiscriminate bombings by President Bashar al-Assad’s military. As international outrage mounted over the Syrian air force’s use of chemical weapons on its own population, Russia—which has unabashedly propped up Assad—used the White Helmets as a convenient scapegoat. The Russian government, media, and even the social-media accounts of its embassies have pushed the false notion that the White Helmets either staged the chemical-weapons attacks or attempted a cover-up for the rebels actually responsible for them. Those claims aren’t backed by credible evidence (a joint investigation by the UN and the Organisation for the Prohibition of Chemical Weapons found the Syrian government responsible for three chlorine-gas attacks between 2014 and 2015). Yet they were repeated by Steve Doocy, co-host of Fox & Friends, on April 12 of this year. Here at home, in July, Conservative immigration critic Michelle Rempel pushed back on Ottawa’s plan to offer refugee status to White Helmets members looking to flee to Canada and questioned whether the security screening was adequate.

A Canadian politician with such an important portfolio should perhaps be more mindful about making comments that feed into damaging conspiracy theories. Yet a big reason concerns about the White Helmets feel normal is that doubts about their true intentions have become pervasive online. And for that, credit is due partly to a Canadian website.

A Google search for White Helmets turns up plenty of websites peddling sketchy research. Among them is Consortium News, which writes about the medics’ purported complicity in rebel-staged chlorine-gas attacks against civilians. The posts on Consortium News look convincing and come complete with links, sources, and citations. One of those links will take you to the Centre for Research on Globalization, which has published articles tying the White Helmets to terrorism. On its face, the Centre for Research on Globalization looks legitimate; it bills itself as “an independent research and media organization based in Montreal.” Yet the website serves as a clearing house for propaganda, bizarre alternative facts, and outright fiction. Visitors are told 9/11 was an inside job and that the US military manipulates the weather to cause earthquakes and hurricanes.

The Centre for Research on Globalization is run by Michel Chossudovsky, a professor emeritus at the University of Ottawa and a frequent guest on Russian state-run media. Chossudovsky’s site has garnered enough profile, reported the Globe and Mail in 2017, to face scrutiny from NATO’s StratCom. StratCom concluded that, by partnering with other websites, the conspiracy site could raise the Google rankings of its stories and “create the illusion of multisource verification.” Along with Consortium News, Chossudovsky’s site links to the Strategic Culture Foundation, a site—registered in Moscow and designed to look like a think tank—rife with conspiracy theories. All three sites frequently cross-post one another’s articles or cite them approvingly as authentic journalism. This troika is notable for another reason: it assisted in one of the most brazen Canadian examples of Russian meddling to date.

To witness Moscow’s disinformation machine in action, you have to study the coordinated effort to undermine one of its harshest critics on the world stage, Chrystia Freeland. When she was sworn in as minister of foreign affairs on January 10, 2017, after a cabinet shuffle, the change in political temperature was noticeable. Freeland replaced Stéphane Dion just as he had been trying to sell the prime minister on a diplomatic reset with Moscow. But while Dion was advancing his case on Russia, parliamentarians were introducing the Magnitsky Act. As drafted, the legislation would permit Canada to impose sanctions on any foreign official embroiled in corruption and human rights abuses. The act, however, was specifically intended for the oligarchs running Putin’s government. Bill Browder, an American-born financier, led the crusade to have the legislation adopted worldwide (the law is named for his former lawyer, Sergei Magnitsky, who was beaten to death in 2009 while in custody after being arrested for uncovering extensive fraud by Russian officials).

Browder lobbied Canadian political parties in the lead-up to the 2015 election but was surprised to learn that Dion, upon entering cabinet, had no plans to follow through on the Liberals’ campaign commitment to adopt the act. Browder told me he believes Dion was trying to “quash it by stealth” and went so far as to call the former minister, who is now ambassador to Germany, a “craven appeaser of Russia.” According to a book by a former policy adviser to Dion, it was in part Dion’s persistence in pushing the government toward re-engaging with Moscow that “irritated” Trudeau and hastened his exit. (Dion could not be reached for comment.)

Dion’s replacement—Freeland—is well known to the Russians. In fact, Freeland has been sanctioned by the Putin regime and barred from even entering Russia. It isn’t just that Freeland has Ukrainian heritage—she is also a former journalist and even worked with Browder to expose corruption in Russian companies. With Freeland in the foreign-minister job, passage of the Magnitsky Act took on new urgency. “Putin was explicit about how much he hated the Magnitsky Act,” Browder said. “Repealing the Magnitsky Act was his single most important foreign-policy priority.”

The day after Freeland was sworn in, Kirill Kalinin, the press secretary for the Russian Federation’s embassy in Ottawa, used the official embassy Twitter account to send me a message. Kalinin and I had always been friendly; he often contacted journalists in Ottawa with tips and suggestions for stories. “Look at this link,” the message began. It was a collection of research about a Second World War–era Ukrainian newspaper called Krakivski Visti.

The point of the research was plain: Freeland’s grandfather Michael Chomiak had been the editor-in-chief of Krakivski Visti. Established in 1940 under Nazi occupation, the newspaper represented Ukraine’s best chance at its own independent press within the confines of an oppressive wartime reality. However, the paper was gradually compelled by its Nazi censors to publish increasingly antisemitic editorials. I was far from sold that a story about Chomiak’s ties to the Germans held any public interest. Freeland hadn’t tried to hide her family’s past—she had helped edit an academic paper on it—and relitigating Second World War sins for political ends felt, at best, unproductive. Even if we had publicized Chomiak’s shameful complicity with the Nazi occupiers, it would have mirrored nothing in Freeland’s own life and career.

After I passed on the story, details from it began popping up across the internet, notably on the Centre for Research on Globalization, Consortium News, and the Strategic Culture Foundation, as well as on an array of blogs, other websites, and social-media accounts. Some of those voices were willing tools of the Russian state, but others were merely downstream recipients of its narratives. The account about Freeland’s grandfather not only exposed the spiderweb of platforms that play host to Russian disinformation but also raised a vexing question about whether a country like Canada can silence its own citizens when they promote the weaponized stories pushed by a foreign power.

As it happened, I wasn’t the only journalist that Kalinin had been speaking to. Two months after Kalinin pitched me, Globe and Mail Ottawa bureau chief Robert Fife asked Freeland at a press conference whether Russia was trying to smear and discredit her by spreading disinformation about her grandfather. Suddenly, Michael Chomiak had gone mainstream.

On March 7, 2017, two days after the Globe published its piece on the Chomiak affair, Telesur—the Latin American broadcaster considered by critics to be a propaganda mouthpiece for the Venezuelan government, which remains an ally of Russia—reported that the original information about Chomiak’s Nazi ties came from amateur researchers with the Communist Party of Canada who had dredged it up from provincial archives in Alberta. In fact, the information had already been published, in the late nineties, by Freeland’s uncle in two separate academic articles. But whatever the origin of the material, and however it wound up in Kalinin’s possession, the intent was obvious. The Russian embassy wanted to hobble Freeland, mere days after her appointment. Of course, the smear campaign didn’t work. On October 4, 2017, the Magnitsky Act passed unanimously through the House of Commons.

Pushing the Chomiak story from the Russian embassy would eventually have consequences. In March 2018, London collected intelligence that identified Moscow as almost certainly responsible for, or complicit in, the attempted assassination of a former Russian spy in Salisbury using a highly toxic nerve agent, which later killed one British citizen. European Union and NATO countries, including the US, carried out the mass expulsion of over 100 Russian diplomats in solidarity with the UK. Freeland announced that Canada would be expelling four embassy officials in Ottawa and Montreal “who have been identified as intelligence officers or individuals who have used their diplomatic status to undermine Canada’s security or interfere in our democracy.” Among them was Kalinin.

Both the public and the media judge Russian tactics according to a success-or-fail metric. The effort to install Trump in the White House was deemed intentional because it was successful. The attempt to kneecap Freeland out of the gate was deemed either nonexistent or marginal because it failed. But information warfare doesn’t make progress simply by notching wins. It’s a relentless, manifold psychological campaign designed to make alternate realities appear legitimate.

As such, Putin is playing a long game, one of destabilization rather than domination. Using Freeland’s family history against her not only forced Canada to confront its sordid details—the Chomiak story appeared in a number of legitimate news outlets, including the Washington Post—but allowed pro-Putin bloggers and social-media groups to bolster their claims that the minister supported far-right Ukrainian groups, accusations that have remained incredibly resilient online, despite having little basis. (One article published by the Strategic Culture Foundation this past September calls Chomiak “the most notorious of the Nazi collaborators who immigrated to Canada” and Freeland “a well-known Russophobe.” It was penned by University of Montreal professor Michael Jabara Carley, who has also appeared in interviews on Sputnik News and RT.) The ongoing afterlife of the Freeland attack, in other words, showcases a skill at which Moscow has proved itself increasingly adept: stoking controversy.

Moscow has learned it can often find more success mixing its lies in with ample amounts of truth, selectively told, to trick the already susceptible. When the Internet Research Agency launched its efforts to sway the conversation in the 2016 US election, it microtargeted Facebook ads, using them to appeal to social-media users’ patriotism, pride, frustration, race, nationality, and sexuality. In their efforts to play up divisions in the country, and feed misinformation to a captive audience, agents showed a surprisingly nuanced grasp of US politics. But their cultural competencies do not end at the American border. Some of the ads they deployed during the US election also contain Canadian content that reveals a certain familiarity with our political landscape. According to examples obtained by the Canadian Press, Russian trolls stoked the divisive debate around the Keystone XL pipeline and retweeted messages targeting Trudeau’s policies on refugees and his support for Canadian Muslims. There was no specific policy outcome in mind—Russia doesn’t seem to care if pipelines get built or if refugees arrive—but the intent was to inflame tensions.

Fenwick McKelvey, a professor in the Department of Communication Studies at Concordia University (and also a member of this magazine’s educational review committee), has spent the last several years researching how social-media manipulation can alter debates online. He also cowrote one of the first papers tracking the influence of bots on Canadian politics. McKelvey believes there are plenty of domestic pressure points Russian bots could exploit. “You’ve got language, Indigenous issues,” he says. It’s not a stretch to think such emotionally charged debates could be weaponized, McKelvey adds, because it’s exactly what Russian agents did in the 2016 US campaign. “The Internet Research Agency was looking for wedge issues. They were looking for ways of polarizing.”

Russia would have a lot to work with. Start with Quebec MP Maxime Bernier and the new political venture he is leading, the People’s Party of Canada. Polls show his upstart federal party, founded largely on the back of his harsh talk on immigration, is hovering around 15 percent support. If agents began pushing border-crossing fears, they could boost Bernier’s popularity and upend the national dialogue. If cybertrolls amplified messages advocating for Quebec to secede from the country—or, for that matter, Toronto from Ontario—could it lead to real-world support or real-world blowback? On Indigenous issues, agents could do what they did in the US with Black Lives Matter—pump money into media appealing to the legitimate frustrations of Indigenous peoples, then turn around and frighten white Canadians with racist fears over “angry Natives.” If a network of left-wing blogs that serves as a platform for Kremlin propaganda starts peddling a conspiracy theory that Conservative leader Andrew Scheer fudged his taxes—as trolls tried to do to Emmanuel Macron before he became French president—how many people would take the bait?

McKelvey says Canada is vulnerable to this kind of exploitation, if it isn’t happening already. “This is part of Russia’s effectiveness. They’re not developing a counternarrative. They’re undermining the existing narrative,” he says. “It’s about spreading confusion.”

When I sat down with Karina Gould, Canada’s minister of democratic institutions, earlier this year, she was very blunt: the government’s priority is to defend its computer systems from outside threats. Gould added that the government needs to ensure the internet itself doesn’t get leveraged by foreign actors to influence domestic politics. France, Sweden, Germany, and the UK are just a few of the European countries that claim Russia has meddled in their democratic processes. (US Democrats on the Senate foreign-relations committee say at least nineteen countries worldwide have experienced such interference.) The attempt to take down Macron during France’s 2017 presidential election is a prime example. Anticipating an attack from Russia, campaign staff prepared a honeytrap—in this case, a trove of innocuous emails laced with fake data. When Russian-linked hackers broke into Macron’s email—as expected—they stole the bait documents and published them along with fabrications of their own. The media refrained from releasing the material, and the effort largely backfired.

The reported meddling has spurred Ottawa to set aside millions of dollars to beef up government systems and hire security teams to detect and thwart attacks. As part of this strategy, the Communications Security Establishment, for the first time in the spy agency’s history, is set to offer its assistance to political parties to help secure their emails and social-media accounts and to safeguard their websites. CSE will also be working with Elections Canada to strengthen the agency’s defences, although Canada’s paper-and-pencil voting system makes any serious cyberinterference difficult.

According to the CSE’s 2017 threat-assessment report, the likelihood of any foreign state taking aim at the Canadian election will depend on the geopolitical landscape and, in a telling line, “on the spectrum of policies espoused by Canadian federal candidates in 2019.” In other words: if the Russian government sees something it likes, it may amplify that message. If it sees something it doesn’t like, it may sabotage the associated candidate. But there are limits to what even the CSE can do. It can offer its assistance to political parties, but the parties are under no obligation to accept the help. “Are any political parties getting their act together?” McKelvey asks, referring to their cybersecurity preparedness. There’s little evidence to suggest they are, he says, which could make them low-hanging fruit for foreign agents. “I had hoped there would be some disclosure that Canadian political parties are taking this seriously.” Elections Canada seems, at least to a degree, alert to the problem. The agency is reportedly planning to buy what it calls a social-media “listening” tool to monitor threats that could have an impact on the 2019 federal election.

As part of her effort to counter foreign influence, Gould introduced Bill C-76, also known as the Elections Modernization Act, which bans foreign entities from spending money or publishing false claims to sway voters and strengthens existing bans on the use of foreign money in election campaigns and advertising. The bill will, she hopes, discourage Russia, or whichever power, from posting polarizing ads akin to those deployed during the 2016 US election. The bill also codifies election-related hacking as a specific crime.

The Elections Modernization Act has been generally lauded. But, as drafted, it lacks a key mechanism to prevent the transfer of foreign money from one third-party group to another in order to obscure the funds’ origins—a weakness highlighted by the chief electoral officer, Stéphane Perrault, who oversees Canada’s elections. In effect, it would be possible for Russia, or any other foreign government, to funnel money through political groups that conduct advertising, and it would be nearly impossible to tell. The legislation is still being reviewed by Parliament, though Perrault underlined in April that “time is quickly running out” to address the issue before the next election.

It’s true that Canada’s election-finance laws, with their strict spending caps and low donation limits, will make it hard for any one entity to tilt the scales. But such concerns are dwarfed by Russia’s ability to bend the zeitgeist to its needs. By exploiting the algorithms that run services like Facebook and Google, Russia can extend its influence far beyond just blackmailing one politician or infiltrating one government office. It can fundamentally alter a country’s political dialogue. So the question of whether Russia can successfully rig an election is, really, beside the point. The real worry is how an unfriendly foreign government can distort or skew our society. The internet acts like a house of mirrors, in which the Kremlin can shift and move the reflections to distort and expand its ideas, themes, and narratives. It can drop a falsehood—say, that Canada is funding neo-Nazis in Ukraine—onto one site. Then another website picks it up. And another. Soon, a network of Twitter and Facebook accounts and pages share the lie. It stretches out, taking on a life of its own.

Even the most independent-minded researcher can be forgiven for being deceived. After all, we’ve placed an enormous amount of trust in the news feeds that have become lenses for how we view the world. We trust that those streams of information are accurate or representative to some degree—and, even if we don’t, there’s little we can do, given that the source code that generates these feeds is protected by the companies that own them. At any moment, you might notice a change; perhaps you’ve begun to see more stories about a particular political candidate or a certain scandal affecting the foreign minister. Is that uptick in chatter indicative of a real-world trend, a real societal need to pay attention to the issue, or is it a distortion created by accident or with malice? How can you know for sure? The subterfuge will only get more sinister. When I visited StratCom, researchers were studying the advent of digitally manipulated audio and video—technology that could make public figures appear to say things they’ve never said in real life. Imagine a video emerges of Justin Trudeau performing a Nazi salute. It’s fake, of course, but how many people could be convinced otherwise?

So how do we properly defend ourselves? Try to debunk a popular lie and you risk entrenching its believers. Try to ban a website and you run afoul of free speech. The point is, there isn’t a simple solution. When I travelled through Latvia and Ukraine, two countries with a long history of Russian meddling, earlier this year, nobody claimed to have this figured out. Nobody held up an example of a population immune to Russian interference. Kiev has tried banning sites including RT and Sputnik but has seen little return for the effort. Officials there have tried to provide media-literacy education for citizens and to shore up cybersecurity. There have been attempts by various organizations in Ukraine to fight fake news with real news, reminiscent of Radio Free Europe’s work during the Cold War, and ongoing efforts to fact-check Russian disinformation. But the disinformation persists.

Canada has been lucky. But that luck may well run out. Critical thinking, digital vigilance, healthy skepticism—they can take us so far. Hopefully, it will be far enough.

Justin Ling
Justin Ling is an investigative journalist and host of the podcast Uncover: The Village.
Paul Kim
Paul Kim is the design director at The Walrus.