Truth Matters: Battling Online Misinformation
July 17, 2018
On June 18th, Library and Archives Canada hosted a live Facebook event called “Misinformation Online”, part of the Facebook Canada Hard Questions Roundtable series. Dr. Guy Berthiaume, Librarian and Archivist of Canada, moderated the event, which brought together representatives from the information, journalism, and political sectors and from Facebook. Fittingly, in the former Reference Room, with Alfred Pellan’s “The Alphabets” and “Knowledge” murals overlooking attendees, the panel of experts sat in front of cameras, ready to broadcast live on Facebook.
The cameras turned on, and Nathaniel Gleicher stood up to introduce himself as Facebook's Director of Cybersecurity Policy and the former Director of Cybersecurity Policy in the Obama Administration. Facebook hired him recently, after the company promised last year to double its security staff, around the same time it launched ad transparency testing and the Canadian Election Integrity Initiative in early November 2017.
The initiative has the very succinct, yet ambitious, goal of preventing what happened in the 2016 U.S. election from being repeated here in Canada – more specifically, preventing politically motivated advertising funded by an outside source (Russia) with the intent to spread misinformation. Very aware of the criticisms aimed at the social media company for not detecting these attacks sooner, Gleicher quickly admitted that, “We [Facebook] were slow to spot the issues with the last presidency. It is our full intent to be vigilant leading up to the 2019 Canadian election.”
Gleicher directed the audience’s attention to a neatly stacked pile of glossy blue magazines on a table near the back of the room. They were labelled Cyber Hygiene Guide – Politicians and Political Parties, from Facebook’s Canadian Election Integrity Initiative. The guide’s purpose is to deliver “best practices for politicians and political parties on how to keep your Facebook pages and Facebook accounts secure.” Additionally, it lists common cyber threat tactics to watch out for, such as spear phishing and the targeting of professional staff.
Gleicher called those behind cyber threats ‘threat actors’, who “use information operations to use the best aspects of the internet against us…They create personas to appear like they are a part of a community that they want to target, and [through a network of fake accounts] make their opinions appear more popular.”
To distinguish fake accounts from real ones, Facebook has “a team of people focused on detecting, disrupting, and taking down malicious information operations.” It also applies a tactic Gleicher called “differential friction,” which makes behaviors exclusive to threat actors more difficult to carry out.
“We don’t want to make being a part of a community more difficult,” Gleicher expressed, “so we are selective when we use differential friction.”
Chris Dornan, an Associate Professor at Carleton University’s School of Journalism and Communication, stated that he was less concerned with security issues than with the divide between opinion and fact, two things that are often difficult to distinguish in such an information-bloated society. He dubbed this phenomenon the “information regime.”
Dornan said the problem we’re seeing today is that “digital platforms have superseded what used to be authoritative sources of information.” The journalism professor went on to describe a time when citizens had access to only one source of news media, one that “was the consensus of social conformity, and outliers were not given any voice unless it was to show what not to do.”
He continued that “the voiceless have now been given a megaphone and now everyone is free to say what they like online. The agreement of consensus has been antiquated, and has created information regimes where extremists flourish.”
Expressing the viewpoint that multiple sources are better than one was Jennifer Anderson, an archivist at Library and Archives Canada and author of the book Propaganda and Persuasion. Anderson began by referencing the Public Service values of excellence, which motivate her in her work as an archivist, along with “the fundamental realization that from knowledge comes power and with that power comes responsibility.”
“Truth matters,” she continued, “and what is happening in an archive is we re-dress the record. We mitigate for bias, we interrogate the source…and we share information with other archives.” Anderson later went on to state that “we also corroborate sources and never use only one source.”
On the topic of misinformation, Anderson says that “of course we [libraries and archives] have been pushing back against Fake News. But there’s a group of people who are louder than all the libraries and the archives and that’s why we are so quick to participate in these types of discussions.”
She was also the first panellist to challenge Facebook to take responsibility for teaching its users how to identify authoritative online sources. “Where is your money going when it comes to encouraging your users to think critically? Do you think that Facebook as an organization should be held responsible for proving [to users] that we can trust it?”
However, Jeremy Kinsman, a former Canadian ambassador to the EU and to the Russian Federation, argued that it’s the users, not Facebook, who should be taking responsibility when it comes to battling online misinformation. “Facebook’s algorithms do seem to reward the angriest and most divisive voices…But my biggest question is, why do people believe this crap? Why are we still buying into it?”
Kinsman believes that it’s because “we’ve been prepped for this by years of ‘info-tainment’, and by distrust of the media, and distrust of the government.” He went on to state that, “Facebook didn’t create this culture, but it is a part of the problem.”
The solution? According to Kinsman, “remedies lie with the public and teaching children critical thinking skills and making sure that fact checking is happening not just reactively…Our job in all this is to save each other from the most destructive tendencies amongst ourselves.”
As for what users can do when they encounter misinformation, offensive content, or spam on Facebook, the Cyber Hygiene Guide recommends reporting it on the platform and, if the activity may be illegal, documenting it with a screenshot. For more information on safeguarding against online security threats, the Cyber Hygiene Guide is available in PDF.