The Development of Social Media Watchdogs in Canada
Opinion by Daniel Diamond

The advent of social media has ushered in new ways for communication to disseminate across the digital landscape. Throughout the 2010s, websites and apps like Facebook, Instagram, and Twitter streamlined global communications, reaching more people than ever. Unfortunately, within a North American context, social media has also proven to hold the capacity to organize extremist events that threaten the sanctity of democratic institutions.
On January 6th, 2021, right-wing extremists used social media services like Parler, Twitter, and Facebook to organize an insurrection aimed at halting the certification of the 2020 United States presidential election results [1]. The insurrection highlighted these companies' ongoing failure to address previously identified flaws in their content moderation capacities [1]. A year later, social media was weaponized by white extremist groups, anti-lockdown protesters, and People's Party of Canada supporters, who converged on the streets of downtown Ottawa to protest what some deemed "discriminatory" COVID-19 restrictions at the southern border [2]. The movement spread widely across social media spaces, using sites like GoFundMe to raise millions of dollars in support of the convoy [2]. These are only two examples of how extremist protest movements have galvanized our social media spaces, endangering those who live and work near the protest sites.
The ease with which protest movements like these are planned on social media highlights a fundamental flaw in the private sector's ability to adequately monitor its services, as well as inherent weaknesses in the government's capacity to hold social media giants liable for events planned through their services. In 2015, the House of Commons passed Bill C-51, commonly referred to as the 'Anti-terrorism Act,' which expanded government agencies' powers to stop the spread of radical Islamic terrorist activity within domestic borders via services like Twitter, Facebook, Kik, and WhatsApp [3]. Specifically, C-51 allowed law enforcement to take action against any activity that focuses on "the undermining of security and safety of all Canadians," [4] including "any writing, visible representation, or video recording that calls for or promotes the commission of terrorism offenses... or counsels the commission of offenses." [5] C-51 demonstrates that the government can pass legislation to monitor digital spaces in response to growing terrorist concerns, as it was implemented specifically to address the threat that ISIS' use of social media posed to Canadian digital safety.
Additionally, the government has demonstrated a capability to institute watchdogs whose sole intent is to make the internet safer for the Canadians who use it. For example, in 2021, Liberal MPs proposed a watchdog serving under a new Digital Safety Commissioner of Canada to "regulate social media and combat harmful online content in Canada," specifically regarding child sexual exploitation on services like Instagram, TikTok, and Pornhub [6]. Although not specific to monitoring right-wing extremism online, the proposal highlights a political will for stronger guidelines and structures governing the digital realm.
What I propose, which differs from the solutions the government has offered, is a financial penalty framework that acts as a 'stick' for companies that choose not to remove extremist event-planning content or report such communications to the appropriate authorities. A penalty-dealing watchdog is not a new concept: the Institute for Strategic Dialogue proposed the same idea, offering a time window for social media organizations to take action against extremist content, after which non-compliant organizations would face financial penalties [7].
Moreover, the federal government has recognized the root of the issue: private-sector non-compliance with Canada's counterterrorism policies. In a report published in 2022, the House of Commons Standing Committee on Public Safety and National Security agreed that private sector enterprises must face regulation to comply with Canadian counterterrorism policy [8]. This awareness highlights recognition of the role that companies like Meta and Twitter play in the dissemination of culture and knowledge, and frames regulation of social media as a necessity to keep Canadians safe.
In 2018, Alek Minassian drove a rented van through a busy stretch of North York, killing ten people and injuring an additional sixteen. Before the attack, Minassian posted to Facebook calling for the deaths of all "Chads and Staceys"... a reference to his self-identification as an involuntary celibate, or "incel" [9]. In response to the attack, Facebook (now Meta) released a statement that it uses "rigorous algorithms as well as human monitoring" aimed at the removal of content like Minassian's posts [9]. Unregulated, the private sector cannot adequately protect Canadians from terroristic threats that originate online. Both the government and the private sector have a role to play in protecting Canadians, and harmonizing their efforts will only make Canada a safer nation, both digitally and in person.
Bibliography
[1] Timberg, Craig, Elizabeth Dwoskin, and Reed Albergotti. 2021. “Inside Facebook, Jan. 6 violence fueled anger, regret over missed warning signs.” Washington Post. https://www.washingtonpost.com/technology/2021/10/22/jan-6-capitol-riot-facebook/.
[2] Scott, Mark. 2022. “Ottawa truckers' convoy galvanizes far-right worldwide.” Politico. https://www.politico.com/news/2022/02/06/ottawa-truckers-convoy-galvanizes-far-right-worldwide-00006080.
[3] Littlewood, Jez, Sara K. Thompson, and Lorne Dawson. 2020. Terrorism and Counterterrorism in Canada. Toronto: University of Toronto Press.
[4] Foley, Rebecca I. M. 2018. “(Mis)representing Terrorist Threats: Media Framing of Bill C-51.” Media, War & Conflict 11 (2): 204–222.
[5] Platt, Brian. 2017. “Terrorism Bill Aims to Help Prosecutors; National Security Old Legislation Too Vague to Hold up in Courts.” National Post (Toronto).
[6] News Staff. 2021. “New federal watchdog proposed to keep harmful content off social media platforms.” CityNews, July 29, 2021. https://toronto.citynews.ca/2021/07/29/new-federal-watchdog-proposed-to-keep-harharmful-content-off-social-media-platforms.
[7] Hart, Mackenzie, et al. 2021. “An Online Environmental Scan of Right-wing Extremism in Canada.” Institute for Strategic Dialogue. https://www.isdglobal.org/wp-content/uploads/2021/07/ISDs-An-Online-Environmental-Scan-of-Right-wing-Extremism-in-Canada.pdf.
[8] Thompson, Elizabeth. 2022. “Do more to counter violent extremism in Canada, MPs recommend.” CBC, June 30, 2022. https://www.cbc.ca/news/politics/public-safety-violent-extremism-1.6506323.
[9] Swain, Diana. 2018. “What we know about the man charged in the deadly Toronto van attack.” CBC, April 24, 2018. https://www.cbc.ca/news/canada/toronto/toronto-van-attack-driver-profile-alek-minassian-1.4632435.