
How Do We Address Extremism Online?

Opinion By Alexandra Wilson. This piece is part of Digital Hate, a series exploring the rise of right-wing extremism online.


On January 6th, 2021, a mob of right-wing extremists stormed the US Capitol in an attempt to overturn the 2020 election results [1]. The attack was premeditated: right-wing extremist groups had openly planned it on social media platforms for weeks prior to the riot [2]. Even before rioters began discussing the attack on public forums, however, there were clues as to what was to come. Donald Trump’s rhetoric and policies normalized right-wing extremism, which has been festering online since his election in 2016 [3]. The terrifying reality is that online hate does not just exist on social media websites and other online forums; it reflects and exacerbates hatred in people's daily lives. Maya Roy, the CEO of YWCA Canada, echoed this point, stating that “what we were hearing, particularly from queer, trans, and BIPOC youth, is that not only were they experiencing online hate, but fundamentally that online hate is an extension of what's happening in real life.”


In the aftermath of the siege of the Capitol, major social media companies quickly sought to remove right-wing extremism from their platforms, leading to Trump’s indefinite suspension from Twitter, Facebook, YouTube, and Instagram on January 8th, 2021 [6][7][8]. Additionally, Twitter took down 70,000 accounts associated with the QAnon conspiracy movement, and Facebook removed 600 militarized social movements [9]. The alt-right soon found itself de-platformed from mainstream social media. As a result, far-right extremists began migrating to apps such as Gab and Parler, which purport to protect free speech, making them havens for right-wing extremist perspectives [10]. Parler saw such a dramatic rise in downloads after the crackdown that it became the number one app in the App Store [11]. That rise came to a swift end when Amazon Web Services declared that it would stop hosting Parler, and Google and Apple removed the app from their app stores, making it inaccessible [12].


With these new apps becoming increasingly inaccessible, right-wing extremists migrated again, this time to encrypted platforms, where messages and information are end-to-end encrypted, making it nearly impossible for outside parties to track the activity [14]. All of this has made it increasingly difficult for law enforcement to monitor these groups [15]. Right-wing extremist groups are now attempting to organize and build a following on encrypted messaging apps like Signal and Telegram, where “you have to know exactly what you’re looking for… to join” [16]. These networks “can [additionally] be inundated by messages and internal codes and language… so when people get there, they suddenly become part of this inner circle… where anything goes” [17]. As a result, those who join begin to feel increasingly validated in their views, causing their rhetoric to become more extreme [18]. Eliminating Signal and Telegram from mainstream app stores might seem the obvious solution, but, unlike Gab and Parler, encrypted messaging applications are important tools in certain contexts. For example, journalists and humanitarian workers often use apps like Signal and Telegram when working in high-risk locations where their communications might be monitored [19].


When it comes to a social media “ban,” there are two broad policy options for governments and social media companies. The first is to continue enforcing bans on social media and accept that smaller groups will fester beneath the surface. This solution has significant flaws, in part because social media bans are hard to maintain [20]. Banned users can quickly create new accounts, and new hashtags can fool the artificial intelligence algorithms social media companies use to eliminate potentially hateful content [21]. Daniel Trottier, an Associate Professor in the Department of Media and Communication at Erasmus University Rotterdam, says that users quickly learn to “use other expressions the filters won’t pick it up... causing some of these mechanisms to be a cat and mouse game.” For example, followers of the QAnon conspiracy have begun using coded language, replacing ‘Q’ with the number 17, as it is the 17th letter of the alphabet, or spelling it ‘CueAnon’ rather than ‘QAnon’, in order to evade Twitter’s filters [22]. Thus, while de-platforming has the potential to decrease the reach of these extremist groups, its effectiveness depends heavily on social media companies investing in the efficacy of their own bans [23].


Social media bans, however, may have little capacity to de-radicalize or reduce the risk of violence by current members [24]. As mentioned above, by forcing these right-wing groups onto platforms such as Signal and Telegram, the bans have effectively created an echo chamber in which members will never encounter opposing opinions that might have a modestly moderating influence [25]. Furthermore, certain groups are empowered by bans from mainstream social media platforms such as Facebook, Twitter, and Instagram [26]. Brandon Rigato, a Carleton PhD candidate whose research focuses on digital media and the alt-right, believes that “we might find in the long run that [the social media ban] invigorates them.” Other ‘self-sealing’ groups, such as QAnon, see the bans as evidence of their conspiracy theory and “wear their bans as a badge of honour” [27]. Hiding hateful perspectives from the public eye does not fundamentally challenge their existence and may not reduce the danger they pose.


The second policy option is to revert to the norm and allow these extreme perspectives back onto mainstream platforms, where they can continue in the public eye. There are ways to help mitigate the spread and damage of hate speech online without a ban (some of which we will explore later in this series). However, according to Trottier, simply “leaving [right wing extremists] on highly accessible and visible corners of sites like Facebook… [could result in] normalization… shifting people and coverage even further in their direction.” The second, perhaps more obvious, challenge with maintaining the status quo is that extremist voices will continue to harass people online.


Clearly, both policy options have significant flaws, and governments, social media companies, and experts on right-wing extremism must work together to find better long-term solutions to the spread of right-wing extremism online. Although there is currently no concrete answer to the problem of right-wing extremism and digital hate, both policy options, and more, will continue to be explored throughout this series. While the series was born in response to the January 6th attacks, the discussion does not end here: further pieces will draw on conversations with experts about issues related to right-wing extremism and digital hate. From mainstream media’s role in normalizing and expanding right-wing extremism to the Canadian Liberal government’s proposed regulatory measures, this series will examine what is happening and what is being done to help. The next piece will be a conversation with Maya Roy, the CEO of YWCA Canada, regarding the Block Hate Initiative.


Alexandra Wilson, Ben Beiles, and the whole KPR team would like to thank Daniel Trottier, Maya Roy and Brandon Rigato for their time and expertise.

 
  1. Barbaro, M., & Frenkel, S. “Is More Violence Coming?” The Daily, January 13, 2021, https://www.nytimes.com/2021/01/13/podcasts/the-daily/capitol-attack-social-media-parler-twitter-facebook.html

  2. McEvoy, J. “Capitol Attack was Planned Openly Online for Weeks - Police Still Weren’t Ready,” Forbes, January 7, 2021, https://www.forbes.com/sites/jemimamcevoy/2021/01/07/capitol-attack-was-planned-openly-online-for-weeks-police-still-werent-ready/?sh=6cc7bafc76e2; Timberg, C., Harwell, D., Nakhlawi, R., & Smith, H. “‘Nothing can stop what’s coming’: Far-right forums that fomented Capitol Riots voice glee in aftermath,” The Washington Post, January 7, 2021, https://www.washingtonpost.com/technology/2021/01/07/trump-online-siege/

  3. Barbaro, M., & Frenkel, S., January 13, 2021; Greenblatt, J. “White Supremacism Is a Domestic Terror Threat That Will Outlast Trump,” Time, January 7, 2021, https://time.com/5927685/white-supremacism-threat-outlast-trump/

  4. Barbaro, M., & Frenkel, S., January 13, 2021; Greenblatt, J., January 7, 2021

  5. Greenblatt, J., January 7, 2021

  6. Graham, T. “Social media giants have finally confronted Trump’s lies. But why wait until there was a riot in the Capitol?” The Conversation, January 7, 2021, https://theconversation.com/social-media-giants-have-finally-confronted-trumps-lies-but-why-wait-until-there-was-a-riot-in-the-capitol-152820; Edelman, G. “Big Tech Can’t Ban Its Way Out of This,” Wired, January 16, 2021, https://www.wired.com/story/big-tech-cant-ban-its-way-out-of-this/

  7. Brooks, D. “Now that he’s been banned we can say it: Donald Trump was a genius at Twitter,” The Guardian, January 12, 2021, https://www.theguardian.com/commentisfree/2021/jan/12/banned-donald-trump-genius-twitter

  8. Graham, T., January 7, 2021

  9. Barbaro, M., & Frenkel, S., January 13, 2021

  10. Barbaro, M., & Frenkel, S., January 13, 2021

  11. Barbaro, M., & Frenkel, S., January 13, 2021; Nicas, J., & Alba, D. “Amazon, Apple and Google Cut Off Parler, an App That Drew Trump Supporters,” The New York Times, January 9, 2021, https://www.nytimes.com/2021/01/09/technology/apple-google-parler.html

  12. Perrigo, B. “Big Tech’s Crackdown on Donald Trump and Parler Won’t Fix the Real Problem with Social Media,” Time, January 12, 2021, https://time.com/5928982/deplatforming-trump-parler/; Barbaro, M., & Frenkel, S., January 13, 2021; Fischer, S., & Gold, A. “All the platforms that have banned or restricted Trump so far,” Axios, January 11, 2021, https://www.axios.com/platforms-social-media-ban-restrict-trump-d9e44f3c-8366-4ba9-a8a1-7f3114f920f1.html; Carlson, B. “Why social media platforms banning Trump won’t stop - or even slow down - his cause,” The Conversation, January 14, 2021, https://theconversation.com/why-social-media-platforms-banning-trump-wont-stop-or-even-slow-down-his-cause-152970

  13. Barbaro, M., & Frenkel, S., January 13, 2021

  14. Barbaro, M., & Frenkel, S., January 13, 2021

  15. Barbaro, M., & Frenkel, S., January 13, 2021

  16. Barbaro, M., & Frenkel, S., January 13, 2021; Frenkel, S. “Fringe Groups Splinter Online After Facebook and Twitter Bans,” The New York Times, January 11, 2021, https://www.nytimes.com/2021/01/11/technology/fringe-groups-splinter-online-after-facebook-and-twitter-bans.html

  17. Barbaro, M., & Frenkel, S., January 13, 2021

  18. Barbaro, M., & Frenkel, S., January 13, 2021

  19. Barbaro, M., & Frenkel, S., January 13, 2021

  20. Courty, A. “QAnon believers will likely outlast and outsmart Twitter’s bans,” The Conversation, July 23, 2020, https://theconversation.com/qanon-believers-will-likely-outlast-and-outsmart-twitters-bans-143192

  21. Courty, A., July 23, 2020

  22. Courty, A., July 23, 2020

  23. Perrigo, B., January 12, 2021

  24. Perrigo, B., January 12, 2021

  25. Barbaro, M., & Frenkel, S., January 13, 2021

  26. Barbaro, M., & Frenkel, S., January 13, 2021

  27. Barbaro, M., & Frenkel, S., January 13, 2021


