In Conversation: Tackling Hate on Social Media

Updated: Jul 5

Opinion by Alexandra Wilson. This piece is part of Digital Hate, her ongoing series exploring the rise of right-wing extremism online.


[Image: A phone sitting on a table, its screen displaying social media apps such as Snapchat and Instagram.]

As part of an ongoing series on right-wing extremism online, KPR Writer Alexandra Wilson and Editor Ben Beiles interviewed Maya Roy, CEO of YWCA Canada. The following is an edited transcript of their discussion about the YWCA’s Block Hate Initiative, a four-year research and knowledge mobilization project funded by Public Safety Canada’s Community Resilience Fund. The project works with a variety of sectors to co-create concrete solutions to online hate speech and hate crime in communities across Canada. Learn more about the project and the YWCA’s mission on the YWCA website [1].


What is the Block Hate Initiative and what does it aim to tackle?


We did a project on tech-facilitated gender violence several years ago, and Block Hate actually grew out of that project as we spoke with women, girls, and gender-diverse people from across the country. What we were hearing, particularly from queer, trans, and BIPOC youth, is that not only were they experiencing online hate, but that online hate is fundamentally an extension of what's happening in real life. Racial epithets and slurs were being directed at them, or their images were being shared without their consent.


We aim to go fairly deep into the actual systems (the social media platforms), and it was explained to us that a lot of the content reviewing is done by human moderators who, at the time, were predominantly Caucasian. So one issue is about hiring more diverse staff. Another issue is that some of this hate speech is really specific to a community or a geography. For example, one of the young people had to spend time explaining to a moderator on a social media platform the racial slur used against people in her Indigenous community in Manitoba. It was also pointed out that slang develops very quickly, especially online right now, and even more so across multiple languages. Our colleagues from Québec mentioned the doxing they experienced when they spoke out about the mosque shooting. They were getting a lot of racial slurs in French, but because the human reviewers were working in English, there was a language barrier. So we thought, OK, how can we work with Facebook to feed the algorithm? And that's how Block Hate started.


What are some of the strategies that the Block Hate Initiative will employ to combat online hate speech?


Right now we're in the data collection phase, working with 20 partners across the country, as well as our service users, to start putting together a list of racial epithets and slurs. We're also working with the Institute for Change Leaders at Ryerson to create a safer space that queer and BIPOC folks can use to share their experiences. And we are working to create a comeback generator: a chatbot on the YWCA Facebook group that could not only refer you to a service in your area if you've experienced a hate crime, but actually give you, you know, some saucy comebacks. We're partnering with a tech startup of all Indigenous computer scientists out of York University to build it. Because for people to be able to stand in their own power is so important, especially now, when our only connection to the world is our phone, our laptop, or our tablet.


How can social media platforms and other online forums be a force for good?


That's a really good question. You know, I'm cautiously optimistic. But I think tech firms need to hold themselves to a higher standard. They're not just platforms; they need to comply with international human rights frameworks. They need to pay taxes. We know that if something is impacting our day-to-day lives, it's going to reduce economic productivity and create mental health issues. We've seen how quickly countries and communities can become destabilized. So there does need to be some pushback, and the only way we can push back is for community members to identify how it's impacting them—gathering the data, doing their own data analysis, and creating algorithms that will actually reflect a much more respectful Internet. Now, I'm not naïve. I know that the minute we do this, either the platform will change or the alt-right will simply move somewhere else. On the flip side, we've also seen the way Black Lives Matter has completely dominated global discussion. So we see the positive side to it as well.


How do we balance the creation of community and defending people's rights in online forums?


Yeah, that's a really, really good question, and it’s a conversation we've been having for over 150 years at the YWCA. James Baldwin has a famous quote about disagreement: “We can disagree and still love each other, unless your disagreement is rooted in my oppression and denial of my humanity and right to exist.” And for me, it's that second piece that's really important. Absolutely, free speech is a protected right and incredibly important, but when BIPOC individuals have to experience hatred on a daily basis, that's not OK. We need parameters, in the same way that you cannot yell fire in a crowded movie theater. However, if you look at how hate speech is defined in the Criminal Code, it's very specific and doesn't necessarily match people's lived experiences. Legislation has impact, and when it suddenly becomes OK to break certain agreements—in this case provincial and territorial human rights codes and legislation—it can be a slippery slope. So I think for us, it's about creating a space for BIPOC people to have those conversations and feel comfortable and empowered to work on solutions that they know will ultimately benefit their own communities. All of our institutions, whether it's the YWCA, Facebook, or the federal government, need to do better, and we need to learn from those conversations.


Alexandra Wilson, Ben Beiles, and the whole KPR team would like to thank Maya Roy for her time. This conversation shed light not only on how online hate manifests in Canada, but also on the amazing work the YWCA is undertaking to combat it. Creating viable solutions that let young people in vulnerable communities push back against hate and vitriol when they see it helps build not only confidence but also resilience, stymying hate in all its forms when it arises.

  1. YWCA Canada. “Addressing Online Hate.” n.d. https://ywcacanada.ca/what-we-do/projects-initiatives/block-hate-building-resilience-against-online-hate-speech/