Opinion

I’m Begging You to Stop Stitching Racists, TERFs & Misogynists Just To ‘Prove Them Wrong’


Folks, we need to talk. In the past week, I’ve seen self-described conservative Freya Leach on my TikTok For You Page too many times to count. She’s been making videos about why she’s voting No at the Voice to Parliament referendum, sharing reasons that… you’d expect from a self-described conservative. But they’re not on my feed in their own right. It’s because of the damn stitches. Well-meaning, self-described progressives and Yes voters who are pushing snippets of Leach’s videos into my feed so that they can point out the flaws and covert racism in her arguments. 

The trend is not limited to the Voice referendum. Every day online I’m seeing snippets of outrageous things said by manosphere grifters, tradwives, misogynists, racists and bigots, TERFs, conspiracy theorists, and angry conservative commentators… quickly followed by someone saying how wrong that view is.  

It’s not helpful. You’re only amplifying these harmful ideas, helping them reach more people and build more power. 

Whitney Phillips is a media studies scholar who wrote the definitive report on how online misinformation spreads, called The Oxygen of Amplification. It explains how hate groups, conspiracy theories and Internet trolls successfully manipulated the media between 2016 and 2018. Essentially, by convincing journalists to cover them, these antagonists got exactly what they wanted: a bigger audience to hear their ideas. 

Phillips points out that even when a story was critical of these people, the simple act of covering “makes bad actors more visible and influential than they would have been otherwise”, which is always a net-benefit to them.

The report was written in a pre-TikTok world, but now ‘stitching’ on the app works in the exact same way. Even the most based call-out still benefits the troll or agitator.

For an example of just how effectively oppositional amplification works, let’s look at it from the other side. Julia Mazur had 7000 TikTok followers and a small, growing podcast talking about enjoying single, child-free life at 29. Good for her! Then, far-right commentator Matt Walsh posted one of her videos to his 2.4 million Twitter followers, calling her life stupid and depressing. Julia endured some horrific trolling, but also grew her audience by 400% and was interviewed by international media outlets. Walsh wanted to put her in her place and shut her down… but he ended up helping her grow her platform.

That’s the lesson: your intention does not matter. Whenever you share someone’s content, you help them grow.

I know you know that digital platforms use engagement signals to decide what content to push out. This encourages all users – but especially trolls, bigots and the alt-right – to say increasingly outrageous and hateful things. People get mad and comment, repost, or stitch. It’s Rage Farming 101. They are rage farming you.

Evan Thornburg PhD, aka Evan the Bioethicist, explains it much more eloquently using pick-me grifter Pearl Davis as an example. When creators like Ethan Klein or Destiny have Pearl on their shows to ‘debate’ her problematic views, this is still a form of rage-farming. “They’re using the rage that their audiences have towards someone like Pearl, who already has made their platform through rage-farming, as a way to garner interest and views.”

On top of this, any pressure or backlash that a creator like Pearl faces is easily played off as bullying. “To those of us who are already opposed to Pearl’s views, her discomfort looks like she’s being humbled. But to people who are not sure or are sympathetic to her, it looks like she’s being bullied,” Thornburg explains, in this video worth watching in full. 

Giving an agitator the opportunity to say they’ve been attacked or their free speech has been threatened helps these people to gain followers and media attention. Next thing you know, they are invited on Q&A or Sky News, getting profiled by international media, meeting with politicians and maybe even running for office themselves. Attention builds more attention.

This is before we even take into account the distress caused to other users forced to hear overt racism, transphobia, or misogyny every time you stitch a problematic creator. Many Black people have explained how traumatising it is to see videos of police violence against Black people being shared. It’s no longer the case that you can simply opt out of watching the news to avoid it – the clips, some effectively “snuff films”, pop up everywhere online. Even when shared in a quest for accountability, justice and righteous anger, the exposure still causes harm.

I do not choose to watch manosphere podcasters talk about why 30-year-old men should be allowed to rape 16-year-olds – why are you reposting it for me to see? I don’t want to hear a random woman using racial slurs against her Asian nail technician. Why are you forcing it into my feed, just so you can point out that you don’t agree?

We should not bury our heads in the sand about these beliefs, but amplifying problematic, anti-social ideas does not help. Instead, there are proven strategies we must commit to:

  1. If you are not part of the targeted group, calling people in (or out) is meant to be done in your community, through conversation. That means if a friend makes a racist joke, say something to them. If someone you know posts misinformation online, send them a message. Your attempt to right a wrong does not always require an audience. Disinformation expert Jiore Craig has told us conversations with people you know are more likely to be successful than random, public call-outs.
  2. If you see hate speech on a digital platform, just report it. Even if the content is not taken down, reporting is always a better option than commenting, stitching or reposting because it does not provide the engagement fuel that antagonists thrive on.
  3. Make content that addresses the topic without amplifying the person trying to rage-bait. You can say “Russell Brand has denied the sexual assault allegations – here is why fake allegations are incredibly rare” without making everyone watch a clip of his YouTube rant first.
  4. Use the power of amplification to grow the platform and ideas of people you do agree with. This is an underrated strategy! If we spent as much energy lifting up progressive voices as we do calling out conservative voices, we would have much healthier, more optimistic digital and media landscapes. Guardian Australia has 52,000 YouTube subscribers, while Sky News Australia has 3.56 million… imagine if those numbers were switched.

That is not to say you should never use online engagement to interact with people you don’t agree with. The friendly challenging of views within groups, and adding knowledge to what others have shared, is what makes online platforms such dynamic learning environments. I’ve personally learned so much about culture, politics and philosophy from the people I follow, and I know that Zee Feed has been a source of learning for hundreds of thousands of others.

But we need to think more critically about who gets our attention.


Smart people read more:

Stop Playing Violent Videos of Black People’s Deaths – Slate Magazine

Adults and Teenagers Were Not Supposed To Share Space Online

How journalists should not cover an online conspiracy theory – Guardian US
