Over the past five or so years there has been a real focus on how technology is harming us – it’s certainly a topic that the Zee Feed team thinks about, and has written about, a lot. In the discourse it can be easy to forget that tech is not inherently good or bad; these are tools, and how we deploy what we develop is what ultimately determines the outcomes. A perfect example of this is dating app Bumble’s partnership with a data portal created by a former FBI counterterrorism agent.
Rest assured: this is not a ‘scary tech bad’ story. In fact, it’s proof that we absolutely can use technology to keep us safe, make digital spaces more positive, and potentially even improve human behaviour both online and IRL.
Empowering survivors of abuse
The tech platform in question is Kodex – a secure data exchange portal, created by former agent Matt Donahue, which Bumble uses to communicate with law enforcement agencies around the world. Now, the portal has been opened up to selected organisations working with victims and survivors of physical, sexual or technology-facilitated abuse.
“It’s a secure, credible way for those organizations to make referrals to us about dangerous individuals, so we can make sure they’re not having access to more victims on our platform, or that they’re not misusing our platform for bad acts that they want to engage in,” says Kenya Fairley, Bumble’s Associate Director of Safety Partnerships.
It’s essentially a tip line. In Australia, the first NGO to be plugged in is domestic and gender-based violence network, WESNET.
The security of that line is crucial. “I always joke how the first time I had to fax a subpoena to a telecom provider, I literally thought they were joking. But I really had to fax it to them! The amount of sensitive information sent through unsecured email is comical at best and scary at worst,” Matt says.
“That’s why I started [Kodex] – this isn’t mass surveillance. It’s about wiping out the abuses on these platforms while also protecting the user privacy, giving an avenue to push back on an overly broad [legal] request, for example, and protecting the privacy of those users who aren’t doing anything wrong.”
Giving survivor organisations an active role in making digital platforms safer is a huge step in the right direction. Responding to law enforcement requests is one thing, but abusive, harmful or dangerous behaviour is not always illegal, and attempts to ‘police’ online spaces through legislation alone allow too much horrific-but-technically-legal behaviour to continue.
“Our goal is to reduce the emotional burden of survivors from having to report or engage with another system to go through that process of reporting, potentially retelling their story again,” Kenya explains. With organisations able to tip off Bumble on their behalf, survivors can focus on healing, knowing their identity is not connected to the process.
Nothing like banning my abuser from dating apps to start my Valentine’s Day🥰
— Katt Taylor (@katt_taylaaa) February 14, 2023
Tech-based reform?
It comes back to the concept of tech-as-tools. The attitudes and behaviours that underpin technology-facilitated abuse have always existed in the real world. Harassment, stalking and intimidation would continue even if dating apps had never been invented.
“As corny as it sounds, I joined the FBI because growing up I wanted to catch bad guys,” Matt says. “So much real-world harm is facilitated by everyday companies that never intended to be abused in these ways – bad actors have abused platforms for over a decade.”
So, what to do with a bad actor once they’ve been identified, and warned or removed?
Not only are they notified about why their account has been flagged or banned, but Bumble takes the opportunity to provide educational resources and direct them to relevant support organisations too. “If we’ve taken action on their account for something related to sexual harassment, then we send them information about that. Because ultimately, we want our members to improve their behavior,” Kenya says. “We don’t just want to remove those bad actors as quickly as possible… We also want to then try to educate them about what’s appropriate for healthy, equitable, safe relationships in order to change their behavior moving forward.”
The possibility of reform and rehabilitation is a feature of the way this technology is used. There’s something really hopeful about that. We’re constantly told the negative human biases built into new technology are making the world worse, but what if we made an effort to build in healthy expectations instead?
Reinforcing social norms
Bumble is playing with these ideas. If you send another user a nude selfie, it lands in their inbox completely blurred – the recipient gets to choose whether they want to see it or not before the image is revealed. The tech not only normalises, but actively enforces consent: no dick pics against your will.
“We have a new compliments feature where members can give a compliment of someone’s picture. If we detect that the content of that compliment could potentially be lewd or a bit egregious, then it gives a nudge to say ‘Is this how you want this message to go through? Do you want to revise what you’re saying?’”
Sexy compliments are still allowed to go through; explicit messaging is welcome, so long as it’s consensual. Using tech to prompt users to think twice and consider what respectful behaviour looks like is a way of subtly reinforcing social norms.
It’s the same as calling out a sexist joke, or pushing back on casual racism – a reminder of what we won’t accept, online and off.