Opinion articles provide independent perspectives on key community issues, separate from our newsroom reporting.

Opinion

Discord failed to keep Gig Harbor teen safe with its deadly design | Opinion

A popular online messaging platform says it’s going out of its way to protect kids with new teen safety features. For a Gig Harbor teen whose family and law enforcement say was coerced into suicide by another user on the platform, it’s about five years too late.

The company, Discord, has a track record of tragedy when it comes to child safety. Its updated policies don’t go far enough.

Tech companies have a responsibility to ask themselves: what would a person intent on evil do with this product? Failure to take that question seriously allows bad actors to use their products as weapons.

Based on a lawsuit filed by the family of Jay Taylor in Pierce County last month, it’s clear that Discord’s leadership failed to take that question seriously. An established network of users, bent on extorting and coercing kids into creating images of self-harm and sexual content, got its hooks into Taylor, as well as many other children who used Discord.


The facts the Taylor family alleges are unbearable. About six months after creating an account on Discord in 2021, Jay Taylor was contacted by the abusive group. According to the family’s complaint, the group manipulated Taylor into hanging himself while livestreaming from a Safeway parking lot in Gig Harbor within hours of making contact with him. The user who allegedly orchestrated the death, himself a minor at the time, is currently on trial for those and other crimes in Germany.

Taylor, who loved to crochet, had joined the service to connect with other young trans people and share his interest in craft projects, according to the lawsuit. The chasm between what Taylor apparently hoped for and how things ended speaks to a disastrous lack of foresight on the part of Discord’s leadership.

The pandemic saw a huge surge in teen users on the platform. The problem was, the system wasn’t geared toward them. Discord had features like unmonitored livestreaming and voice calls. It sent messages from unknown users directly to teens’ inboxes. And it treated adult-oriented content as the primary threat to teens, rather than sadistic bullies of any age. Those features all helped enable this abuse.

As far as I can tell, the company has taken only partial steps to change course.

Since 2021, Discord says, it has worked to eradicate from its platform the group that targeted Taylor and many other kids, along with others like it. The company also says it works with law enforcement agencies and has improved its controls for removing abusive content and users, and for keeping them from getting back on the service.

But that’s reactive. Discord should have thought ahead about the consequences of giving teens these tools. The galling thing is that tech companies knew how to deal with this kind of behavior. They’ve been building up those capabilities since long before Discord first learned of abusive networks on its systems.


I covered online privacy and cybersecurity for a major tech publication for over five years. I’ve spoken with countless very smart people who studied deceptive online behavior. Many of them researched how to suss out and stop bad actors who manipulate social media for fraud or political purposes.

Those efforts were also often reactive. But by the time Discord saw its surge in teen users, there was a huge field of professionals who understood how to detect and prevent networked, abusive behavior. When vulnerable young people are on your platform, that prevention needs to be in place from day one. No one can say we haven’t learned that yet.

Protecting teens from abusers of any age

In 2024, testimony from Discord co-founder and then-CEO Jason Citron to the US Senate Judiciary Committee acknowledged that the company didn’t require proof of age, and detected lies about age only when they were individually flagged by other users or parents. That’s set to change later this year, with all users being treated like teens unless the platform can confirm a user is an adult. That will put Discord about three years behind Meta’s platforms, Facebook and Instagram. It’s also not enough.

One of the darkest aspects of Discord’s teen safety problem is that other teens are often the threat. Two of the most prominent people charged with leading the network that targeted Taylor and other teens with extortion and abuse were also minors at the time of some or all of their crimes.

I reached out to Discord about teen safety. The company responded with a list of ways it’s addressed harm-oriented groups like the one that targeted Taylor. That’s crucial, and I give them full credit for those efforts.

I followed up to ask my larger questions about its teen safety policies again. Does the company leave livestreams unmonitored by its internal scanning tools? Does redirecting messages from strangers to a separate inbox protect teens from contact by people who want to hurt them? And do the new age-verification policies protect teens from other teen users who are abusive?

The day after this column first published, Discord responded by confirming that live video and voice calls are not monitored by software or human reviewers (the company can’t access the contents of those live communications at all, because they’re locked down by encryption). The company also affirmed it believes its new approach works to protect minors.

Discord is an unconventional and sometimes confusing social media platform that allows unique pathways for strangers to reach teens. All of that might surprise teens and their parents alike.

Like water flowing downhill, monsters will find a path to vulnerable children through the internet if there’s one available. The important thing is to recognize ahead of time that you’re creating one.

Editor’s note: This story has been updated to reflect Discord’s statement one day after it was published.

This story was originally published March 10, 2026 at 5:00 AM.

Laura Hautala
Opinion Contributor, The News Tribune
Laura Hautala is the Opinion Editor at The News Tribune. Contact her at lhautala@thenewstribune.com