The Future

1%
The proportion of Reddit communities responsible for 74 percent of conflict on the site

Isolating trolls works, and we should get better at it

The front page of the internet may also be the front lines in the ongoing moderation wars

It starts when one person sees something outside their bubble: a redditor who frequents one community, or subreddit, happens upon something on a different subreddit that conflicts with their outlook. Whether they are mad about feminism or 9/11, that person posts a link to the offending content in their home subreddit. Sometimes the instructions are explicit, and sometimes they're implied, but the course of action is the same: go harass people in the enemy subreddit.

This practice is known as “cross-linking,” and it happens constantly. The problem is that when people assigned to moderate the angry, antagonizing subreddit don’t remove these links, users will go en masse to the linked subreddit. The results can be horrifying: hundreds of comments including slurs, threats of rape and other violence, or encouragement of self-harm or suicide.

“Just as I suspected, You block/mute/ban people for no god damned reason; And you assume stuff about people who you have only known for a few hours,” one now-deleted comment read in a cross-link-initiated attack. “Your a R-----, Plain and Simple. Fuck you, And your Re-----ness.”

The way subreddits end up harassing each other speaks to clashes on the wider internet. When fringe opinions are siloed in their own insular communities on a platform, things are more peaceful than when all the groups are thrown together in one giant melting pot like, say, Twitter or Facebook. It's well documented that things get messy where these groups intersect, but there is more to it than that: according to a paper self-published last month by four researchers from Stanford University, just 1 percent of subreddits are responsible for 74 percent of the site's conflict.

The moderators of an antagonizing subreddit can, and sometimes do, remove cross-links. But nothing brings people together like a common enemy.

A deleted post from /r/Conspiracy, which once contained a cross-link to /r/Documentaries.

Preventing this meta-conflict between subreddits requires seeing a much bigger picture of community management: the risk of leaving cross-links up is that Reddit itself might decide the subreddit causes too much trouble and shut it down entirely. (Protecting the people the community disagrees with is more of an incidental consequence.) Reddit's content policy forbids harassment, threats, doxxing, and promotion of violence, but Reddit's own staff is rarely quick to act.

Basically, the safety of Reddit users rests on whether the moderators of that 1 percent of subreddits remove links. When they don't, far more work falls on the moderators of the victimized subreddit.

One frequent target of cross-linking is /r/AgainstMensRights, a subreddit that combats the ideas of /r/MensRights. According to moderator /u/feministathrowaway, the subreddit is subject to thousands of cross-links each week. Entire subreddits, such as /r/AMRsucks, are dedicated almost entirely to cross-linking and harassing it. For moderators, it's not clear exactly what they need to do to keep the subreddit safe.

Users come from /r/MensRights under the guise of "debate," AMR moderator /u/feministathrowaway said. "99.9 percent of the time, I've shut it down because they started... telling me stuff like they were glad I was raped."

The communities most prone to initiating conflict were what the Stanford researchers called "controversial topic" subreddits, which often deal with politics, social justice issues, and conspiracies, such as /r/Conspiracy, /r/TheRedPill, /r/MensRights, and /r/The_Donald.

Some subreddits, like /r/The_Donald, explicitly ban cross-linking, but their users still initiate cross-links constantly. The responsibility of enforcing the rules ultimately falls on a few dozen moderators, and it's not evident that they see cross-linking to other subreddits as a big problem.

The rules for /r/The_Donald include clauses about cross-linking and harassment.

Some mods appear not to take the meta-politics of moderating very seriously. User /u/theothermod, who helps moderate /r/MensRights (cited dozens of times in the researchers' dataset as the source of cross-linking incidents), told The Outline his job is "mostly not too difficult… Apart from weeding out the occasional troll, it mostly runs itself as long as we keep enforcing the rules."

Moderators of targeted subreddits described the moderation process as much more complicated. User /u/cojoco, who helps moderate /r/Documentaries, said in a Reddit message that they aim for a "middle-of-the-road" approach to moderation, but that it's difficult to balance free expression with the safety of users.

“The ethos of /r/Documentaries is ‘free speech,’ but the subreddit is too large to let it descend into trolling and personal abuse,” /u/cojoco said. “If trolls can keep a civil tongue in their heads, then they can present controversial ideas. Reddit is a good place for discussing controversial ideas, and trolls are good at presenting them. I think conflict is good, it allows people to be passionate and engage with the content.”

For subreddits run by and dedicated to historically marginalized users, such as women or people of color, it's not that easy. A recent (since-deleted) post on /r/DACA, a community that aims to be a safe space for DACA recipients and immigrants, expressed frustration that the subreddit is constantly barraged by trolls. While some users fear the subreddit becoming an "echo chamber," others are frustrated by a moderation strategy that lets trolls slip through the cracks.

“Trolls get banned as soon as we spot them,” /r/DACA moderator /u/OmenOfHope responded. “What we're not doing, is banning every single person that disagrees with us. We're not stooping to the level of other subreddits that ban users just because they don't agree with every single thing we say or do.”

Moderators do have some assistance from bots and tools designed to make their lives easier. User /u/feministathrowaway said that she uses the Reddit bot Totes Messenger, which notifies her when /r/AgainstMensRights has been cross-linked; AutoModerator is another, similar tool. User /u/cmd-t, who moderates /r/TotesMessenger, said in a Reddit message that the tool is useful to moderators, but noted that they can only take action after a link has been posted to a "trouble-making" subreddit.
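
Totes Messenger's own internals aren't described here, but the general pattern is straightforward: stream Reddit's public feed of new submissions and flag any whose URL points back into a watched subreddit. Below is a minimal sketch of that pattern using the Python library PRAW; the watched subreddit name, the credentials, and the alerting behavior are all placeholder assumptions, not details from the article.

```python
# Minimal sketch of a Totes Messenger-style cross-link notifier.
# Assumptions (not from the article): the PRAW credentials, the
# watched subreddit, and the alerting behavior are placeholders.
import re

import praw

WATCHED = "AgainstMensRights"  # hypothetical subreddit to protect
# Matches links such as https://www.reddit.com/r/AgainstMensRights/...
CROSS_LINK = re.compile(rf"reddit\.com/r/{WATCHED}\b", re.IGNORECASE)

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="crosslink-notifier/0.1 by u/your-bot-account",
)

# Stream every new submission site-wide and flag link posts that
# point into the watched subreddit from somewhere else.
for submission in reddit.subreddit("all").stream.submissions():
    source = submission.subreddit.display_name
    if source.lower() == WATCHED.lower():
        continue  # links from inside the community are fine
    if not submission.is_self and CROSS_LINK.search(submission.url or ""):
        # A real bot would message the mods or post a notification;
        # printing stands in for that here.
        print(f"/r/{source} cross-linked {WATCHED}: "
              f"https://reddit.com{submission.permalink}")
```

Even a perfect notifier of this kind is reactive: it can only fire once the cross-link already exists, which is exactly the limitation /u/cmd-t describes.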

“[Moderators] can lock a thread or lock the entire subreddit, but that's it,” /u/cmd-t said. “To be effective, the mods should possess more sophisticated tools, but there should also be a consistent communication channel with the Reddit admins, who are often unresponsive and offer almost no help to moderators.” According to the Stanford researchers, people should be looking into ways to proactively address harassment before it happens; tools developed for Reddit might be extensible to other parts of the internet.

“[Our machine learning] model could be used in practice to create early warning systems for community moderators to predict potential negative mobilizations as soon as the cross-linking post is created, and help moderators to curb the adverse effects of inter community conflicts,” the researchers wrote. And since harassment takes a palpable human toll on moderators, the appeal of a pre-emptive tool is clear.
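
The researchers' actual model is more elaborate than anything that fits here, but the shape of the task is easy to sketch: treat each new cross-linking post as an input and predict whether it will be followed by a negative mobilization. The toy example below uses scikit-learn; every title, label, and threshold in it is an invented placeholder, not data from the study.

```python
# Toy sketch of an early-warning classifier for cross-link posts.
# The Stanford model is more sophisticated; this only illustrates
# the prediction task with an invented, hand-labeled dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: titles of past cross-link posts,
# labeled 1 if moderators later recorded a brigade, else 0.
titles = [
    "look at what these idiots are saying, go set them straight",
    "interesting documentary thread worth reading",
    "they banned one of ours, don't let them get away with it",
    "crosspost: a thoughtful discussion from another sub",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(titles, labels)

# A moderator-facing bot could score each new cross-link as it is
# created and alert the target subreddit's mods when risk looks high.
new_post = "everyone go tell them how wrong they are"
risk = model.predict_proba([new_post])[0][1]  # probability of class 1
if risk > 0.5:  # threshold is a tunable assumption
    print(f"warning: likely mobilization (risk={risk:.2f})")
```

In a real deployment, the labels would come from moderation history and the features would go well beyond post titles, but the workflow — score the cross-link the moment it's created, then warn the target's moderators — is the one the researchers describe.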

For platforms such as Reddit, deleting evidence of wrongdoing doesn't get to the root of the problem. A preemptive approach to harassment may put up a wall between trolls and victims, but it doesn't address the fact that antagonizing power is concentrated in just 1 percent of the site.

Reddit’s self-proclaimed ethos is to be an “open environment” and a “home to some of the most authentic content anywhere online.” Yet the authenticity of just 1 percent of the site continues to take systematic priority over the safety of the rest of its users. After almost thirteen years of Reddit’s existence, the platform still hasn’t designed a real solution to this problem. What matters online may not be keeping people apart, but keeping a smaller, more volatile subset of them away from everyone else.