Reddit’s millions of users became a content moderation army

Reddit, the self-proclaimed “front page of the internet,” faces one of the biggest problems in social media: deciding what should and shouldn’t appear in its feeds.

Reddit takes a different approach to content moderation than other large social media platforms.

Reddit is not like Facebook, where moderation farms do most of the work. Instead, it relies on its subreddits (or communities) to police themselves, with the company setting and enforcing a core set of values to guide volunteer moderators in their efforts.

 

This model has been criticized by some as laissez-faire and lacking in accountability, but Chris Slowe, Reddit’s CTO, says that is a complete mischaracterization.

“It may seem crazy to say this about the internet, but on average, humans are pretty good. Reddit is a large community of people who are creative, funny and collaborative – all the things that make civilization function,” he told TechRadar Pro.

“Our fundamental approach is to let communities create their own cultures, policies, and philosophical systems. To make this model work, we need to give them the tools and abilities to deal with [antisocial] minorities.”

Different beast

Slowe, Reddit’s first-ever employee, was hired as an engineer in 2005 after renting two rooms to Alexis Ohanian and Steve Huffman, who had met during the first run of the now-famous accelerator program Y Combinator. The experience left Slowe with fond memories, but also a failed startup of his own.

Slowe took a break from Reddit between 2010 and 2015, but his long tenure gives him a unique perspective on how the company has grown and the changes that have occurred over the years.

He says the early years were spent building infrastructure to handle traffic growth, while his second stint, from 2016 to today, has focused mainly on trust, security, and user safety.

“We provide tools that let users report content that breaks the site’s rules or policies. But not everything gets reported – and in some cases, by the time a report comes in, it’s already too late,” he explained.

“When I returned to Reddit in 2016, one of my main jobs was to figure out how Reddit communities function and to define what makes a community healthy. Once we could identify the symptoms of unhealthiness, the real work could begin.”

 

Self-policing

Reddit takes a multi-layered approach to content moderation that differs from other social media platforms and is designed to hew as closely as possible to its “community-first” ethos.

The most basic form of content vetting comes from users themselves, who can upvote posts they like and downvote those they don’t. This process boosts popular posts and buries unpopular ones, though popularity is not always a mark of propriety.
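Reddit open-sourced an early version of its vote-based ranking logic. The sketch below is a simplified Python rendition of that old “hot” formula (the epoch constant and the 45,000-second divisor come from that open-source code), shown purely to illustrate how votes and recency combine – it is not how the modern platform ranks its feeds.

```python
import math
from datetime import datetime, timezone

# Reference time used by Reddit's old open-source "hot" ranking.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot_score(upvotes: int, downvotes: int, posted: datetime) -> float:
    """Rank a post by log-scaled net votes plus a freshness bonus.

    The log term means each extra order of magnitude of votes adds the
    same boost; the time term lets newer posts overtake older ones
    with similar vote counts.
    """
    score = upvotes - downvotes
    order = math.log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)
```

Under this scheme a heavily upvoted post outranks a lukewarm one of the same age, but a day-old post needs roughly a hundred times the net votes to keep pace with a fresh submission – which is why downvoting genuinely suppresses content rather than merely withholding a boost.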

Community mods are the second line of defense, with the power to remove posts or ban users who violate subreddit guidelines or the site-wide content policy. According to Slowe, the most popular subreddit rule is simply “don’t be rude.”

According to the company’s annual transparency report, mods account for approximately two-thirds of all post removals.

The final layer is Reddit’s admins, employed directly by the company to catch harmful content the mods have missed. These staff members are trained to spot-check communities, and also have technological tools to help identify and deal with problem users.

 


When things get tough

Content moderation is a problem no social media giant can claim to have solved, as demonstrated by the controversy surrounding Donald Trump’s accounts and the banning of Parler from app stores. Reddit was drawn into the same debate, eventually deciding to ban the subreddit r/DonaldTrump.

Reddit’s community-first approach is powerful, but it creates real tensions. Although the company strives to give communities full autonomy, it is ultimately forced to make editorial decisions about where to draw the line.

Slowe says he has no desire to act as a capricious arbiter of content. “But at the same time, we must be able to enforce [the rules]. It’s a delicate line to walk.”

Reddit tries to keep its content policy concise, to close loopholes, simplify enforcement, and reduce confusion – but revisions are not uncommon. Then-CEO Ellen Pao banned revenge pornography from Reddit in 2015, and last year the company added a clause prohibiting the glorification of violence.

“Being true and consistent to our values also means iterating on them – reassessing them as we encounter new ways to game the system or push at the edges,” Slowe explained.

“When we make a shift that involves moving communities from one side of the line to the other, it usually comes at the end of a long and tedious process of finding holes in our content policy and working backwards.”

Most people will agree that revenge porn has no place on the platform, and that violence was being incited on r/The_Donald. But both examples show that Reddit must ultimately engage with moderation in much the same way as Facebook, Twitter, or any other platform.

And when the hard questions are asked, Reddit knows the final call rests with the company itself.

 
