Special Snowflake Safe Space: The Rise and Fall of Reddit's Most Notorious Political Group
By Maxwell Thompson
Photo by Kon Karampelas on Unsplash
Throughout my limited personal experience with Reddit, it has never struck me as “social media”; its mechanics have always seemed so distinct that it is difficult to place it in the same category as Instagram or Facebook. Reddit found a major spotlight during the 2016 presidential election, when it became home to a community of supporters of presidential candidate Donald Trump, known by their subreddit name, “The Donald.” These supporters were often seen as far more dedicated to the candidate than members of support communities that arose on other networking sites. One explanation for this subreddit-wide intensity is the theory of collective intelligence: the shared intelligence and ideas that emerge from the actions of many individuals. Mob mentality, dangerous in real-world interactions, becomes intensely dangerous in online communities; in summer 2019, r/The_Donald was quarantined. Now, Trump is by no means a non-provocative candidate, and some write off the behavior on r/The_Donald as an example of Trump’s heightened rhetoric. However, this claim fails to take into account the various problems with Reddit’s system. The downfall of Reddit’s Donald Trump community was inevitable, because the voting algorithms and anonymous posting mechanics on Reddit are the perfect breeding ground for a very dangerous type of collective intelligence.
If Reddit is to blame for r/The_Donald and its eventual implosion, this is in part due to the complex processes by which it determines what users do and don't see, processes that can perpetuate false information and forged posts. The site’s front page, r/all, functions like a news site, allowing users to scroll through endless pages of everything from cat videos to creative writing. Reddit describes itself as “a source for what’s new and popular on the web” (Reddit), but this is not entirely true. What any given user sees in their feed is determined by the “upvoting” and “downvoting” of posts, all done by fully anonymous site users. A post’s point total equals its upvotes minus its downvotes (Reddit), and posts appear in descending order of points. Thus, while Reddit’s voting system is designed to gauge popularity, it is not designed to evaluate a source’s accuracy. A fake source with twenty thousand upvotes will still reach an extremely wide audience, regardless of its authenticity. This large-scale, anonymous voting system is what makes Reddit so unique, but it also makes the site volatile.
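The scoring mechanic described above can be captured in a few lines. The sketch below is illustrative only: the names are hypothetical, and Reddit's real “hot” ranking additionally weights post age and fuzzes vote counts, which this deliberately ignores.

```python
# Minimal sketch of the scoring described above: a post's point total is
# upvotes minus downvotes, and the feed is ordered by points, highest first.
# Hypothetical names; Reddit's actual ranking also factors in post age.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    upvotes: int
    downvotes: int

    @property
    def points(self) -> int:
        # Net score: upvotes minus downvotes
        return self.upvotes - self.downvotes

def front_page(posts):
    """Order posts by net points, most points first."""
    return sorted(posts, key=lambda p: p.points, reverse=True)

posts = [
    Post("cat video", 500, 20),              # 480 points
    Post("fake news story", 20000, 1000),    # 19000 points
    Post("creative writing", 300, 5),        # 295 points
]

for p in front_page(posts):
    print(p.title, p.points)
```

Note how nothing in this ordering inspects the content itself: the fabricated story outranks everything else purely on vote totals, which is exactly the vulnerability the paragraph describes.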
This volatility stems in part from Reddit’s longstanding policy of keeping the site as free from outside influence as possible, a policy that also makes its communities more prone to toxicity. To help with some of the problems created by its voting system, Reddit relies on moderators, volunteer users who oversee the site’s various communities. Moderators are given the capacity to remove posts or comments “if they [the moderators] find them objectionable” (Reddit), but this is the only guidance given. Moderators are not required to remove posts containing fake or plagiarized content, and more importantly, which posts they remove is entirely up to their own judgement. Two final options exist if a subreddit and its moderators consistently fail to comply with Reddit’s code of conduct: quarantine and banning. Reddit executives, the only ones with more power over subreddits than moderators, can quarantine a subreddit: users must then click through a warning acknowledging the subreddit’s toxicity before accessing it (Reddit). For even more extreme behavior, the executives can ban the subreddit, permanently removing it from Reddit, though this option is used extremely rarely. Reddit states that “the best way to counteract negative information on the internet is to post correct, positive information next to it” (Reddit), which reflects the executives’ stance of keeping the site’s various communities as organic as possible. It was this lenient stance that led to the rise of r/The_Donald.
r/The_Donald began no differently than other Reddit communities; however, it rapidly became a very problematic part of Reddit’s identity. The subreddit was created in 2015 and grew in numbers as Trump’s campaign progressed; as of fall 2019, it had 750,000 subscribed users. Its size brought it increasing notoriety, and eventually the attention of Reddit’s administration. In 2017, Vox’s Aja Romano wrote an article about a recently banned subreddit, r/incels, noting that the reason for that subreddit’s ban was the posting of “violent content” (Romano). Here, Romano gives an example of Reddit using its code of conduct to remove subreddits with a violent atmosphere. However, Romano also writes about r/The_Donald: “Between its hostility to outsiders, perpetual rule breaking (including sexism and the posting of fake news), perpetual drama, and perpetual hate speech, r/The_Donald has become, statistically speaking, the most unpopular community on Reddit” (Romano). It is worth noting that “unpopular,” in the sense Romano uses it, means the subreddit is discussed in an overwhelmingly negative light on other subreddits and message boards. Romano wrote this in 2017, and this culture persisted until, in mid-June 2019, Reddit quarantined the community. This marked the culmination of Reddit’s interactions with the subreddit, but left one critical question unanswered: what factors had led to the community’s dramatic collapse?
The behavior of r/The_Donald drew a serious punishment from Reddit’s administrators, but the question of what transformed the community from a normal political discussion group into a racist cesspool remains. Some would argue that it was Trump’s rhetoric that pushed the community down this path. With many of the now-president’s campaign promises dedicated to stopping the flow of illegal immigration, combined with the strength of his verbal rhetoric, one can argue that the culture on r/The_Donald was fostered primarily by the candidate himself. However, this theory is flawed simply because of the scale and level of devotion displayed on r/The_Donald. The community continued to gain subscribers even after the implementation of a warning screen telling would-be visitors that the page’s content is toxic. Campaign rhetoric would certainly have created a culture around the campaign, but it is hasty to say that the nature of the subreddit is purely rhetorical. Consider, for instance, that Reddit’s voting system encourages active participation in others’ posts. r/The_Donald users would have been incentivized to actively participate in their subreddit community, taking the power away from the words and turning them into action.
The concept of “collective intelligence” provides a new look into why the behavior on r/The_Donald was both unique and widespread. Collective intelligence, by definition, is the idea that many individuals acting together eventually share a common mindspace, a mental state that influences their actions not just as individuals but as entire groups. On the internet, this is a massive phenomenon due simply to the unbelievable size of the communities. Some who study groupthink on the internet believe this may not necessarily be a bad thing. In Collective Intelligence: Mankind's Emerging World in Cyberspace, French cyber theorist Pierre Lévy argues that the computerization of society will “promote the construction of intelligent communities in which our social and cognitive potential can be mutually developed and enhanced” (Lévy 17). In other words, Lévy believes that because the internet delivers unimaginable amounts of information instantly, online communities will be learning-oriented, centered around the pursuit of knowledge. Lévy is only partially correct: his theory that the internet’s information will create communities where collective intelligence furthers learning and discovery is idealistic at best. His book, published in 1999, could not anticipate the fake news epidemic now sweeping the internet or the unique culture created online in 2019. Collective intelligence, then, did not usher in a new age of enlightenment; rather, it allowed for the perpetuation of an entirely different type of internet culture, far from what Lévy had envisioned.
It is also possible for collective intelligence to change the societal view of what is appropriate, because the type of culture it fosters on the modern internet is far more negative than Lévy could have imagined. Consider for a moment the traits that led to the quarantine of r/The_Donald: the posting of fake news and a seeming lack of concern for concrete evidence as opposed to the community’s own radicalized views. Farhad Manjoo described this exact type of culture in a 2016 article for The New York Times on the nature of modern internet culture. Manjoo opens strongly, stating quite simply that “the internet is distorting our collective grasp on the truth” (Manjoo). The word “collective” is very much intentional: Manjoo is not just writing about the way any given individual perceives internet information, but about the way the internet has shifted how entire communities process information to better suit what they already wanted to hear. Manjoo explains that this dynamic leads to a feedback loop, noting that once we find something that suits our own biases, “then we all share what we found with our like-minded social networks, creating closed-off, shoulder-patting circles online” (Manjoo). Manjoo’s concern about these feedback loops applies to large portions of the internet, but it is especially applicable to r/The_Donald. The reason r/The_Donald fell victim to Reddit’s anti-harassment protocol is that its community as a whole became too problematic for Reddit to handle. As a group, its members learned that the behavior they saw occurring was acceptable, and they continued to behave that way until it crossed the line.
What is more interesting about Manjoo’s argument, however, is that it shifts the blame for the culture cultivated on Donald Trump’s subreddit toward the design that governs Reddit. As previously discussed, Reddit is fundamentally many small communities functioning as one much larger community. Because each subreddit is oriented toward a specific topic, people are able to search for the things that would interest other members of a given subreddit. Moderators, who are asked to remove any posts or links outside the subreddit’s subject area, keep each community extremely saturated with its own topic. Richard Mills provides evidence of the insular nature of subreddits, noting in his article “Pop-up political advocacy communities on reddit.com” that “Moderators of /r/the_donald were also observed to ‘sticky’ rising posts at the top of the subreddit, manually selecting posts that they wanted to appear on /r/all and bringing these to the attention of the community” (Mills). Interestingly, Mills uses this particular piece of evidence to compare two subreddits (r/SandersForPresident and r/The_Donald), but the point holds true for any online political community. Reddit’s moderator system creates an incredibly unique type of echo chamber, one in which the user is not actually selecting the information they receive but, due to the very topical design of subreddits, is being fed information that feeds into their biases. However, subreddit design is not the only piece of Reddit’s algorithm that feeds into collective intelligence.
Reddit’s voting system is another important factor that significantly contributed to the mob mentality on r/The_Donald. In his article, Mills comments on what he observed when looking into the culture of the subreddit. The voting on r/The_Donald, Mills notes, “was much more evenly spread than those of other examined subreddits.” Mills goes on to explain how he believes this occurs: “This,” he notes, “indicates that some users may have been bulk up-voting posts on /r/the_donald — instead of browsing through pages, assessing each post, up-voting good posts and down-voting bad, these users may be quickly up-voting almost everything they see on these pages” (Mills). While Mills makes the point that users on r/The_Donald were bulk upvoting posts, this serves his argument that the subreddit was realistically not that different from r/SandersForPresident. Perhaps even more interesting is the fact that voting is completely anonymous. There is no way to tell which users upvoted which posts, creating an arguably extremely safe environment for people who want to keep their political leanings private. Trump’s ideology is by no means comfortable subject matter to discuss in public, and it is safe to believe that there are groups of people who would feel very uncomfortable acknowledging that they support Trump. This anonymity could be a very large factor in why such a toxic culture formed on r/The_Donald.
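To make Mills’s “evenly spread” observation concrete, one simple way to quantify vote spread is the coefficient of variation of post scores across a subreddit. This is only an illustrative sketch, not Mills’s actual methodology; the function name and the sample score distributions below are invented.

```python
# Illustrative only: quantifying how "evenly spread" votes are across a
# subreddit's posts, in the spirit of Mills's observation. Not Mills's
# actual method; names and numbers are hypothetical.
from statistics import mean, pstdev

def vote_spread(scores):
    """Coefficient of variation of post scores: a lower value means votes
    are spread more evenly across posts, consistent with users bulk
    upvoting everything rather than judging each post individually."""
    avg = mean(scores)
    return pstdev(scores) / avg if avg else 0.0

# Hypothetical score distributions for two communities.
selective_voting = [12000, 3400, 900, 150, 40]  # a few posts dominate
bulk_voting = [5100, 4900, 5050, 4800, 5000]    # nearly uniform scores

print(vote_spread(selective_voting))  # large: votes concentrated on top posts
print(vote_spread(bulk_voting))       # small: votes evenly spread
```

Under this toy measure, a community whose members upvote almost everything they see produces a much flatter score distribution than one whose members vote selectively, which is the pattern Mills reports for r/The_Donald.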
Some would argue that with a group as rabid as r/The_Donald has been shown to be, anonymity would be irrelevant: many of these people would be content to be just as devoted in public as online. However, the principle of collective intelligence holds that people are more likely to act as those around them do. The anonymity and bulk voting on r/The_Donald remove all of the social pressure that would be felt in a public, offline setting, or even on another website. People can feel completely safe surrounding themselves with ideas that may have no truth value but are, to them, comforting. It is this part of Reddit’s design that makes it a prime breeding ground for mob mentality and toxic internet culture.
Reddit’s design creates a system in which its users are contained within echo chambers, and it was this system that led to the toxic behavior and eventual quarantine of the popular Donald Trump subreddit, r/The_Donald. Many pre-internet assumptions about groupthink held that collective intelligence has the potential to make us smarter. Confronting beliefs that do not match our own, however, is incredibly difficult. In an age when a world of information is at our fingertips, it is also far easier to live with views that confirm our own notions of how the world works, and taking the route of simplicity can lead to seeing only views that confirm pre-held notions. Reddit is a uniquely siloed site, with hundreds of small communities, and the size and specificity of these communities make it uniquely easy to fall into bias. r/The_Donald is not the only subreddit that exemplifies this, but the page’s uniquely hateful rhetoric and the way its moderators performed their duties meant that its downfall as a result of mob mentality was inevitable.