If you’ve spent some time on the social web, you are probably familiar with the concept of up- and down-voting content. Major sites like Reddit, YouTube, and Digg use this system as a form of community moderation. The idea is that if a piece of content is valuable in any way, a reader will upvote it. If, on the other hand, it is offensive, spammy, or low-quality, it will be downvoted. On some sites, if a post gets enough downvotes, it disappears.
The problem with this system, however, is one of human psychology. Let’s use a fictional post as an example.
Someone posts something about the Tea Party. It happens to be an excellent analysis of the direction of the Tea Party movement: thought-provoking and well researched. People who read it upvote it because it is well written and useful, regardless of their personal politics.
But then, less reasonable people who are vehemently anti-Tea Party come along and start downvoting it without even reading it. These may be "lazy trolls"; after all, clicking a single button is a way to troll with no effort whatsoever. Instead of leaving nasty comments (which takes effort), they can just click and move on to the next victim. The end result is that good content can be downvoted into oblivion. If you've spent any time at all on Reddit, Digg, or other large communities that allow downvoting, you've very likely seen this in practice.
The big problem is that a downvote carries as much weight as an upvote. It's much easier to find trolls than it is to find people who want to promote something, yet the troll is given as much power as a legitimate user, and again, all they have to do is click. A few trolls are all it takes to bury what may have been perfectly good, relevant content.
Trolls can be more than just annoying. There have been valid concerns about content producers using coordinated downvoting attacks to game the system: burying their competitors' content so that their own rises above the fray. There is nothing really stopping this from happening.
Back to the psychology factor: many times, people don't read the full post, or they vote based on the headline alone. This means that if a post starts getting downvoted for any reason (including wrong reasons), a snowball effect takes over, and it will quickly get buried. The only time a post can resist is if it has an extremely well-crafted headline or is something very easy to digest (such as a cute image of a funny kitten doing something silly). If it involves effort (reading the post, returning to the aggregator, and then voting), there's a good chance the post is going to wither into obscurity.
There is no need for downvoting. A moderated community with a solid community management role will be just fine without users being able to cull bad or unpopular content. If a post is unpopular or low quality, it can simply sit at the bottom and be buried naturally by the rise of new, better content. The only time a post needs to be removed is if it's offensive, spam, or otherwise breaks the rules or guidelines of the community. That's what community managers are for.
Users should definitely be empowered to upvote content, but having them downvote content is essentially asking for trouble.