If your business operates online or focuses heavily on having an online presence, you’ve probably heard the phrase “Content is king” far too many times by now. Yes, content is important, and it’s also getting increasingly difficult to produce informative, useful and original content as the amount of content online grows rapidly every second. So when user-generated content (UGC) became popular, many saw it as a godsend: platforms could publish content at a much faster rate and receive more views from the public. However, this convenience came at a price.
The dark side of user-generated content
The number of people with access to the Internet is approaching 4 billion, and that number shows no sign of slowing down as our world becomes increasingly reliant on digital products. As platforms such as social networks and media hosting websites reap the many benefits of relying on their users to generate content on their behalf, they will eventually also face the issues that come with a large user base and with granting those users the freedom to upload content of their own.
Social media giant Facebook came under fire in April this year for taking 2 hours to remove a video of a man being murdered from its platform. Unsurprisingly, it was not the first time that the world’s leading social network has failed to act fast enough to moderate harmful content uploaded by irresponsible users.
How do we stop bad UGC from being uploaded?
If your business depends on UGC, the simple answer is: you can’t. You can’t stop users from trying to upload whatever they want onto your platform. What you can do is stop that content from being published under the brand and community that you and your team protect daily. Exemplary companies safeguard their users from unwanted content in a number of ways.
For example, review companies like TripAdvisor and Glassdoor rely on pre-moderation. This means that when a user submits a review about a hotel or a company, it goes through a review process before being published live for others to see. This can take up to a few days depending on the volume of content queued up for review versus the size of the team entrusted with the task.
Another popular method is what we like to call peer-to-peer (P2P) moderation. This is when other users are expected to flag or report content that they find offensive. Then, a smaller group of users with higher levels of access to the respective message board has the administrative rights to remove content that violates the rules. This model is commonly used by forums such as Reddit, and even by publicly editable knowledge bases such as Wikipedia, where users can easily make changes to published content.
Most commonly, companies rely on post-moderation, whereby content is moderated only after the user has uploaded it and it has been published on the platform. This is what the world’s most popular websites, like Facebook, Twitter and YouTube, use. It’s the most efficient way to give users what they want instantaneously while ensuring that each piece of content adheres to the guidelines of the individual platform. However, it is also what has landed Facebook in hot water time and time again, and Facebook is far from the only website to face backlash for allowing certain pieces of content to be published.
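The three approaches above really differ in one decision: when a piece of content becomes visible to other users. Here is a minimal sketch of that decision in Python. All names (`Strategy`, `Submission`, `is_visible`) and the flag threshold are hypothetical illustrations, not any platform’s actual implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Strategy(Enum):
    PRE = auto()   # reviewed before publishing (e.g. review sites)
    P2P = auto()   # live until enough peers flag it for an admin
    POST = auto()  # live immediately, reviewed after the fact

FLAG_THRESHOLD = 3  # hypothetical number of user reports before content is pulled

@dataclass
class Submission:
    text: str
    approved: bool = False  # set by a human reviewer (pre-moderation)
    flags: int = 0          # reports from other users (P2P moderation)

def is_visible(item: Submission, strategy: Strategy) -> bool:
    """Decide whether a submission is shown to other users."""
    if strategy is Strategy.PRE:
        return item.approved                 # hidden until a reviewer approves it
    if strategy is Strategy.P2P:
        return item.flags < FLAG_THRESHOLD   # visible until peers flag it enough
    return True                              # POST: published instantly

# A pre-moderated review sits in the queue until approved:
review = Submission("Great hotel!")
print(is_visible(review, Strategy.PRE))   # False
review.approved = True
print(is_visible(review, Strategy.PRE))   # True

# The same submission under post-moderation goes live immediately:
print(is_visible(review, Strategy.POST))  # True
```

The trade-off each platform makes is visible in the sketch: pre-moderation defaults to hidden (safe but slow), post-moderation defaults to visible (fast but risky), and P2P moderation shifts the cost of detection onto the community.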
How is moderation done?
For all the many ways content is moderated online, they are all tied together by one common element: human intervention. We spoke about this before when we pointed out the many ways computers and artificial intelligence fail to automate tasks such as detecting whether a video is violent or abusive.
So every time you share that image on Instagram or post that article on Medium, there are people tirelessly screening your uploads, making sure that the safety of billions of people is maintained. Of course, it’s hard to imagine just one person going through every single piece of content day in and day out.
This is why Supahands has over 1,300 people to share the responsibility. In fact, we have helped a client review over 300,000 images in a month with just a tiny fraction of our entire workforce, whom we lovingly call our SupaAgents. So imagine what we could achieve together if we mobilised every single one of our SupaAgents. If you’re curious to find out more, we’d love to chat! Simply fill in this form and let us get in touch with you. Speak soon!