YouTube Isn’t Over – The Delicate Art of Moderating Communities
The dust has mostly settled since the massive #YouTubeIsOverParty that lit up social media earlier this month. The hashtag and surrounding uproar made news after YouTube increased its transparency with content creators around the demonetization of their videos. That is, videos that had previously earned a share of the revenue from the advertisements running before them lost that revenue when they were automatically flagged as violating YouTube’s Advertiser-Friendly Content Guidelines.
The issue that sparked the recent upset, however, wasn’t that YouTube had demonetized these videos. It was that this automatic detection and demonetization had been happening since 2012, and only now, four years later, were the videos’ creators being informed of the actions taken against their videos.
Content creators worldwide expressed their frustration at the missed revenue and the way the practice was communicated, prompting YouTube to take steps to make amends with the community. At ModSquad, we’re looking at the larger picture: what this incident says about the complexity of community management.
The Secret Sauce of Community Management
The secret is: there is no secret sauce. There’s no one “correct” way to manage or communicate with a community, especially one as diverse and expansive as YouTube’s user base. Designing the right community management strategy takes time and means weighing factors like organizational goals, user demographics, community size, and the platforms being managed. It also requires a readiness to evolve and adapt that strategy as those variables change.
At ModSquad, we work with a broad range of clients, all with different audiences, objectives, and needs. When we begin a new partnership, we dive into the brand and its community – the company’s values and mission, where it’s present digitally, how it and its customers communicate with one another, and more – in order to develop a strategy that will foster a healthy and engaged community. We encourage brands to be as transparent as possible in their communication, but we also understand that certain situations call for varying levels of discretion. Moderation, however, is one area where transparency cannot be overlooked.
Best Practices for Moderating Communities
Moderation keeps communities safe and helps shape the community ethos. With this in mind, YouTube’s demonetization actions are completely understandable, as they need to protect both their paying advertisers and their content creators. However, in many scenarios, moderation is perceived as censorship, especially by the user whose content was moderated. This is where timely, transparent communication helps to mitigate backlash and foster better understanding of the guidelines in place.
While every community is unique, there are a number of practices that apply generally to moderating them. Whether your community comprises 50 users or 50 billion users, keep in mind the following:
Community guidelines should be easy to locate and understand. These are the terms by which your community will abide. While we’ve probably all been guilty at some point of scrolling to the bottom of a user agreement and clicking “I agree” without reading it, it’s important that community rules be readily accessible to all and written in plain, understandable language. If your company supports multiple languages, make sure your community guidelines are localized into each language you support.
Community guidelines should be enforced in a timely manner. Establishing rules means nothing if they aren’t enforced. If inappropriate content isn’t removed or modified promptly, some users will assume that such behavior is tolerated and follow suit, while users who’ve had similar content removed may perceive the inconsistency as favoritism.
Users should be notified promptly if their content is moderated. Prompt and transparent communication around moderation not only reinforces community guidelines, but also helps users understand where they went wrong and how they can improve their behavior in the future. In YouTube’s case, it was the lack of timely notification that drove such ire within the community.
Don’t over-automate; keep humans involved. Given the volume of content YouTube needs to monitor (a recent study found YouTube users upload 400 hours of new video every minute), it’s understandable that some automation was needed. But because its demonetization efforts were based entirely on an algorithm, the nuanced judgment that effective moderation requires was removed. Involving humans with solid judgment and experience is invaluable in keeping the moderation process fair; one way to combine the two is sketched after this list.
Establish a fair appeal process with quick turnaround. Whether algorithms or humans are in charge of moderation, mistakes happen, and users need a way to appeal an action taken on their account. YouTube has done this by allowing content creators to submit demonetized videos for manual review. When developing an appeal process, set clear expectations for when a user will hear back, and give users confidence that their case will be properly considered.
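To make the last three points concrete, here is a minimal sketch of how automated flagging, human review, prompt notification, and an appeal deadline can fit together. Everything in it is illustrative: the thresholds, the ModerationItem structure, and the stand-in notify function are our own assumptions, not a reflection of YouTube’s actual systems.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Illustrative thresholds; a real system tunes these against labeled data.
AUTO_ACTION_THRESHOLD = 0.95    # confident enough to act automatically
HUMAN_REVIEW_THRESHOLD = 0.60   # borderline: route to a human moderator
APPEAL_SLA = timedelta(days=2)  # promised turnaround for appeals

@dataclass
class ModerationItem:
    content_id: str
    author: str
    score: float                       # violation likelihood from a classifier
    status: str = "pending"            # pending | actioned | cleared | appealed
    deadline: Optional[datetime] = None

def notify(author: str, message: str) -> None:
    """Stand-in for an email/in-app notification; promptness is the point."""
    print(f"[notify {author}] {message}")

def triage(item: ModerationItem, queue: list) -> None:
    """Route each flagged item: auto-action, human review, or clear."""
    if item.score >= AUTO_ACTION_THRESHOLD:
        item.status = "actioned"
        notify(item.author,
               f"Content {item.content_id} was moderated. You may appeal.")
    elif item.score >= HUMAN_REVIEW_THRESHOLD:
        queue.append(item)             # a human makes the judgment call
    else:
        item.status = "cleared"

def appeal(item: ModerationItem, queue: list, now: datetime) -> None:
    """Appeals re-enter the human queue with a promised response date."""
    item.status = "appealed"
    item.deadline = now + APPEAL_SLA
    queue.append(item)
    notify(item.author,
           f"Appeal received for {item.content_id}; "
           f"expect a decision by {item.deadline:%Y-%m-%d}.")

# Example: one clear-cut violation, one borderline case, one appeal.
review_queue: list = []
triage(ModerationItem("vid-001", "alice", score=0.98), review_queue)
triage(ModerationItem("vid-002", "bob", score=0.72), review_queue)
appeal(ModerationItem("vid-001", "alice", score=0.98,
                      status="actioned"), review_queue, datetime.now())
print(f"{len(review_queue)} item(s) awaiting human review")
```

The key design choice is that the algorithm acts alone only when it’s very confident; everything borderline, and every appeal, lands in front of a person with a promised response date and a notification to the user at each step.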
All in all, YouTube’s move toward greater transparency is a good thing for content creators, who can now appeal demonetization actions quickly. For all communities, transparent and prompt communication around moderation helps prevent perceptions of censorship and helps members better understand guidelines and expectations. Not every communication needs to be 100% transparent for a community to thrive, but when it comes to moderation, a human touch keeps the process fair and effective.
Aliza Rosen
Digital Strategist
Tasia Karoutsos
Digital Manager