May 19, 2010
How to Moderate Social Engagement
 

Moderation, at its core, is about ensuring that published content on a particular site, typically submitted by the site’s users themselves, meets the site’s terms of service. This function is all too often seen as an analog task: groups of moderators sitting at terminals, clearing content submission queues that ask simple yes/no questions.

But ensuring a safe and fulfilling experience for site visitors takes more than that analog yes/no moderation. Here are nine methods you should be applying as you develop any community. The question isn’t which of these methods to apply; it’s in what ratio to apply each.

Governance


This is the starting point, and all too often the stopping point, when preparing moderation processes: terms of service, community guidelines, and other formal documents meant to define the concept of “appropriate behavior.” By far, my favorite governance example is Flickr’s community guidelines. They’re fun, clear, and shareable.

Engagement


Engagement includes general community management practices, development of culture, encouraging positive and discouraging negative activities, and participation from the company. Engagement occurs, albeit differently, at all three levels of your presence framework.

Processes


Any moderation system is going to include human activities. From standard content-review processes to engaging multiple people in the approval of content, each of these processes can be tweaked. Consider the overall efficiency of your current processes and whether there are points where insights can be captured and shared.

Positioning


Moderation is as much about providing a sense of security and safety as it is about simply deleting inappropriate content. Social experiences have a culture, and when that culture is positive, the overall experience tends to be positive as well. It’s not enough to simply have great moderation processes; you need to show you have them, too.

Algorithmic Tech


Use technology to discover patterns of tone, structure, users, response times, and other such data points, then use those patterns to automatically identify potential problems and/or filter them out before moderators even see them.
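
To make that concrete, here’s a minimal sketch of what such a pre-filter might look like. The signals, weights, and thresholds below are illustrative assumptions, not recommendations of specific values:

```python
# Minimal sketch of an algorithmic pre-filter. The signal functions,
# weights, and thresholds are hypothetical, for illustration only.
import re

LINK_PATTERN = re.compile(r"https?://")
SHOUTING_PATTERN = re.compile(r"[A-Z]{5,}")

def risk_score(text: str, author_post_count: int) -> float:
    """Combine simple tone/structure signals into a 0-1 risk score."""
    score = 0.0
    if len(LINK_PATTERN.findall(text)) > 2:  # link-heavy posts often read as spam
        score += 0.4
    if SHOUTING_PATTERN.search(text):        # sustained all-caps reads as shouting
        score += 0.2
    if author_post_count < 3:                # brand-new accounts carry more risk
        score += 0.3
    return min(score, 1.0)

def route(text: str, author_post_count: int) -> str:
    """Auto-filter the obvious cases so humans only see the ambiguous middle."""
    score = risk_score(text, author_post_count)
    if score >= 0.8:
        return "auto-reject"       # filtered before moderators ever see it
    if score >= 0.4:
        return "moderation-queue"  # ambiguous: send to a human
    return "publish"

print(route("CHECK OUT http://a.example http://b.example http://c.example", 1))
# -> auto-reject
```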

UX Tech

Improve user-facing technology such as “like” buttons, report-abuse functions, on-topic buttons, and other tools that give users a chance to actively participate in identifying and reporting problems.
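
As a rough sketch, here’s what the backend behind a report-abuse button might look like. The three-report threshold and all names here are assumptions made for the example:

```python
# Illustrative sketch of a "report abuse" backend; the threshold and
# names are assumptions, not any specific site's design.
from collections import defaultdict

REPORT_THRESHOLD = 3  # hide content once this many distinct users flag it

reports: dict[str, set[str]] = defaultdict(set)

def report_abuse(content_id: str, reporter_id: str) -> str:
    """Record a user's report; auto-hide content that crosses the threshold."""
    reports[content_id].add(reporter_id)  # a set, so repeat clicks don't stack
    if len(reports[content_id]) >= REPORT_THRESHOLD:
        return "hidden-pending-review"    # pulled from view until a moderator decides
    return "visible"

for user in ("alice", "bob", "bob", "carol"):
    print(report_abuse("post-42", user))
# visible, visible, visible, hidden-pending-review
```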

Reputation Systems


In any online social experience, reputation is crucial. Whether that’s simply an informal reputation amongst community members or a formal points-and-badges system, reputation can help with a range of community-building activities. Moderation efforts can be significantly helped by combining UX tech and algorithmic tech with reputation status.
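
As one possible sketch of that combination, imagine weighting each abuse report by the reporter’s reputation before the automatic hide kicks in. The reputation scores and the cutoff below are invented for illustration; the design point is simply that a flag from a proven member can carry more weight than several from throwaway accounts:

```python
# Sketch of reputation-weighted flagging, combining the report-abuse
# button (UX tech) with an automatic threshold (algorithmic tech).
# Reputation values and the cutoff are illustrative assumptions.

reputation = {"longtime_member": 0.9, "new_account": 0.2, "moderator": 1.0}

AUTO_HIDE_WEIGHT = 1.5  # total weighted reports needed to auto-hide

def weighted_flag(reporters: list[str]) -> str:
    """A flag from a trusted member counts for more than one from a throwaway."""
    total = sum(reputation.get(r, 0.2) for r in reporters)  # unknowns treated as new
    return "hidden-pending-review" if total >= AUTO_HIDE_WEIGHT else "visible"

print(weighted_flag(["new_account"]))                   # visible
print(weighted_flag(["longtime_member", "moderator"]))  # hidden-pending-review
```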

For more on reputation systems, be sure to pick up the new book, "Building Web Reputation Systems," by Randy Farmer and Bryce Glass.

Tool Consistency


Undedicated moderator resources (moderators who don’t work on just one property day in, day out) spend a surprisingly large percentage of their time simply wrestling with poorly designed moderation tools that lack consistency across properties. Moderators can clear multiple pieces of content per minute, so every minute lost to a struggle with bad tools is time spent entirely the wrong way.

Programs


Specifically designed programs such as the Facebook Community Council or Disney’s Moms Panel grant additional powers to select groups of partners, customers, or users.

Gaming/Application

Moderation functions can be wrapped in a shell of activity that users enjoy as a game or a useful secondary application. Google Image Labeler, for example, makes a game out of adding descriptive words to Google’s large collection of indexed images. reCAPTCHA gives webmasters a way to ensure that people filling out online forms are, in fact, humans and not spamming tools: it asks users to type two words, one of which the reCAPTCHA software already knows, while the other is one it hasn’t yet been able to decipher.
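
To illustrate that two-word mechanic, here’s a toy model: it verifies the word it can check and harvests a human’s best guess at the one it can’t. This is a sketch of the idea, not reCAPTCHA’s actual implementation:

```python
# Toy version of the two-word mechanic described above: verify the known
# word, collect votes on the unknown one. Illustration only, not
# reCAPTCHA's real implementation.
from collections import Counter

unknown_word_votes: Counter[str] = Counter()

def check_captcha(known_answer: str, user_known: str, user_unknown: str) -> bool:
    """Pass/fail on the verifiable word; record a vote on the unverifiable one."""
    if user_known.strip().lower() != known_answer.lower():
        return False                                        # failed the known word: likely a bot
    unknown_word_votes[user_unknown.strip().lower()] += 1   # a human's best guess
    return True

# Once enough independent humans agree, treat the unknown word as solved.
check_captcha("morning", "morning", "upon")
check_captcha("river", "river", "upon")
print(unknown_word_votes.most_common(1))  # [('upon', 2)]
```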

Remember: the question isn’t which of these methods to apply; it’s in what ratio to apply each.



Jake McKee is the Chief Idea Officer at Ant’s Eye View, a strategic consulting firm that helps companies develop and execute customer engagement strategies including social media, community building, and customer service. Jake blogs at communityguy.com, and you can find him in print in the 10th anniversary reprint of The Cluetrain Manifesto, where he contributed the afterword based on his experiences applying the Cluetrain principles at LEGO.


