What Facebook Is Doing to Ramp Up Groups
By: Co.Design

I’m not sure when it happened. But at some point in the last couple of years, my use of Facebook in its most familiar form—posting on my wall and those of other members—has dwindled. Instead, I spend most of my time in Facebook groups devoted to a variety of my interests, from old cartoons to e-bikes.

There’s no question which Facebook groups are the best. They’re the ones that are managed by administrators and moderators who care enough to have a strong point of view that manifests itself in how they cultivate conversation. That includes how they deal with trouble, from minor tiffs between well-intentioned members all the way up to full-blown troll attacks.


And Facebook takes such folks and their needs seriously. Last week, in conjunction with its annual Communities Summit—normally an in-person gathering and this year a virtual event—it announced a bunch of features for groups. Some aim to make it easier for admins and moderators to do their work; others are about breaking down barriers that prevent users from finding and participating in groups they might like.


On the first front, a new set of tools called Admin Assist will tend to some of the heavy lifting of moderation. For example, a group’s creator will be able to automatically reject posts that use specific keywords or come from members who have recently joined or been troublemakers in the past. Also new are various features for real-time chat, ask-me-anything-style Q&As, and conversations driven by shared photos.
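To give a rough sense of the kind of rule Admin Assist automates, here is a minimal sketch in Python of keyword- and tenure-based post screening. The names, fields, and thresholds (BLOCKED_KEYWORDS, MIN_MEMBER_DAYS, the Post dataclass) are illustrative assumptions for this sketch, not Facebook's actual implementation or settings.

from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative rule parameters -- assumptions for this sketch, not Facebook's real values.
BLOCKED_KEYWORDS = {"spamword", "scamlink"}
MIN_MEMBER_DAYS = 7          # reject posts from members who joined very recently
MAX_PRIOR_VIOLATIONS = 0     # reject posts from members with past rule violations

@dataclass
class Post:
    text: str
    author_joined: datetime
    author_violations: int

def should_auto_reject(post: Post, now: datetime | None = None) -> bool:
    """Return True if the post trips any of the hypothetical auto-reject rules."""
    now = now or datetime.utcnow()
    words = post.text.lower().split()
    if any(word in BLOCKED_KEYWORDS for word in words):
        return True
    if now - post.author_joined < timedelta(days=MIN_MEMBER_DAYS):
        return True
    if post.author_violations > MAX_PRIOR_VIOLATIONS:
        return True
    return False

# Example: a post from a member who joined yesterday gets flagged for rejection.
new_member_post = Post(
    text="Check out this deal!",
    author_joined=datetime.utcnow() - timedelta(days=1),
    author_violations=0,
)
print(should_auto_reject(new_member_post))  # True

In practice an admin would presumably configure such rules through the group settings rather than code, but the logic reduces to simple checks like these applied before a post appears.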


Facebook is also introducing a way for admins to make money from their groups—not through advertising, but by giving them access to an existing service called Brand Collabs Manager. In the past, influential individual members have been able to use this service to strike promotional deals with brands that want to reach specific audiences; now, groups will be able to do so as well.


Then there are the tweaks designed to get more users into more groups. “Related Discussions” will push material from groups into users’ news feeds, exposing them to new groups and conversations. And public groups will now let new members join without being approved by an administrator or moderator. (People who run groups can still use techniques such as asking screening questions to vet newcomers before allowing them to post.)


What all of these diverse changes have in common, VP of the Facebook app Fidji Simo told me, is that they reflect the evolution of Facebook Groups since the feature’s introduction a decade ago.


“When you go back to when we created Groups, it was really meant to be a space for you to connect to groups of people already in your life—your family, your soccer team, your book club,” she says. “But since then, Groups has evolved massively to be a place that is not just about connecting with people you already know, but with people all over the world on any topic that is important to you.” Today, 1.8 billion people use the feature each month, and half of all members belong to at least five active groups.
 

The dark side of community


Now, it must be noted that there’s nothing inherently ennobling about groups on Facebook. Indeed, they have proven an efficient way to spread hate and misinformation. The most appalling recent example: Facebook’s failure to swiftly act against a group that coordinated plans to shoot protesters in Kenosha, Wisconsin, as reported by BuzzFeed’s Ryan Mac and Craig Silverman. More recently, The New York Times’s Ben Decker wrote about an 1,800% increase in membership for anti-mask groups on the service. And Mother Jones’s Kiera Butler has explained how QAnon hoaxes seep into Facebook groups about parenting.


When I spoke to Simo, she devoted a sizable chunk of her time to Facebook’s measures to fight hateful and otherwise dangerous use of groups. “I want to be clear that none of what we are doing matters until we keep people really safe in groups,” she told me.


She rattles off stats relating to the company’s efforts on that front: “We’ve actually removed about 1.5 million pieces of content for violating our policies on organized hate, 91% of which we actually found proactively before people reported that to us. We also removed about 12 million pieces of content in groups for violation of policies on hate speech, 87% of which we found proactively. And when it comes to groups themselves, we will take down an entire group if it repeatedly breaks our rules, or if it was set up with the intent to violate our standards.”


Facebook is also tightening the screws on administrators who have violated its community standards in the past. For instance, a new policy prevents such people from creating any new groups for 30 days—which, though it certainly falls short of zero tolerance, might discourage them from further misbehavior.


In addition, the company is removing health-related groups from its recommendations—not because there aren’t valuable health groups on Facebook, Simo says, but because “we want to make sure that people get their health information from authoritative sources.” Presumably, the bottom line is that the company isn’t trying to police the quality of health groups on the service, but at least doesn’t want to actively steer people to sources of dubious advice.

 

As with everything else on Facebook, the scale of Facebook Groups is such that the company can make lots of progress and still have major problems on its hands. For example, while Facebook announced in August that it had banned or restricted more than 2,800 QAnon-related groups, multiple reports have said it was slow to take action and that such groups have proven adept at evading the crackdown. Moreover, there is no blanket ban on QAnon groups, just on ones that run afoul of policies such as bans on “coordinated inauthentic activity” and calls for violence. (You don’t have to dig deep to find groups that self-identify as being dedicated to QAnon advocacy; all you have to do is search for “QAnon.”)


And then there’s the positive side


For all the alarming stories relating to Facebook Groups, there are also inspiring ones. In 2015, Latasha Morrison wanted to spark a healthy dialogue about racial disparities and injustices in the U.S. “I wasn’t an organization,” she says. “I was just a person who saw the brokenness in the world and wanted to create some conversations and some solutions around the racial problem in America. I didn’t have a website, and I wanted to gather people where they could continue to learn. And Groups was the best option for that.”


