Facebook has steadily rolled out updates to Facebook Groups that give administrators better ways to manage and moderate their online communities. Recently, these have included a combination of product releases – such as automated moderation tools and alerts about heated debates – as well as new policies aimed at keeping groups in check. Today, Facebook says it is introducing two more changes. It will now enforce stricter measures against group members who break its rules, and it will make some of its removals more transparent with a new “Flagged by Facebook” feature.
Specifically, Facebook says it will begin to demote all Groups content from members who have violated Facebook’s Community Standards anywhere on the platform. In other words, people who break Facebook’s rules elsewhere will see the content they share in groups downranked, even if they have not violated any of those groups’ own rules and policies.
By “demoting,” Facebook means it will show the content these members share lower in the News Feed. This practice, also known as downranking, is an enforcement measure Facebook has previously used to penalize content it wanted to see less of in the News Feed – such as clickbait, spam, or even posts from news organizations.
In addition, Facebook says these demotions will become more severe as members accrue more violations across Facebook. Because Facebook’s algorithms rank News Feed content in a way that is personalized to each user, it can be difficult to track how well such demotions will or won’t work in practice.
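The escalating-demotion idea can be sketched as a toy ranking function. Facebook has not disclosed how its demotions are actually computed, so the penalty step, the formula, and all names below are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    base_score: float  # relevance score from an upstream ranking model

# Assumed penalty step per platform-wide violation; not a real Facebook value.
DEMOTION_PER_VIOLATION = 0.25

def demoted_score(post: Post, violations: dict[str, int]) -> float:
    """Shrink a post's score based on its author's violation count."""
    count = violations.get(post.author, 0)
    # More violations -> larger divisor -> stronger demotion in the feed.
    return post.base_score / (1 + DEMOTION_PER_VIOLATION * count)

def rank_feed(posts: list[Post], violations: dict[str, int]) -> list[Post]:
    """Order the feed by demoted score, highest first."""
    return sorted(posts, key=lambda p: demoted_score(p, violations), reverse=True)

posts = [Post("alice", 0.9), Post("bob", 1.0)]
violations = {"bob": 4}  # bob has four platform-wide strikes
feed = rank_feed(posts, violations)
# bob's higher-scoring post now ranks below alice's.
```

The key property this sketch captures is the one Facebook describes: the demotion compounds with each additional violation, rather than being a fixed one-time penalty.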
Facebook also tells us that the demotions currently apply only to the main News Feed, not to the dedicated feed on the Groups tab, where you can browse posts from all your groups in one place.
The company hopes this change will reduce rule-breaking members’ ability to reach others, and notes that it joins existing Groups penalties for rule violations, which include restricting users’ ability to post, comment, add new members to a group, or create new groups.
The other change is the launch of the new feature called “Flagged by Facebook.”
Photo credits: Facebook
This feature shows group administrators which content has been flagged for removal before it is shown to their wider community. Administrators can then choose to remove the content themselves, or review it to see whether they agree with Facebook’s decision. If they don’t, they can request a review from Facebook and provide additional feedback on why they believe the content should remain up. This could be useful in cases of automated moderation mistakes. By allowing administrators to step in and request reviews, members could potentially be protected from an unnecessary strike and removal.
The feature joins an existing option that lets members appeal a removal to group administrators when a post is found to be in violation of Community Standards. This one, instead, gives administrators a way to get more proactively involved in the process.
Unfortunately for Facebook, systems like this only work when groups are actively moderated, which is not always the case. Although groups may have designated administrators, if those admins stop using Facebook or stop managing the group without adding another administrator or moderator to take over, the group can descend into chaos – especially if it’s larger. A member of one sizable group with over 40,000 members told us that their administrator had not been active in the group since 2017. Members know this, and some take advantage of the lack of oversight to post whatever they like at times.
This is just one example of how Facebook’s Groups infrastructure is still very much a work in progress. If a company were building a platform for private groups from scratch, policies and procedures – like how content removals work, or what the penalties are for breaking rules – probably wouldn’t be additions made years later. They would be foundational elements. And yet, right now, Facebook is rolling out what should have been established protocols for a product that first arrived in 2010.