Facebook has released new guidelines on how it will handle Facebook Groups that spread misinformation, break community guidelines, or promote violence. The update comes after Facebook faced sustained criticism over hate groups using its platform, as well as over its lackluster response to misinformation in general.
Facebook has recently come under fire for several instances in which the company failed to restrain its users. After Trump made a post encouraging voters to vote twice on election day, which is illegal, Facebook only added a disclaimer addressing the trustworthiness of voting by mail. Likewise, disinformation about the wildfires in the western United States has spread widely, with Facebook once again serving as a primary platform for it. Facebook is also currently investigating a post in a Facebook group that has been linked to the Kenosha shootings.
In an announcement, Facebook detailed further steps it is taking to stop the spread of disinformation and crack down on violent speech. Moving forward, users whose Facebook Groups have been taken down for misinformation or hate speech will not be allowed to create new groups for a period of time following the previous group's removal. Additionally, if individual posts are taken down from a Facebook Group, any new posts from those users will require admin approval. To prioritize accurate information, Facebook will restrict health groups by no longer actively recommending them to users. Potential hate groups will also be removed from recommendations, restricted in search, and have their content demoted in the News Feed. Groups that discuss potential violence will be removed outright.
Facebook also took the opportunity to emphasize the importance of good admins in these groups, announcing that it will remove groups that have had no admins for an extended period. The company will additionally add ways for admins who are stepping down to easily invite other group members to become admins, and will begin reaching out to active members and inviting them to take on the role in groups with no active admins. If administrators or moderators of a Facebook Group repeatedly approve posts that violate Facebook's community guidelines, the group itself will be removed.
To combat misinformation, Facebook Groups whose posts fact-checkers have repeatedly marked as false will be ranked lower in the News Feed and no longer recommended to users. When fact-checkers rate content as false, Facebook adds a label to it and warns users of the rating before they attempt to share it. Whether any of these changes will actually solve the Facebook Groups misinformation problem remains to be seen, although collectively they should make a difference.
Source: Facebook