Facebook Might Be More Divisive Than Expected, Based On Its Own Research

Facebook appears to have ignored its own internal research showing how divisive the social media platform can be. The finding adds to a growing list of accusations Facebook has faced in recent times regarding its content and administrative practices, and it further highlights why Facebook users need to be aware of the mechanics at play on the platform.

This revelation comes in the midst of a pandemic in which people are spending more time on platforms like Facebook, where they are exposed to all kinds of misinformation. However, it is not the first time Facebook has been criticized for not taking responsibility for its content. Last year, a number of well-known personalities boycotted Facebook when it was reported that the social media giant had no issue with promoting paid political ads, even though they often contained false and divisive information. The platform's capability to swing public opinion has been well documented, and since the controversial 2016 US Presidential election and the UK's Brexit referendum, it has become an important topic for governments and communication experts alike. The issue escalated severely after the 2018 Cambridge Analytica scandal, which revealed that user information from millions of Facebook accounts had been used to target people with political advertising.

Related: Facebook Moderators To Receive $52 Million Settlement For Psychological Damages

The latest allegation is just as serious as those that came before it. In an investigation, the Wall Street Journal found that Facebook leadership ignored and shelved the company's own internal research indicating the platform was, by design, promoting divisiveness. The researchers found that Facebook's recommendation algorithm actively promotes divisive content, pages, and groups to users, suggesting a serious and fundamental design flaw that would require radical changes to overcome. However, the investigation also revealed that Facebook leadership was unwilling to take that path and make the appropriate changes to the platform.

According to the report, Facebook's powerful recommendation algorithm exploits the "human brain's attraction to divisiveness" and, if left unfixed, would continue to divide users even further as a way to increase engagement. This basic design flaw suggests that Facebook is far more polarizing than many believe it to be. However, the more serious allegation is that Facebook executives, including policy chief Joel Kaplan, took no action to solve the issue.

A former Deputy Chief of Staff in the George W. Bush administration, Kaplan was responsible for vetting the changes proposed by the research team, which included rethinking how Facebook products work, creating additional features to reduce social media wars over sensitive topics, and preventing coordinated content manipulation efforts. The research team added that making these changes would mean taking a 'moral stand' and risking reduced user engagement and growth. The team also found that the changes would affect conservative politics far more than liberal politics, based on the suggestion that a greater proportion of divisive content comes from right-wing users and groups.

Although a disturbing revelation, the findings are in line with how Facebook has responded to calls from different quarters to regulate and fact-check content, especially political ads. In a recent CNBC interview, Mark Zuckerberg reiterated that social media websites should not be fact-checking political speech. Meanwhile, platforms like Twitter and Spotify have taken bolder steps, banning political ads and even starting to flag posts from political leaders that are deemed potential misinformation.

Failure to regulate harmful content is one thing, but the knowledge that Facebook's own recommendation algorithm is, by design, promoting polarizing content for increased user engagement is quite another. Even more shocking is that Zuckerberg and fellow Facebook executives felt it was better to let this continue rather than make changes to address the issue.

More: Why Facebook & Twitter Decided To Have Employees Work Remotely From Home Permanently

Source: WSJ
