FB ‘Ranking Failure’ Leads to Increased Misinformation on the Platform

Recently, a group of Facebook engineers discovered a “huge ranking failure” that exposed up to half of all News Feed views to potential “integrity threats” over the past six months, according to an internal report. The report, which circulated within the company last week, described a sudden influx of misinformation flowing through the News Feed.

Rather than suppressing posts from repeat misinformation offenders that had been flagged by the company’s network of outside fact-checkers, the News Feed distributed them more widely, increasing their views by as much as 30% internationally. The engineers watched the surge subside a few weeks later and then flare up again until the ranking issue was fixed on March 11th.

In addition to posts flagged by fact-checkers, Facebook’s systems failed to properly demote probable nudity, violence, and even Russian state media, which the social network recently pledged to stop recommending in response to the country’s invasion of Ukraine. Internally, the problem was labeled a level-one SEV, or site event, a designation reserved for high-priority technical issues.

Meta spokesman Joe Osborne confirmed the incident, stating that the company “detected irregularities in downranking on five distinct occasions, which matched with a minor, temporary rise in internal metrics.” Furthermore, Osborne said,

 We traced the root cause to a software bug and applied needed fixes.

Osborne added that the bug “has not had any meaningful, long-term impact on our metrics” and did not apply to content that met the system’s threshold for deletion.

For years, Facebook has promoted downranking as a way to improve the quality of the News Feed, and it has steadily expanded the types of content its automated system acts on. Downranking has been used in response to conflicts and contentious political stories, prompting fears of shadow banning and calls for legislative action. Despite its growing importance, Facebook has yet to fully explain its influence on what people see and, as this incident demonstrates, what happens when the system fails.

Downranking, according to CEO Mark Zuckerberg, counteracts people’s natural tendency to engage with “more sensationalist and controversial” content. According to Zuckerberg,

Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterward they don’t like the content.
