Content identified as misleading or problematic was mistakenly prioritised in users’ Facebook feeds recently, because of a software bug that took six months to fix, according to tech website The Verge.
Facebook disputed the report, which was published Thursday, saying it “vastly overstated what this bug was because ultimately it had no meaningful, long-term impact on problematic content,” according to Joe Osborne, a spokesman for parent company Meta.
But the bug was serious enough for a group of Facebook employees to draft an internal report describing a “massive ranking failure” of content, The Verge reported.
In October, the employees noticed that some content that had been marked as questionable by outside media – members of Facebook’s third-party fact-checking programme – was nevertheless being favoured by the algorithm for wide distribution in users’ News Feeds.
“Unable to find the root cause, the engineers watched the surge subside a few weeks later and then flare up repeatedly until the ranking issue was fixed on March 11,” The Verge reported.
But according to Osborne, the bug affected “only a very small number of views” of content.
That’s because “the overwhelming majority of posts in Feed aren’t eligible to be down-ranked in the first place,” Osborne explained, adding that other mechanisms designed to limit views of “harmful” content remained in place, “including various demotions, fact-checking labels and violating content removals.”
AFP currently works with Facebook’s fact-checking programme in more than 80 countries and 24 languages. Under the programme, which started in December 2016, Facebook pays to use fact checks from around 80 organisations, including media outlets and specialised fact-checkers, on its platform, on WhatsApp and on Instagram.
Content rated “false” is downgraded in news feeds so fewer people see it. If someone tries to share such a post, they are presented with an article explaining why it is misleading.
Those who still choose to share the post receive a notification with a link to the article. No posts are taken down. Fact-checkers are free to choose how and what they wish to investigate.