How a Facebook bug promoted fake news instead of making it disappear

According to the specialized American site The Verge, a bug in the Facebook platform boosted questionable content that should, on the contrary, have had its visibility reduced on the social network. The bug's effects were felt from October until March.

Facebook has been affected by a major bug in its algorithms… since 2019 © AFP / OLIVIER DOULIERY

For Facebook, this is a “Site Event” (SEV), the highest level of internal alert, reserved for the most serious crises. The American media outlet The Verge got its hands on an internal Meta incident report according to which the social network was affected by a major bug that introduced errors into how content was ranked and promoted. The bug affected an algorithm that handles “half of all news feed views” on Facebook worldwide, and its effects were felt from October until it was fixed on March 11, according to information obtained by The Verge.

The problem, identified in October by Facebook engineers, concerns the algorithm responsible for promoting or, conversely, demoting content according to its classification (content containing nudity or violence, but also false information and hateful content). Publications flagged as “borderline”, that is, at the limit of violating the site's terms of use, are detected by an artificial intelligence and are supposed to be made less visible. In 2018, Mark Zuckerberg explained that this system counters the tendency to click more readily on “more sensationalist and provocative” content. The algorithm is also meant to demote content suspected of violating the terms of service while it awaits human review.
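To picture the mechanism described above, here is a purely illustrative sketch in Python of a multiplicative “demotion” step applied to a post's ranking score. None of this is Facebook's actual code; the score model, the `DEMOTION_FACTOR` value, and the function names are assumptions made for illustration only.

```python
# Hypothetical sketch of a feed-ranking demotion step.
# Assumption: each post has a numeric ranking score, and demotion
# is modeled as a simple multiplicative penalty.

DEMOTION_FACTOR = 0.5  # hypothetical penalty for borderline content


def rank_score(base_score: float, is_borderline: bool) -> float:
    """Return the feed-ranking score after applying any demotion.

    Posts flagged as borderline (by an AI classifier, in the article's
    description) are supposed to end up with a lower score, and thus
    lower visibility in the feed.
    """
    if is_borderline:
        return base_score * DEMOTION_FACTOR
    return base_score


# Intended behavior: a flagged post ranks lower than a normal one.
print(rank_score(100.0, is_borderline=True))   # demoted score
print(rank_score(100.0, is_borderline=False))  # unchanged score
```

The bug reported by The Verge produced the opposite effect: instead of being penalized, flagged posts gained visibility, by up to 30% according to the internal memo.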

A 30% increase in visibility for questionable content

Last year, the same fate of “demotion” was even to be applied to all political content on Facebook. This ranking algorithm, which promotes some content and demotes the rest, has therefore taken on considerable importance, but there is a flip side: in the event of a bug, such as the one identified by Facebook's engineers, these publications end up being promoted instead. The issue was reported in an internal memo, and the company confirmed the malfunction to The Verge, adding that these posts, some of which contained false information or were linked to Russian state media, saw their visibility increase by up to 30%.

While Facebook's engineers say they have fixed the failure, they are unable to explain its origin. The first problems date back to 2019, but according to internal documents, their effects were not felt until October. A Meta spokesperson assured The Verge that the bug would have no long-term impact on the platform.