Facebook censors scientific information

Facebook would rather censor scientific information from highly credible sources for a while than tolerate harmful posts for too long, while it works to improve algorithms that are still unable to detect sarcasm.


Agence Science-Presse is a non-profit media outlet that has existed for 43 years. It notably gave rise to the magazine and TV show Les Débrouillards.

The Agency published on its Facebook account at the end of March the article Arguments and Strategies of the Anti-Vaccine Movement.

The explanation Agence Science-Presse obtained after the second removal of its post.

Facebook screenshot


The article, however, was removed on March 28 “for failing to meet community standards,” and the page was “restricted for 90 days for repeatedly sharing false information,” according to the notices Facebook placed on the Agency’s page.

The Agency then stepped up its efforts to correct the situation, which took more than three weeks (see table).

“It’s frustrating in the moment, but it’s so absurd that in the end we try to laugh about it,” says Pascal Lapointe, the Agency’s editor-in-chief.


Into the void

“With a newspaper, a TV station or a radio station, we would have called the editor and solved the problem in 15 minutes. But here we are in a total fog. We don’t know who to complain to; we can’t talk to a manager. We press a button and don’t know whether it goes anywhere. And when the problem is finally solved, we don’t know what they based their decision on,” he adds.

On how Facebook operates, Mr. Lapointe sees only two possibilities.

“Somewhere in the algorithm, after a certain number of complaints, the robot reacts and blocks the site. I wouldn’t be surprised if anti-vaxxers pressed a button claiming that we publish fake news,” he suggests.

The other possibility, according to him, is that the algorithm analyzes certain words.

“The article mentions that for a century, anti-vaxxers have been citing the same arguments. It said, for example, ‘the first falsehood: vaccines are dangerous.’ Does the robot automatically flag this as fake news?” he wonders.

As a non-profit organization, the Agency did not lose money, “but it had an impact on traffic. A bunch of small sites like us are too dependent on Facebook. That’s the problem; we would rather not be,” says Mr. Lapointe.

Explanations

The example of Agence Science-Presse is not unique in the science popularization community.

“We published a video on natural immunity, with references to scientific publications. Facebook blocked the post and gave us a warning. At the next ‘infraction’ we will lose the page,” laments Dr. Mathieu Nadeau-Vallée.

Facebook’s parent company says it understands the “irony” of seeing a site dedicated to fighting misinformation being blocked for this reason.

“We are sorry this happened. Our systems are not perfect, we know that. We continue to work on it,” explained a spokesperson for Meta.

“Meta prefers to delete a publication that is not harmful and reinstate it two days or two weeks later, rather than leave a (harmful) one up for too long,” he added, explaining that the multinational must manage three billion users and a hundred languages.

The explanation leaves the Agency’s editor skeptical, “considering the number of fake-news accounts that have been reported over the years around the world and have remained online,” says Pascal Lapointe.


The May 1 post from Dr. Alain Vadeboncoeur.

Facebook screenshot


How it works

The first line of verification for posts on Facebook is an algorithm that uses artificial intelligence. Disputed cases are then reviewed by an employee.

“Each employee has a specialty, and if it concerns COVID, for example, it goes into the same queue. Maybe that’s why it took longer,” Meta said, explaining the three weeks it took to resolve the Agency’s issue.

Meta would not reveal how its first, AI-based line of verification works.

“We don’t give out the recipe; it would be easy to game it afterward,” the spokesperson said.

You mustn’t laugh!

Also a science popularizer, Dr. Alain Vadeboncoeur recently had four of his posts censored by Facebook.

“The first two were slightly ironic. The third (in February) followed a text comparing COVID’s mortality rate to that of the flu, which had provoked a lot of reactions. I had added context insisting on certain well-founded points,” he recalls.

“I lost access to my account for 24 to 48 hours, then I was banned from posting on other pages. My interpretation is that there were quite a few reports on this post, which made Facebook’s algorithms react. It would be interesting to know how they end up suspending people whose message has always been on the side of science,” says Dr. Vadeboncoeur.

For the fourth, on Monday, Dr. Vadeboncoeur had clearly indicated at the top of his post that it was irony, publishing an image giving an absurd recipe for “devaxing.”

“The challenge for artificial intelligence is often to detect sarcasm. The algorithms are not yet strong enough to understand that Dr. Vadeboncoeur is using humor or sarcasm. That’s why a human comes in at the second level of verification,” replied the Meta spokesperson.

Counterproductive

Dr. Vadeboncoeur acknowledges that irony is not “always easy” to detect. But this flaw also has a major impact on the hunt for fake news.

“In the most recent case, I had to spell out that it was (irony), and it didn’t change anything. It amounts to no longer being able to denounce an erroneous image or publication,” he says.

And while he understands the precautionary principle Facebook invokes, blocking publications first and analyzing them afterward, the doctor, who is followed by an estimated 50,000 people, believes the multinational could act differently.

“As long as Facebook recognizes a real identity (blue verification badge) and it is a highly followed account, it seems absurd to me to suspend people who spend their lives fighting disinformation over a touch of irony, or even over a misreading of the situation,” laments Dr. Vadeboncoeur.

Facebook's blocking notice on May 3.

Facebook screenshot


Solutions

Pascal Lapointe of Agence Science-Presse argues for regulation of algorithms.

“It’s just one more example. We don’t know who manages this or how it is managed. It’s opaque and abnormal that we, Dr. Vadeboncoeur, Dr. Nadeau-Vallée and many others are blocked without anyone understanding why. These companies have achieved a power over information worldwide that has no equivalent in history,” he said.

On its website, the Agency notes that its work in information literacy earned it in 2019 “an award from the Canadian Journalism Foundation, a prize funded by … Facebook!”

Dr. Mathieu Nadeau-Vallée suggests another solution: “Facebook should give an official seal to certain scientific pages.”

Regulations

Simon Thibault, assistant professor of political science at the University of Montreal, would not comment directly on these cases, but he believes they illustrate “the technical challenge of ensuring that algorithms are more vigilant.”

“Facebook, for example, wanted to be more intransigent with misinformation in the context of the pandemic. But the volume of information to process is staggering. These platforms have been criticized; they are trying to clean house and, of course, incongruous situations like these will arise,” he believes.

A committee of 12 experts was created on March 30 to help the Canadian government draft legislation to combat harmful content online.

“People are saying that the self-regulation of these platforms is not working. That there must be an intervention of state legislation to ensure that these companies really put the resources and the necessary efforts so that the real problematic content is removed”, analyzes Mr. Thibault.

A 3-WEEK WAIT

March 28: Facebook removes the Agence Science-Presse article. The Agency contacts Facebook, but the employee can do nothing.

March 29: The Agency republishes its article.

March 31: The article is removed again.

April 1: The Agency contacts a Facebook executive, Kevin Chan, and explains the situation.

April 4: The executive refers the Agency to the Director of Media Partnerships.

April 8: Having heard nothing, the Agency presses customer service, which agrees “exceptionally” to refer the matter to a superior.

April 14: The Agency again contacts Facebook and the two executives.

April 20: The situation is resolved.
