Fairplay, an association dedicated to the well-being of children, has just published a report estimating that Instagram has exposed 20 million people to pro-eating-disorder content. Content promoting eating disorders has always existed on social networks, but this report shows worrying figures, especially for young people. Recommendation algorithms have become ever more efficient, sometimes giving people exactly what they want, to the detriment of their health.
According to the study, nearly 20 million people with an Instagram account are “fueled by content from the pro-Eating Disorder bubble”. According to Instagram’s own figures, the application has 1.386 billion monthly active users and 500 million daily active users worldwide. In France, 69% of daily visitors are in the 15-24 age group, so Instagram’s audience skews especially young. Moreover, many users lie about their age, since the minimum is set at 13, so we can assume these figures do not fully reflect reality.
The report clearly shows that the algorithms push pro-anorexia and pro-eating-disorder content to millions of users, including accounts belonging to 13-year-olds, and that many of the nearly 20 million users caught in this bubble are teenagers or younger. To conduct the study, the researchers identified 153 “seed accounts” that were public, had more than 1,000 followers, and advocated for eating disorders. They calculated that around 1.6 million Instagram users follow at least one of these accounts, and that 88,655 follow three or more. The researchers then found that nearly 20 million Instagram users follow at least one of those 88,655 accounts, and could be enticed to follow seed accounts themselves because they share a mutual connection.
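The counting method described above amounts to simple set operations on a follower graph. As a minimal sketch (with made-up toy data, not the report's actual dataset or code), the three tallies could be computed like this:

```python
# Illustrative sketch of the report's counting method using toy data.
# follows[user] = set of accounts that user follows (hypothetical graph)
follows = {
    "u1": {"seed_a", "seed_b", "seed_c"},  # follows three seed accounts
    "u2": {"seed_a"},                      # follows one seed account
    "u3": {"u1"},                          # follows a heavy follower, not a seed
    "u4": {"cats", "travel"},              # follows no relevant accounts
}

seed_accounts = {"seed_a", "seed_b", "seed_c"}

# Users following at least one seed account (the report's ~1.6 million)
one_plus = {u for u, f in follows.items() if f & seed_accounts}

# Users following three or more seed accounts (the report's 88,655)
three_plus = {u for u, f in follows.items() if len(f & seed_accounts) >= 3}

# Users one hop away: they follow at least one heavy follower
# (the report's nearly 20 million potentially exposed users)
exposed = {u for u, f in follows.items() if f & three_plus}

print(sorted(one_plus))    # users directly following a seed
print(sorted(three_plus))  # users following 3+ seeds
print(sorted(exposed))     # users following a heavy follower
```

The key point the sketch makes concrete is the one-hop amplification: the exposed group is counted from who follows the heavy followers, not from who follows the seed accounts directly, which is why it is so much larger.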
One of the social network’s big problems is its suggestions of users to follow. As soon as we show an interest in weight loss, Instagram very quickly drowns us in recommendations to follow accounts sharing the same interest. But what could Instagram do against this scourge? “You just need to put some safeguards around the follow recommendation algorithm to burst that bubble,” Farthing said. The American site BuzzFeed asked the company why such a safeguard did not already exist. Liza Crenshaw, spokesperson for Instagram’s parent company, responded:
Following up on this point, Liza Crenshaw insists that when users search for or post eating-disorder-related content, the company highlights organizations that can provide help, and that users have the ability to report content related to eating disorders. Furthermore, accounts that explicitly share self-destructive content are not recommended, and the social network is working to reduce search results tied to hashtags or handles associated with these accounts.
In the United States, measures have already been taken. US Senators Richard Blumenthal and Marsha Blackburn recently introduced the Kids Online Safety Act, which would require platforms to “act in the best interests of a minor” using their services. Additionally, the California legislature is considering a provision inspired by the UK’s Age Appropriate Design Code, which would require companies to consider the best interests of children when creating or modifying their algorithms.