Instagram’s Algorithm Under Scrutiny for Showing Inappropriate Content to Teen Followers

Introduction

Instagram, a platform owned by Meta, is facing backlash over an algorithm that reportedly served inappropriate content, including child-sexualizing reels, to followers of teen influencers. An experiment by The Wall Street Journal found that the platform showed risqué and sexual adult videos to accounts that followed young gymnasts and cheerleaders. Ads from well-known brands such as Disney, Walmart, and Pizza Hut appeared alongside this content, and Meta’s response has left many advertisers concerned about where their ads are placed on the platform. The controversy follows a separate uproar over antisemitic content on X, another social media company, which has already heightened scrutiny of online content algorithms.

The Experiment and Results

The Wall Street Journal’s experiment found that Instagram’s Reels video service served troubling content to test accounts that exclusively followed teen influencers, including sexualized adult videos and footage featuring minors. Content of this kind is prohibited under Meta’s own policies. The report also noted that the Canadian Centre for Child Protection ran similar tests with the same results.

Repercussions for Advertisers

Ads from Disney, Walmart, Pizza Hut, Bumble, Match Group, and The Wall Street Journal itself appeared alongside the inappropriate content. Some of these advertisers, including Bumble, Match Group, and the telehealth company Hims, have either pulled their ads from Meta or demanded that the issue be addressed. The repeated controversies involving X and Instagram have made advertisers more cautious about the kind of content their ads are displayed next to.

Meta’s Response

Meta responded to the criticism by assuring clients that it was investigating the issue. The company also pledged to pay for brand-safety auditing services to determine how often ads appear next to unacceptable content. However, Meta offered no timeline and no details on how it plans to prevent such placements in the future.

Internal Awareness and Challenges

According to current and former Meta employees interviewed by the WSJ, the algorithm’s tendency to aggregate child-sexualizing content was a known issue internally, even before the launch of Reels. Employees argued that the platform’s recommendation algorithms would need to be substantially revamped to address the problem effectively. However, internal documents indicated that Meta prioritizes traffic performance, making it difficult for safety teams to implement significant changes.

Source: Instagram reportedly served up child-sexualizing reels to followers of teen influencers
