In response to mounting concerns about its platforms recommending sexually explicit content involving children, Meta has announced an expansion of its child safety features. The move follows extensive reporting, most notably by The Wall Street Journal, detailing instances in which Facebook and Instagram delivered inappropriate and sexual content to users, including children.
A comprehensive report in June uncovered Instagram’s connection to a network of accounts involved in buying and selling explicit material related to child abuse. Instagram’s recommendation algorithms actively steered these accounts toward one another.
Further investigations revealed how pervasive the issue is within Facebook groups, uncovering an ecosystem of accounts and groups engaged in the sexual exploitation of children, some with memberships of around 800,000.
Meta’s recommendation system allowed abusive accounts to find each other through features such as “Groups You Should Join” on Facebook and automatic tagging on Instagram.
To address these concerns, Meta has imposed restrictions on how suspicious adult accounts interact with each other. Measures include preventing these accounts from following each other on Instagram, excluding them from recommendations, and making comments from such accounts invisible to other suspicious accounts.
Meta has also broadened its list of terms, phrases, and emojis related to child safety, incorporating machine learning to detect connections between different search terms.
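Meta has not published details of how that detection works. As a rough illustration of the general idea only, the sketch below flags search queries that closely resemble a seed list of blocked terms using character n-gram similarity; the seed list, threshold, and n-gram approach are assumptions made for this example and are not Meta’s actual implementation.

```python
# Illustrative sketch only: flag search queries that resemble a seed blocklist
# using character n-gram cosine similarity. The seed list, threshold, and
# technique are assumptions for demonstration, not Meta's system.
from collections import Counter
from math import sqrt


def ngrams(term: str, n: int = 3) -> Counter:
    """Count character n-grams of a lowercased, padded term."""
    padded = f"  {term.lower().strip()}  "
    return Counter(padded[i:i + n] for i in range(len(padded) - n + 1))


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two n-gram count vectors."""
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def is_suspicious(query: str, seed_terms: list[str], threshold: float = 0.6) -> bool:
    """Return True if the query closely resembles any seed term."""
    q = ngrams(query)
    return any(cosine(q, ngrams(seed)) >= threshold for seed in seed_terms)


if __name__ == "__main__":
    # Hypothetical, benign placeholder seed list for demonstration.
    seed = ["blockedterm", "bannedphrase"]
    for query in ["blockedterms", "holiday photos"]:
        print(query, "->", "flag for review" if is_suspicious(query, seed) else "ok")
```

A production system would more likely rely on learned embeddings and behavioral signals rather than surface string similarity, but the goal is the same: catching variations and obfuscations of known harmful search terms.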
These developments coincide with increased pressure from U.S. and EU regulators, who are urging Meta to take more decisive steps in ensuring child safety across its platforms. Mark Zuckerberg and other CEOs from major tech companies are scheduled to testify before the Senate in January 2024, specifically addressing the issue of child exploitation online.
In November, EU regulators gave Meta a final deadline to provide information on its measures to protect minors. The new request explicitly highlighted concerns about the circulation of self-generated child sexual abuse material on Instagram and about the platform’s recommendation system.