Facing mounting regulatory pressure worldwide to curb harm from online content, Meta Platforms announced on Tuesday that it will introduce additional protections for teenagers on Facebook and Instagram.
In a statement, Meta said it would apply more restrictive content control settings to teenage users on both Facebook and Instagram. The company also said it would limit additional search terms on Instagram, making it harder for teenagers to find content related to sensitive topics such as suicide, self-harm, and eating disorders through features like “Search” and “Explore.”
Meta said the measures, which will roll out over the coming weeks, are designed to ensure teenagers see age-appropriate content.
The announcement comes as Meta faces heightened scrutiny in both the United States and Europe over allegations that its apps are addictive and have helped fuel a mental health crisis among young people.
In October, attorneys general from 33 U.S. states, including California and New York, sued Meta, alleging that the company repeatedly misled users about the risks of its platforms.
Meanwhile, the European Commission has asked Meta for detailed information on the measures it is taking to shield children from illegal and harmful content.
Regulatory pressure intensified after a former Meta employee testified before the U.S. Senate, accusing the company of knowing about harassment and other harms teenagers experienced on its platforms while failing to take adequate action against them.