Meta Platforms said on Tuesday that it will hide more types of content from teenagers on Facebook and Instagram, following pressure from regulators around the world to limit harmful content on its platforms, according to a report by Reuters.
The company said it will place teenagers' accounts in these apps under its most restrictive content control settings and will restrict additional search terms on the photo-sharing app Instagram. It added that this step will make it harder for teenagers to come across sensitive content such as suicide, self-harm, and eating disorders when using Instagram features, including Search and Explore.
The company said the protection measures, expected to roll out in the coming weeks, will help ensure that teenagers see content appropriate for their age.
Meta faces pressure in the United States and Europe over allegations that its apps are addictive and have contributed to a youth mental health crisis.
In October, the attorneys general of 33 U.S. states, including California and New York, filed lawsuits against the company, saying it had repeatedly misled users about the risks of its platforms.
In Europe, the European Commission has requested information on the measures Meta takes to protect children from illegal and harmful content.
The regulatory pressure followed testimony before the U.S. Senate by a former Meta employee, who accused the company of knowing about the harassment and other harms teenagers face on its platforms but failing to act against these problems.