Meta has announced that it is implementing additional safeguards to protect teenage users from unwanted direct messages on Instagram and Facebook, according to Reuters. The move comes weeks after the company, which also owns WhatsApp, said it would hide more content from teenagers, following pressure from regulators to protect children from harmful material on its apps.
Meta said teenagers will no longer receive direct messages on Instagram from anyone they do not follow or are not connected to. They will also need parental consent to change certain settings in the app. Previously, guardians received a notification when teenagers changed these settings but could not act on it.
As an example, the company said guardians could block a change if a teenage user tried to switch their account from “private” to “public,” relax the sensitive content control from “Less” to “Standard,” or alter who can send them messages.
The new restrictions apply by default to all users under 16. Meta said it would alert existing users to the change via a notification. Messenger users will still be able to receive messages from Facebook friends and people in their phone contacts.
Meta also plans to launch a feature that shields teenagers from unwanted or inappropriate images in their direct messages. The company said the feature will work even in end-to-end encrypted conversations and is also intended to discourage teenagers from sending such images.
Scrutiny of Meta has intensified since a former employee testified before the U.S. Senate that the company knew about harassment and other harms teenagers faced on its platforms but failed to address them.