Executives at Meta, the parent company of Facebook, reportedly dismissed the idea of disabling the “People You May Know” feature despite an employee’s warnings about potential child exploitation. According to The Wall Street Journal, David Erb, an engineering manager at Facebook, led a team in 2018 focused on identifying unsafe user behavior. The team found that adults were misusing the algorithm behind “People You May Know” to target children on the platform.
Erb and his team proposed that the feature stop recommending minors to adults, but Meta’s leadership rejected the suggestion. The internal dispute unfolded as Meta was weighing a transition to end-to-end encryption for Facebook messages, a move that raised concerns that child exploitation would become harder to detect.
Ultimately, Meta proceeded with message encryption, and Erb was removed from his role. He resigned shortly afterward, in December 2018. Meta has countered Erb’s claims, stating that the company has long invested in child safety efforts.
Despite Meta’s ongoing efforts to remove violative accounts, questions remain about the company’s commitment to addressing child exploitation, especially after the announcement of default end-to-end encryption in Messenger. The move toward greater privacy has sparked internal debates about balancing user security with the need to combat harmful activities on the platform.
Meta created an internal tool named Masca (short for “malicious child safety actor”) to detect accounts involved in suspicious activity with minors, and the company reports that it has removed 160,000 accounts related to child exploitation since 2020. The broader implications of Meta’s actions are part of a larger discourse on online privacy, child safety, and the responsibilities of social media platforms.