Meta suppressed children's safety research, four whistleblowers claim

Core Points
- Current and former Meta employees have disclosed documents to Congress alleging that the company may have suppressed research on children's safety [1]
- Allegations include changes in Meta's policies regarding sensitive research topics following the leak of internal documents by whistleblower Frances Haugen [2]
- The whistleblowers claim that Meta discouraged discussions and research on the use of its social virtual reality apps by children under 13 [4]

Policy Changes
- Meta proposed two methods for researchers to limit risks in sensitive research: involving lawyers to protect communications and using vague language in findings [3]
- A former researcher reported being instructed to delete recordings of an interview related to a serious safety concern on Meta's VR platform [3]

Legal and Regulatory Context
- A Meta spokesperson stated that information from minors under 13 must be deleted if collected without parental consent, but whistleblowers argue that employees were discouraged from addressing concerns about children's safety [4]
- A lawsuit filed by a former employee raised similar concerns about safety measures for users under 13 on Meta's Horizon Worlds platform [7]

Racial and Safety Issues
- Allegations in the lawsuit include that users with Black avatars faced racial slurs shortly after entering the platform [8]
- The company is also facing criticism over the impact of its AI chatbots on minors, with reports indicating that previous rules allowed inappropriate conversations with children [9]

Company Response
- Meta claims that since the start of 2022, it has approved nearly 180 studies related to social issues, including youth safety and well-being, countering the narrative presented by the whistleblowers [5]