Meta Debuts More Instagram Protections for Teen Users. Here's What's New

Core Viewpoint
- Meta is enhancing safety features for teens on its social media platforms, particularly Instagram, to provide better protection against potential scams and harmful interactions [1][4].

Group 1: New Features for Teen Accounts
- New safety features have been added to direct messages (DMs) in Teen Accounts, providing context about accounts being messaged and helping teens identify potential scammers [2].
- Teens will now see options to view safety tips, block accounts, and see the account's join date displayed prominently at the top of new chats [2].
- A new combined block-and-report function allows users to block and report suspicious accounts in a single step [2].

Group 2: User Engagement and Statistics
- In June, 1 million Teen Accounts reported or blocked accounts, and another 1 million used the Location Notice feature to check whether a messaging account was in a different country [3].
- The new DM features and block-and-report options are currently exclusive to Instagram, with potential plans to extend them to Facebook Messenger in the future [3].

Group 3: Addressing Past Accusations
- Meta has faced accusations regarding the impact of its platforms on minors, including claims from a whistleblower about ads targeted at teenagers based on their emotional states [4].
- In response, Meta has implemented improved safety features for underage users, including the introduction of "Teen Accounts" that limit contact and content visibility [4].

Group 4: Protections for Adult Accounts Related to Children
- Meta will extend similar protections to adult accounts that share content related to children, such as family blogs, to prevent abuse and inappropriate interactions [5].
- Protections include placing these accounts into strict message settings and activating filters for offensive comments [5].
- These changes are set to roll out in the coming months [5].