Facebook's content moderation 'happens too late,' says research

Core Insights
- New research from Northeastern University indicates that Facebook's content moderation is often ineffective because it occurs too late: posts have already reached 75% of their predicted audience before removal [1][2][10]

Group 1: Content Moderation Effectiveness
- The study finds that content moderation on Facebook has little effect on what users actually see because removals come so late in a post's life [2][10]
- The researchers propose a new metric, "prevented dissemination," which measures the impact of a removal by predicting how far a post would have spread had it stayed up (see the sketch after these lists) [3][4]
- The analysis covered more than 2.6 million Facebook posts and found that only a small share were removed: 0.7% of English-language posts, 0.2% of Ukrainian, and 0.5% of Russian [8]

Group 2: User Engagement Patterns
- The top 1% of most-engaged content accounted for 58% of user engagement in English, 45% in Ukrainian, and 57% in Russian [6][7]
- Engagement accrues quickly: 83.5% of a post's total engagement occurs within its first 48 hours [7]
- Removing posts prevented only 24% to 30% of their predicted engagement [9]

Group 3: Algorithm and Moderation Mismatch
- The research highlights a mismatch between the speed of Facebook's content moderation and that of its recommendation algorithm, concluding that moderation must operate at a pace comparable to content recommendation to be effective [10][11]
- The majority of removed posts were spam, clickbait, or fraudulent content, indicating where moderation effort is currently concentrated [8]
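The summary does not reproduce the paper's exact formula, but the reported figures suggest a straightforward reading: prevented dissemination is the share of a post's predicted lifetime engagement that a removal cuts off. Below is a minimal Python sketch under that assumption; the Post fields and function name are hypothetical illustrations, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """Engagement state at the moment of removal (hypothetical fields)."""
    engagement_at_removal: float       # engagements accrued before takedown
    predicted_total_engagement: float  # forecast lifetime engagement if left up

def prevented_dissemination(post: Post) -> float:
    """Share of predicted lifetime engagement that removal prevented.

    Assumes the metric is (predicted remaining engagement) /
    (predicted total engagement); the paper's formulation may differ.
    """
    remaining = max(post.predicted_total_engagement - post.engagement_at_removal, 0.0)
    return remaining / post.predicted_total_engagement

# Example consistent with the reported numbers: a post removed after reaching
# ~75% of its predicted audience leaves only ~25% of engagement to prevent,
# in line with the study's 24-30% range [9].
post = Post(engagement_at_removal=750.0, predicted_total_engagement=1000.0)
print(f"prevented dissemination: {prevented_dissemination(post):.0%}")  # -> 25%
```

Under this reading, the 48-hour engagement concentration explains the low numbers: if 83.5% of engagement lands in the first two days, any removal that comes later can prevent only the small tail that remains.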