Meta apologizes after Instagram users are flooded with violent videos

Core Viewpoint
- Meta issued an apology for an error in Instagram's recommendation algorithm that showed users disturbing and violent videos, including graphic depictions of fatal incidents, affecting a wide range of users, including minors [1][4][5].

Group 1: Incident Details
- The algorithm error caused users to receive content from accounts they did not follow, such as "BlackPeopleBeingHurt" and "ShockingTragedies", with some videos receiving millions more views than typical posts from the same accounts [3][5].
- Despite the apology, the company did not disclose the scale of the issue, and reports indicated that disturbing content continued to appear even after Meta said the problem had been resolved [5][12].

Group 2: Content Moderation Policies
- The incident occurred as Meta was adjusting its content moderation policies, particularly around the automated detection of objectionable material [6][9].
- Meta announced a shift in its moderation strategy to focus automated systems on "illegal and high-severity violations" while relying on user reports for less serious ones, a change that may have contributed to the algorithmic error [9][10].
- The company acknowledged that its systems had been overly aggressive in demoting posts and said it was in the process of eliminating most of those demotions [10][11].

Group 3: Company Context
- Meta's content moderation changes are part of a broader strategy to allow freer expression, which some have interpreted as an effort to improve relations with political figures [14][15].
- The company has undergone significant staffing reductions, cutting approximately 21,000 jobs, nearly a quarter of its workforce, including roles on civic integrity and safety teams [15].