Meta found in breach of EU law over ‘ineffective’ complaints system for flagging illegal content

Core Points
- The European Commission has found that Meta's platforms, Instagram and Facebook, violated EU law by failing to give users straightforward ways to report illegal content, including child sexual abuse material and terrorist content [1][2][3]
- The Commission's preliminary findings indicate that Meta employs "dark patterns" in its reporting mechanisms, making the process confusing and discouraging users from reporting illegal content [2][4]
- Meta denies any breach of the Digital Services Act (DSA) and says it has already made changes to improve its content reporting and appeals processes [12]

User Reporting Mechanisms
- The Commission criticized Meta for lacking a user-friendly 'notice and action' mechanism for reporting illegal content [3]
- Current complaint mechanisms are deemed too complex, discouraging users from reporting and undermining the mechanisms' effectiveness [4][7]
- Simplifying the reporting system could also help combat misinformation, such as fake news related to political events [8]

Researcher Access to Data
- The Commission has preliminarily found that both TikTok and Meta are not providing researchers with adequate access to public data, which is essential for assessing minors' exposure to harmful content [9][10]
- Data access is a transparency obligation under the DSA, allowing public scrutiny of the platforms' impact on health [10]

Compliance and Penalties
- Meta and the other platforms have been given time to comply with the Commission's demands; non-compliance carries potential fines of up to 6% of total worldwide annual turnover [11]
- The Commission emphasizes that platforms must empower users and respect their rights as part of their obligations under the DSA [11][12]