UK watchdogs press Meta, TikTok, Snap and YouTube to block children
Reuters·2026-03-12 00:04

Core Viewpoint
- UK regulators are urging major social media platforms to strengthen measures that keep children off their services, citing failures to enforce age restrictions and children's exposure to harmful content [1]

Group 1: Regulatory Actions
- Ofcom and the Information Commissioner's Office (ICO) are demanding that platforms including Meta, TikTok, Snap, and YouTube implement stricter age checks and safety measures by April 30 [1]
- The ICO has called for the adoption of modern age-assurance tools to prevent children under 13 from accessing inappropriate services [1]
- Ofcom has the authority to impose fines of up to 10% of a company's global revenue, while the ICO can fine up to 4% of a company's annual turnover [1]

Group 2: Concerns Raised
- Regulators express growing concern over algorithmic feeds that expose children to harmful or addictive content [1]
- Melanie Dawes, Ofcom's chief executive, emphasized the need for companies to prioritize children's safety in their products [1]
- The ICO recently fined Reddit nearly £14.5 million for failing to implement effective age checks and unlawfully processing children's data [1]