Google, Meta, OpenAI Face FTC Inquiry On Chatbot Impact On Kids
NDTV Profit·2025-09-11 23:22

Core Viewpoint
- The Federal Trade Commission (FTC) is investigating the impact of AI chatbots on children, requiring major companies including Google, OpenAI, and Meta to provide information on their safety measures and on usage by minors [1][2]

Group 1: FTC Investigation
- The FTC has issued orders to seven AI chatbot developers, including Google, OpenAI, and Meta, to gather information on how they measure and monitor their chatbots' impacts on children and teens [2]
- The inquiry is conducted under the FTC's 6(b) authority, which allows the agency to issue subpoenas for market studies, with findings typically reported after extensive analysis [5]
- The information collected may lead to official probes, particularly regarding OpenAI's compliance with consumer protection laws in connection with its ChatGPT product [6]

Group 2: Safety Concerns
- Chatbot developers face increasing scrutiny over their efforts to ensure user safety and prevent harmful behavior, underscored by a lawsuit against OpenAI related to a student's suicide [3]
- Current US law prohibits technology companies from collecting data on children under 13 without parental consent, and Congress is discussing extending these protections to older teens [4]