Online child safety
Meta used National PTA to promote child safety efforts, report finds
CNBC· 2025-08-26 12:00
Core Perspective
- Meta is facing increasing criticism regarding its responsibility for the safety of children on its platforms, particularly in light of incidents like the suicide of a teenager linked to sextortion through its Messenger app [1][4].

Group 1: Meta's Relationship with Advocacy Groups
- The National Parent Teacher Association (PTA) has been collaborating with Meta since at least 2010, promoting the company's child safety initiatives while not always disclosing their financial ties [3][4].
- A report by the Tech Transparency Project claims that the PTA's relationship with Meta lends credibility to the company's efforts to keep young users engaged, potentially downplaying the risks associated with its platforms [2][3].

Group 2: Public Response and Criticism
- Parents, including Mary Rodee, are increasingly vocal about their concerns, holding Meta accountable for the safety of its users and criticizing organizations that accept funding from the company [1][4].
- Meta has responded to public scrutiny by partnering with organizations to promote its safety tools and protections for teens, asserting that such collaborations are aimed at educating parents [5].
Roblox Denies Louisiana's Allegation That Gaming Platform Lacks Safety Protocols
PYMNTS.com· 2025-08-15 20:01
Core Viewpoint
- Roblox is facing a lawsuit from Louisiana Attorney General Liz Murrill, alleging that the company fails to protect children on its gaming platform, prioritizing user growth and profits over child safety [2].

Legal Allegations
- The lawsuit claims that Roblox does not implement basic safety controls to protect children from predators, leading to harmful content on the platform [2].
- Louisiana seeks to prohibit Roblox from violating the Louisiana Unfair Trade Practices Act and demands restitution, attorney's fees, civil penalties, and additional damages [2].

Company Response
- Roblox stated it does not comment on pending litigation but aims to address what it calls "erroneous claims and misconceptions" [3].
- The company claims to continuously introduce new safety tools, maintain stricter safety policies than other platforms, and actively work to detect and prevent inappropriate content and behavior [3][4].

Safety Measures
- Roblox provides parental controls to help ensure a safe online experience for children and collaborates with law enforcement and child safety organizations [4].
- The company emphasizes its commitment to creating a safe online environment, which it considers critical to its long-term vision and success [5].

Recent Complaints
- In a short period, seven complaints have been filed against Roblox in various states, including California, Pennsylvania, and Texas, regarding issues related to child predators on the platform [5].
- Roblox reported 111.8 million daily active users in its second-quarter earnings release [6].
Utah governor signs online child safety law requiring Apple, Google to verify user ages
CNBC· 2025-03-26 21:45
Group 1: Legislation Overview
- Utah has enacted a law requiring Apple and Google to verify user ages and obtain parental permission for users under 18 to access certain apps, marking a significant shift in online age verification responsibilities [1][2].
- The App Store Accountability Act, or S.B. 142, is the first of its kind in the U.S. and may inspire similar legislation in other states such as South Carolina and California [2][4].

Group 2: Implementation Details
- The law mandates that Apple and Google perform age verification checks when new accounts are created, likely using credit cards, and link accounts of users under 18 to a parent's account [3][4].
- Parents will need to consent to in-app purchases, enhancing parental control over app usage [3].

Group 3: Industry Reactions
- Meta supports the bill, arguing that app stores are better suited for age verification than individual apps, while Apple contends that apps should handle their own age verification due to privacy concerns [5][6].
- Google has expressed concerns that the law raises privacy and safety risks for minors, arguing that it shifts responsibility from companies to app stores without addressing the underlying issues [7][8].

Group 4: Context and Background
- The law follows years of scrutiny of social media companies over child safety, particularly after a congressional hearing at which CEOs faced criticism for failing to protect children online [9][10].
- Meta has faced multiple lawsuits related to child well-being on its platforms, indicating ongoing legal challenges in this area [10].