Online child safety
Roblox Hit With Multimillion-Dollar Suit By LA County Claiming Gaming Platform Leaves “Children Easy Prey For Pedophiles”
Deadline· 2026-02-19 23:02
Core Viewpoint
- Los Angeles County has filed a lawsuit against Roblox, accusing the platform of being a dangerous environment for minors and of facilitating exploitation and abuse through its design and lack of effective safeguards [1][4][7].

Group 1: Lawsuit Details
- The lawsuit claims that Roblox has created a breeding ground for exploitation, including "pornographic content," grooming, and simulated sexual activity, particularly affecting minors [1][4].
- The filing highlights specific incidents, such as a reported 2018 case in which a seven-year-old girl's avatar was raped by other users' avatars, illustrating the severe risks present on the platform [6].
- The lawsuit seeks marketing injunctions and penalties of $2,500 per violation, which could amount to millions of dollars daily [6].

Group 2: Roblox's Response
- Roblox has publicly disputed the claims made in the lawsuit, asserting that safety is a core aspect of its platform and that it continuously enhances its protective measures [8][9].
- The company emphasizes that it has advanced safeguards to monitor harmful content and communications, including restrictions on image sharing via chat [9].
- Roblox has 30 days to respond to the lawsuit after being served, which may involve various legal strategies [10].

Group 3: Industry Context
- Roblox has established lucrative partnerships with major companies such as Warner Bros, Netflix, and TikTok, indicating its significant market presence and appeal [2].
- The LA County lawsuit aligns with similar legal actions from other states, suggesting growing scrutiny of online platforms regarding child safety [3].
Meta used National PTA to promote child safety efforts, report finds
CNBC· 2025-08-26 12:00
Core Perspective
- Meta is facing increasing criticism regarding its responsibility for the safety of children on its platforms, particularly in light of incidents like the suicide of a teenager linked to sextortion through its Messenger app [1][4].

Group 1: Meta's Relationship with Advocacy Groups
- The National Parent Teacher Association (PTA) has been collaborating with Meta since at least 2010, promoting the company's child safety initiatives while not always disclosing their financial ties [3][4].
- A report by the Tech Transparency Project claims that the PTA's relationship with Meta lends credibility to the company's efforts to keep young users engaged, potentially downplaying the risks associated with its platforms [2][3].

Group 2: Public Response and Criticism
- Parents, including Mary Rodee, are increasingly vocal about their concerns, holding Meta accountable for the safety of its users and criticizing organizations that accept funding from the company [1][4].
- Meta has responded to public scrutiny by partnering with organizations to promote its safety tools and protections for teens, asserting that such collaborations are aimed at educating parents [5].
Roblox Denies Louisiana's Allegation That Gaming Platform Lacks Safety Protocols
PYMNTS.com· 2025-08-15 20:01
Core Viewpoint
- Roblox is facing a lawsuit from Louisiana Attorney General Liz Murrill, alleging that the company fails to protect children on its gaming platform, prioritizing user growth and profits over child safety [2].

Legal Allegations
- The lawsuit claims that Roblox does not implement basic safety controls to protect children from predators, leading to harmful content on the platform [2].
- Louisiana seeks to prohibit Roblox from violating the Louisiana Unfair Trade Practices Act and demands restitution, attorney's fees, civil penalties, and additional damages [2].

Company Response
- Roblox stated it does not comment on pending litigation but aims to address what it calls "erroneous claims and misconceptions" [3].
- The company claims to continuously introduce new safety tools, maintain stricter safety policies than other platforms, and actively work to detect and prevent inappropriate content and behavior [3][4].

Safety Measures
- Roblox provides parental controls to help ensure a safe online experience for children and collaborates with law enforcement and child safety organizations [4].
- The company emphasizes its commitment to creating a safe online environment, which it considers critical to its long-term vision and success [5].

Recent Complaints
- Within a short period, seven complaints have been filed against Roblox in various states, including California, Pennsylvania, and Texas, regarding issues related to child predators on the platform [5].
- Roblox reported 111.8 million daily active users in its second-quarter earnings release [6].
Utah governor signs online child safety law requiring Apple, Google to verify user ages
CNBC· 2025-03-26 21:45
Group 1: Legislation Overview
- Utah has enacted a law requiring Apple and Google to verify user ages and obtain parental permission for users under 18 to access certain apps, marking a significant shift in online age verification responsibilities [1][2].
- The App Store Accountability Act, or S.B. 142, is the first of its kind in the U.S. and may inspire similar legislation in other states like South Carolina and California [2][4].

Group 2: Implementation Details
- The law mandates that Apple and Google perform age verification checks when new accounts are created, likely using credit cards, and link accounts of users under 18 to a parent's account [3][4].
- Parents will need to consent to in-app purchases, enhancing parental control over app usage [3].

Group 3: Industry Reactions
- Meta supports the bill, arguing that app stores are better suited for age verification than individual apps, while Apple contends that apps should handle their own age verification due to privacy concerns [5][6].
- Google has expressed concerns that the law raises privacy and safety risks for minors and suggests that it shifts responsibility from companies to app stores without addressing the underlying issues [7][8].

Group 4: Context and Background
- The law follows a history of scrutiny of social media companies regarding child safety, particularly after a congressional hearing where CEOs faced criticism for failing to protect children online [9][10].
- Meta has faced multiple lawsuits related to child well-being on its platforms, indicating ongoing legal challenges in this area [10].