Youth Online Safety
Youth healthy internet use public-interest campaign visits two Hunan schools: 700 students attend a safety class on preventing online "remote molestation"
Sou Hu Wang· 2025-12-16 09:48
Core Viewpoint
- The article highlights a public awareness campaign aimed at educating middle school students about online safety and the dangers of virtual sexual harassment, led by prosecutor Li Siyuan in collaboration with Kuaishou and local youth organizations [2][9].

Group 1: Event Overview
- The campaign took place in Pingjiang County, where Li Siyuan engaged nearly 700 first-year middle school students through lectures, interactive Q&A, and sand painting performances [2][9].
- The initiative is part of a broader effort by Kuaishou to promote safe internet usage among youth, responding to the central government's call for enhanced online safety measures [9].

Group 2: Educational Content
- Li Siyuan conducted a survey among students regarding their internet usage and experiences with online traps, emphasizing the importance of recognizing and preventing virtual sexual harassment [3][5].
- She explained the definition, characteristics, and psychological impacts of virtual sexual harassment, using real cases and legal provisions to illustrate the issue [5][6].

Group 3: Prevention Strategies
- Li Siyuan outlined three layers of defense against online harassment: mental vigilance (not trusting strangers), behavioral precautions (not sharing personal information), and technical measures (using parental controls and saving evidence) [6][7].
- She emphasized that boys can also be victims of virtual sexual harassment and provided a clear action plan for victims, including preserving evidence and seeking help from trusted adults [7].

Group 4: Collaborative Efforts
- The campaign underscores the need for a collective approach to protect minors online, involving families, schools, society, government, and judicial departments [7].
- Kuaishou staff demonstrated features of their app designed to enhance online safety for minors, such as enabling minor mode and anti-violence functions [9].
Ministry of State Security discloses: a middle school student, long immersed in an online virtual world and egged on by game teammates, tried to found an "allied coalition" and planned to imitate game content by bombing a school after reaching adulthood
Xin Jing Bao· 2025-12-16 01:32
According to the Ministry of State Security on December 16, as the internet becomes ever more pervasive, the number of underage internet users in China now exceeds 196 million, and the age at which young people first go online is trending markedly younger. While the internet broadens horizons and makes learning easier, its potential negative effects are also increasingly apparent. A very small number of young people become addicted to online games, and cases of corrosion by harmful information occasionally occur. The invisible damage that harmful online information does to young people's worldview and outlook on life calls for our constant vigilance.

-- A "chronic poison" that erodes national identity. Foreign spy and intelligence agencies, together with others harboring ulterior motives, make organized use of online games, anime communities, video platforms and similar channels to spread nihilistic and reactionary rhetoric among young people, deconstructing mainstream values and vilifying national heroes, with the aim of subtly weakening their national identity and cultural confidence. This long-term ideological infiltration directly endangers the country's political and cultural security.

... their outlook becomes distorted: once diligent and studious, they turn world-weary and negative, and may even come into serious conflict with family and society.

-- An "ideological tumor" that threatens social stability. Some extreme views online, dressed up as "violence aesthetics" or a "chivalrous jianghu", advocate solving problems by force and show contempt for the law and public order. Young people steeped in such content over time can easily blur the line between morality and the law; this may not only breed behavior such as school bullying, but in certain circumstances they can be incited to commit extreme acts that endanger public safety. ...
Beware of internet traps! Working together to keep young people's online space clean and clear: these four measures are key
Reporters learned from the Ministry of State Security that, as the internet becomes ever more pervasive, the number of underage internet users in China has exceeded 196 million, and the age at which young people first go online is trending markedly younger.

Coordinated guidance from schools and families. Schools should build internet literacy and national security education into the regular curriculum and, together with families, teach young people to tell true online information from false and to recognize erroneous ideas. They should also work with the relevant departments to patiently guide and thoroughly educate minors involved in cases, helping them see the true nature and harm of extremist ideas and preventing them from sliding further down the wrong path, so that political, legal and social effects are achieved in unison.

Platforms shouldering their responsibility. All major platforms have now launched a "minor mode"; they can further explore optimizing recommendation mechanisms, strengthening content review, improving their ability to identify covert extremist ideas, promptly removing content that spreads erroneous ideas, and reducing or avoiding the excessive aggregation and recommendation of similar harmful information.

Broad publicity and education. The state security organs will, within their remit and together with relevant departments, use typical cases to explain the law and the reasoning behind it, thoroughly exposing the infiltration tactics and enormous harm of extremist online ideologies. The general public, and young people and their parents in particular, are encouraged to learn about national security through authoritative channels such as the Ministry of State Security's official WeChat account, so as to strengthen their ability to recognize and resist such content and jointly build a great wall of national security.

Strict crackdown and handling. The state security organs will, within their remit and together with the relevant units, resolutely crack down on content and conduct that spread extremist ideas or induce unhealthy values, and will step up penalties for spreading harmful information that damages young people's physical and mental health, weaving a tighter safety net ...
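The platform measures described above (a "minor mode", stronger content review, and limits on how much similar harmful content gets aggregated and recommended) can be pictured as a simple filtering pass over feed candidates. The sketch below is purely illustrative and is not any platform's actual system; the field names (`tags`, `primary_topic`) and the pre-flagged tag set are assumptions made for the example.

```python
# Illustrative sketch only: a toy "minor mode" feed filter, not any real platform's system.
# Assumptions: each candidate post is a dict with hypothetical "tags" and "primary_topic"
# fields, and `flagged_tags` stands in for topics a review pipeline has already marked
# as harmful to minors.

from collections import Counter
from typing import Dict, Iterable, List, Set


def filter_feed_for_minor(
    candidates: Iterable[Dict],
    flagged_tags: Set[str],
    max_per_topic: int = 2,
) -> List[Dict]:
    """Drop flagged posts and cap how often one topic repeats, to limit the
    over-aggregation of similar content in a minor's recommendations."""
    topic_counts: Counter = Counter()
    kept: List[Dict] = []
    for post in candidates:
        if set(post.get("tags", [])) & flagged_tags:
            continue  # remove content already flagged by content review
        topic = post.get("primary_topic", "other")
        if topic_counts[topic] >= max_per_topic:
            continue  # dampen clustering of one topic in the feed
        topic_counts[topic] += 1
        kept.append(post)
    return kept


# Example with made-up data: post 2 is flagged, post 4 exceeds the per-topic cap.
feed = [
    {"id": 1, "tags": ["gaming"], "primary_topic": "gaming"},
    {"id": 2, "tags": ["extremist"], "primary_topic": "politics"},
    {"id": 3, "tags": ["gaming"], "primary_topic": "gaming"},
    {"id": 4, "tags": ["gaming"], "primary_topic": "gaming"},
]
print([p["id"] for p in filter_feed_for_minor(feed, flagged_tags={"extremist"})])  # [1, 3]
```

A hard per-topic cap is only a stand-in for the aggregation limits the article describes; a production system would more likely down-weight recommendation scores and rely on classifier output rather than a fixed tag blocklist.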
Google says Australia’s teen social media ban ‘extremely difficult’ to enforce (GOOG:NASDAQ)
Seeking Alpha· 2025-10-13 10:04
Core Viewpoint
- Alphabet-owned Google expressed concerns that Australia's new law prohibiting social media use for individuals under 16 would be "extremely difficult" to enforce and would not effectively enhance online safety for children [6]

Group 1
- Australia is poised to become the first country to implement such a law regarding social media usage for minors [6]
- Google highlighted that the enforcement of this law could lead to challenges in ensuring compliance and monitoring [6]
- The company warned that the law may not achieve its intended goal of making children safer online [6]
Indian media: Does banning social media make teenagers safe?
Huan Qiu Shi Bao· 2025-08-07 22:57
Core Viewpoint
- The recent Australian proposal to ban minors from using YouTube and other social media platforms has sparked intense debate, highlighting the challenges of ensuring online safety for youth in a digital age [1][2].

Group 1: Regulatory Changes
- Australia has revoked the exemption previously granted to YouTube, mandating compliance with new online safety regulations aimed at protecting minors [1].
- The proposed "Social Media Minimum Age Law" will prohibit individuals under 16 from using platforms like YouTube, Facebook, and X [1].

Group 2: Effectiveness of Age Restrictions
- Research indicates that strict age restrictions do not effectively prevent youth from encountering online dangers, as evidenced by Norway's experience where 72% of 11-year-olds continued to use social media despite a minimum age limit of 13 [1].
- The UK's Online Safety Act, intended to limit minors' access to social networks, has led to absurd situations where youth use virtual avatars to bypass facial recognition technology [1].

Group 3: YouTube's Influence and Risks
- YouTube's viewing time surpasses that of traditional media giants like Disney and Netflix, showcasing its appeal but also revealing potential risks associated with its open platform [2].
- A study from Dartmouth College found that while YouTube's algorithm rarely recommends extremist content to users who do not seek it out, such content still exists on the platform [2].

Group 4: Call for Action
- Policymakers are urged to push social media platforms to address inherent risks rather than simply imposing age restrictions, advocating for increased transparency in algorithms and targeted solutions from stakeholders [2].
Meta updates safety features for teens. More than 600,000 accounts linked to predatory behavior
CNBC· 2025-07-23 11:00
Group 1
- Meta introduced new safety features for teen users on Facebook and Instagram, including enhanced direct messaging protections to prevent exploitative content [1]
- Teens will receive more information about their chat partners, such as account creation dates and safety tips, to help identify potential scammers [1]
- The company reported blocking accounts 1 million times and receiving another 1 million reports after issuing a Safety Notice in June [2]

Group 2
- Meta removed nearly 135,000 Instagram accounts earlier this year that were found to be sexualizing children, which included accounts leaving sexualized comments or requesting sexual images [3]
- The takedown also involved 500,000 Instagram and Facebook accounts linked to the original profiles that were involved in the exploitation [3]
- This initiative is part of a broader effort by Meta to protect teens and children on its platforms amid increasing scrutiny from policymakers [2]
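The first group of bullets notes that teens now see more information about chat partners, such as account creation dates, to help spot potential scammers. As a rough illustration of that kind of signal (not Meta's implementation), the sketch below flags a direct-message sender whose account is both newly created and not a mutual follower; the 30-day threshold and the field names are assumptions made for the example.

```python
# Illustrative sketch only, not Meta's implementation: a toy heuristic for when a teen's
# chat UI might surface the sender's account age and a safety tip. The 30-day threshold,
# the "created_at" field, and the mutual-follow signal are assumptions for this example.

from datetime import datetime, timezone

NEW_ACCOUNT_DAYS = 30  # assumed threshold for "recently created"


def should_show_safety_notice(sender: dict, is_mutual_follower: bool, now: datetime) -> bool:
    """Return True if the chat should show the sender's account creation date
    and a safety tip before the teen continues the conversation."""
    account_age_days = (now - sender["created_at"]).days
    return account_age_days < NEW_ACCOUNT_DAYS and not is_mutual_follower


# Example with made-up data: a three-day-old account messaging a teen it does not follow.
now = datetime(2025, 7, 23, tzinfo=timezone.utc)
stranger = {"username": "new_user_123", "created_at": datetime(2025, 7, 20, tzinfo=timezone.utc)}
print(should_show_safety_notice(stranger, is_mutual_follower=False, now=now))  # True
```

A real notice would draw on far more signals than account age, but the example shows how the account-creation date mentioned in the summary can feed a simple "is this sender unfamiliar?" check.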