Evaluation of Minor Modes in 24 Popular Apps: "Excessive Prohibition" Should Not Be Overlooked
Hu Xiu·2025-08-18 02:36

Core Viewpoint
- The article examines the implementation and effectiveness of minor protection modes in popular apps, highlighting the balance between protecting minors and meeting their needs in the digital space [1][4][41].

Group 1: Implementation of Minor Protection Modes
- Since the trial launch of the youth anti-addiction system in 2019, online protection of minors has gradually been incorporated into platform regulations and national supervision [1].
- In 2024, the "Guidelines for the Construction of Minor Modes in Mobile Internet" were released, introducing age-based recommendation standards and prompting major apps to launch dedicated modes for minors [2][41].
- A total of 24 popular apps across seven categories (social, gaming, short video, etc.) have implemented minor modes, but the effectiveness of these modes is in question [13][41].

Group 2: Challenges and Issues
- Despite the establishment of minor modes, issues such as a lack of age-appropriate content and insufficient resources have been reported [3][41].
- The entry points for minor modes within apps are often deeply nested, making the modes difficult to find and activate [14][41].
- Among the gaming apps, only one allows switching to minor mode from within the game; the others require parental intervention through third-party platforms, which complicates monitoring [17][41].

Group 3: Content Supply and Age Appropriateness
- Of the 24 apps tested, only 58% offer age-based features that let parents tailor content to their child's age [23][41].
- Content in minor modes often lacks diversity and fails to meet the developmental needs of different age groups, with some apps repeating the same content across age ranges [24][34][41].
- In music apps, minor modes tend to restrict genres, focusing on educational content while neglecting minors' musical preferences [39][41].

Group 4: Global Context and Legislative Measures
- Minors' internet usage is becoming a significant topic in global digital governance, with many countries enacting legislation that defines platform obligations and parental responsibilities [5][7].
- Several countries have established age-based restrictions and parental consent requirements for minors using online services, with penalties for non-compliance [8][10][41].
- The EU's GDPR requires platforms to obtain parental consent for processing the data of minors under 16, while Australia has enacted strict measures prohibiting minors under 16 from using most social media platforms [9][10][41].