Sexually Repressed Young People Are Starting to "Make Love" with AI | 2026 Human-AI Romance Report
36Kr · 2026-02-14 13:15
(This article originally appeared on Post-Wave Research Institute (后浪研究所); authors: 娃娃菜, 木也, 曲枚.)

Forming an intimate relationship with AI is no longer anything new. Some people have put on wedding dresses and walked down the aisle with an AI; some have quarreled with an AI lover because it forgot the day they first met; some are single and unmarried in real life yet have several children inside an AI agent, living a happy cyber family life; some see their AI last thing before sleep and first thing on waking, spending 12 hours a day in romance with it.

According to Appfigures, as of July 2025 AI companionship apps had reached 220 million downloads worldwide, with annual revenue heading toward $120 million. Customizable personalities, no complaints, no tempers, emotional companionship anytime and anywhere, and a nearly limitless store of knowledge: AI lovers have gradually made their way into young people's hearts.

Post-Wave Research Institute ran a survey on "human-AI romance" in which 762 people took part. This Valentine's Day, we want to talk about human-AI romance.

Survey respondents: Gender: male 33.1%, female 66.9%. Age groups: post-05 14.6%, post-90 18.7%, post-00 28.5%, pre-90 8.5% …
Sexually Repressed Young People Are Starting to "Make Love" with AI | 2026 Human-AI Romance Report
后浪研究所· 2026-02-14 02:06
Core Insights
- The article discusses the growing trend of young people forming relationships with AI companions, highlighting the emotional support and companionship these provide in a digital age [2][3][92].
- A survey conducted by the "Post-Wave Research Institute" reveals that a significant portion of young people are open to or actively using AI for romantic interaction, with motivations ranging from alleviating loneliness to curiosity about new experiences [3][29].

Group 1: Usage Statistics
- As of July 2025, AI companionship applications had reached 220 million downloads globally, generating approximately $120 million in annual revenue [3].
- Among survey participants, 22% occasionally use AI for romantic simulations, while 6.2% exhibit deep reliance on AI companions, primarily in first-tier cities [7][9].
- Average daily interaction time with AI companions is reported to be 70 minutes, with some users engaging for up to 12 hours a day [21][41].

Group 2: Demographics and Preferences
- The demographic breakdown shows that 66.9% of users are female and 33.1% male, with the majority being post-90s and post-00s [4].
- The most popular platforms for AI companionship are romantic virtual games (37.3%) and AI character interaction platforms (36.8%) [17].
- Gender differences in platform preference are noted: males favor AI character interaction platforms, while females lean toward romantic virtual games [18].

Group 3: Motivations for AI Relationships
- The primary reasons for engaging with AI companions include alleviating loneliness (32.4% of post-05s) and curiosity about new experiences (29.7% of post-05s) [31][34].
- Female users particularly value the unconditional attention and emotional support AI provides, while male users often use AI as an emotional outlet [33][34].
- Motivations vary by age group, with younger users seeking novelty and older users looking for emotional support [35][36].

Group 4: Concerns and Limitations
- Concerns about AI relationships include fear of privacy breaches (61.5% of females) and emotional dependency [61][63].
- Users express skepticism about AI's ability to provide genuine emotional connection, with many believing AI cannot fully replace human relationships [91][92].
- While AI can offer emotional value, it lacks the physical presence and understanding that human relationships provide [91].

Group 5: Future Perspectives
- The article speculates on potential future scenarios for human-AI relationships, including merging consciousness with AI or creating physical embodiments of AI companions [78][79].
- A notable share of respondents (15.7%) supports the idea of "joint demise" with their AI, indicating deep emotional investment in these digital relationships [79].
- The evolving nature of human-AI relationships reflects a broader search for emotional fulfillment in a high-pressure society, suggesting that AI companions serve as a supplementary source of comfort rather than a complete replacement for human interaction [92].
Sexting with an AI Girlfriend: Why the App's Developers Were Sentenced
21 Shi Ji Jing Ji Bao Dao · 2026-01-22 02:26
Core Insights
- The article discusses the legal implications surrounding the AI companionship app AlienChat, which was implicated in an obscenity case over user interactions involving explicit content [1][2]
- The platform's use of an unregistered foreign model and its lack of adequate content moderation led to its classification as a producer of obscene material, resulting in criminal liability [2]

Group 1: Legal and Regulatory Issues
- AlienChat had over 110,000 registered users, a significant portion of whom engaged in explicit conversations, leading to the app being categorized as an obscene product [1]
- The court found the platform's measures against sexual content to be superficial, highlighting a failure of content moderation and user protection [2]
- The company has appealed the first-instance judgment, with a second trial scheduled, indicating ongoing legal challenges [2]

Group 2: Industry Concerns
- The case raises broader questions about the responsibility of AI platforms to moderate user content and the delineation of liability between users and the platform [2]
- Earlier AI companionship products such as Glow and Dream Island have faced similar issues, suggesting an industry-wide trend in compliance with legal standards [2]
- The tension between user demand for natural interaction and the necessity of adhering to regulatory boundaries presents a significant challenge for AI developers [2]
The Full Story of the First "AI Companion Obscenity Case": The AI Sexted with Users, and the Platform Was Convicted?
21 Shi Ji Jing Ji Bao Dao · 2026-01-15 10:12
Core Viewpoint
- The AI companionship application AlienChat (AC) faces legal consequences for producing and profiting from obscene content, marking a significant legal precedent in the classification of AI-generated chat records as obscene materials [1][9][10].

Group 1: Legal Proceedings and Implications
- AC's two main operators were sentenced to prison terms of four years and one and a half years, with fines of 4 million yuan and 200,000 yuan respectively, for "producing obscene materials for profit" [1].
- Classifying AI chat records as obscene materials presents new challenges for traditional legal doctrine, as it requires strict proof of causality between the use of "jailbreak prompts" and the obscene content generated [2][9].
- The court recognized the social harm of the AI-generated content, noting that the app had 116,000 registered users and 3.63 million yuan in membership fees, with a significant portion of paying users engaging in obscene conversations [10][11].

Group 2: Industry Context and Challenges
- AC's rise coincided with a period of regulatory ambiguity in the AI companionship sector, when user demand for risqué content was prevalent; reports indicate at least 80% of users engaged in borderline or explicit conversations [4][18].
- The "Interim Measures for the Management of Generative Artificial Intelligence Services," introduced in August 2023, mandated that large models undergo safety assessment and registration, which AC failed to do by using an unregistered foreign model [5][18].
- The case has raised industry concern about balancing user experience against compliance, as developers strive for more natural and engaging interaction while avoiding legal pitfalls around obscene content [17][18].

Group 3: Technical and Operational Insights
- AC's distinctive appeal stemmed from its lifelike interaction experience, with users noting its nuanced dialogue and character depth compared with competitors [3].
- The platform's initial lack of sensitive-word restrictions contributed to its popularity, but also to its legal troubles, as it failed to implement adequate content moderation [3][10].
- The ruling has prompted discussion of AI platforms' responsibilities as content producers, highlighting the need for stricter compliance measures, including dual filtering of user inputs and model outputs [15][18].
Emotional Companionship AI Faces New Rules
21 Shi Ji Jing Ji Bao Dao · 2025-12-29 02:19
Core Viewpoint
- The article discusses the Cyberspace Administration of China's draft regulations on AI emotional companionship services, highlighting the need for safety measures and user protection in this growing market.

Group 1: Market Overview
- The domestic AI emotional companionship market is maturing, with leading products Xingye (MiniMax) and Maoxiang (ByteDance) reaching 4.88 million and 4.72 million monthly active users respectively, indicating a significant user base [1]
- Xingye and its overseas version Talkie generated approximately 120 million RMB in revenue in the first nine months of the year, with users spending an average of over 70 minutes daily on these products [1]

Group 2: Regulatory Measures
- The draft regulations require clear indications that interactions are not with real humans and mandate reminders for users who have been online for more than 2 hours [2]
- They require systems that detect emotional distress or dependency behaviors, with human intervention in extreme cases such as suicidal ideation [2]
- User interaction data may not be used to train large models unless explicit consent is obtained, marking a shift toward stricter data-privacy practice [2][3]

Group 3: Industry Challenges
- Current leading products do not give users an easy way to consent to or refuse the use of their data for model training, relying instead on a default-consent model [3]
- AI identification in interactions is not sufficiently prominent, leading some users to mistakenly believe they are talking to real people [3]
- Content risk control is currently the most developed area, but effectively preventing self-harm and suicide among users remains challenging [5]

Group 4: Global Context
- Safety incidents involving AI emotional companionship have drawn attention from legislative bodies globally, with the US, the EU, and other jurisdictions advancing targeted regulations [6][7]
- In the US, state-level legislation has been enacted to protect minors and prevent addiction, requiring clear disclosure that users are interacting with AI [7]
Emotional Companionship AI Faces New Rules: Training-Data Requirements Tighten, and Big Platforms Have Plenty of Catching Up to Do
21 Shi Ji Jing Ji Bao Dao · 2025-12-29 00:37
Core Viewpoint
- On December 27, 2025, the Cyberspace Administration of China released a draft regulation on the management of AI emotional companionship services, defining them as products or services that simulate human characteristics and engage in emotional interaction [1]

Group 1: Market Overview
- The domestic AI emotional companionship market is maturing, with leading product Xingye (MiniMax) at 4.88 million monthly active users and Cat Box (ByteDance) close behind at 4.72 million [1]
- Xingye's operating company, MiniMax, reported approximately 120 million RMB in revenue for the first nine months of the year, with users spending an average of over 70 minutes daily on these products [1]

Group 2: Regulatory Framework
- The draft emphasizes clear user notification that interactions are not with real humans and mandates dynamic reminders for users who have been online for more than two hours [2]
- It requires systems that detect emotional distress or dependency behaviors, with protocols for human intervention in extreme cases such as suicidal tendencies [2]
- It imposes strict limits on training data: user interaction data cannot be used for model training without explicit consent [2]

Group 3: Compliance Challenges
- Current leading products lack an easy opt-in/opt-out mechanism for the use of data in model training, relying instead on a default-consent model [3]
- AI identification on interaction pages is not sufficiently prominent, leading some users to mistakenly believe they are interacting with real people [3]

Group 4: Content Safety Measures
- Content risk control is currently the most developed aspect, with models designed to steer users showing suicidal tendencies toward seeking help [5]
- Challenges remain in effectively preventing suicide, as users may circumvent AI safety checks [5]

Group 5: Global Context
- AI emotional companionship has drawn attention from legislative bodies globally, with the U.S., the EU, and other jurisdictions advancing targeted regulations [6]
- Recent incidents involving AI companionship products have led to regulatory action, such as required user notifications in New York and mandatory rest reminders for minors in California [6]
A Bumpy Road to Listing: What "New Story" Can Soul Tell?
21 Shi Ji Jing Ji Bao Dao · 2025-12-03 00:17
Core Viewpoint
- Soul App has submitted its third prospectus to the Hong Kong Stock Exchange, its fourth attempt to go public after earlier rejections and withdrawals in the U.S. market [1][2]

Group 1: Listing Attempts
- Soul's path to IPO has been tumultuous: the company initially aimed for a NASDAQ listing in 2021 but paused the process amid internal considerations and support from major shareholder Tencent [2]
- Legal challenges, including an unfair-competition lawsuit from Uki, also hindered the listing, resulting in a freeze of 26.93 million yuan shortly after the IPO application [2]
- After two failed submissions to the Hong Kong Stock Exchange in 2022 and 2023, Soul is making another attempt more than two years later [1][2]

Group 2: Financial Performance
- Soul reached a profitability turning point in 2023 and recorded an adjusted net profit of 337 million yuan in 2024; revenue for the first eight months of 2025, at 1.683 billion yuan, neared the full-year total for 2022 [3]
- Revenue grew from 498 million yuan in 2020 to 1.2812 billion yuan in 2021 and 1.6674 billion yuan in 2022, with gross margin rising from 79.9% to 86.3% over the same period [3]
- Emotional-value services account for over 90% of total revenue, while advertising and other businesses contribute less than 10% [3][4]

Group 3: User Engagement and Demographics
- As of August 31, 2025, Soul had approximately 390 million registered users and an average of 11 million daily active users, 78.7% of whom are from Generation Z [4]
- Users spend an average of over 50 minutes daily on the platform, with an 86% engagement rate and an 80% monthly retention rate [4]
- Average monthly revenue per paying user is 104.4 yuan, suggesting a stable but potentially stagnant user base in terms of growth [5]

Group 4: Strategic Positioning and AI Integration
- Soul has repositioned itself over the years, evolving from "soul social" to "social metaverse for youth" and now to "AI + immersive social" [5][6]
- The company treats AI as a core competitive advantage, leveraging a self-developed emotional-value model and AI recommendation systems to enhance user experience [5][6]
- However, AI development costs have risen sharply, from 187 million yuan in 2020 to a projected 546 million yuan in 2024, with 407 million yuan already spent in the first eight months of this year [6][7]

Group 5: Regulatory and Compliance Challenges
- Integrating AI into the platform raises compliance and regulatory risks, particularly around content generation, copyright, and user data privacy [7]
- The ongoing debate over the authenticity of AI interactions on the platform highlights the complexity of navigating technological advancement and user expectations [7]
AI Companion Battle Royale: Xingye Delists Large Numbers of AI Agents; Is Cyber Love on the Verge of Collapse?
21 Shi Ji Jing Ji Bao Dao · 2025-11-29 06:59
Core Insights
- The recent shutdown wave among AI companion apps, particularly "Xingye," has caused emotional distress for users who invested time and money in these AI entities, highlighting the blurred line between technology and emotional attachment [2][3]

Industry Overview
- The AI companionship market has grown rapidly, led by platforms such as Character.AI and Xingye. Character.AI was valued at over $1 billion and ranked second globally in monthly active users, while Xingye reached 6.64 million monthly active users in December 2024, making it the top social AI app in China [3][4]
- AI social products typically monetize through a "subscription + value-added services" model, with annual membership fees in the hundreds of yuan. However, high operational costs and competition from larger models like ChatGPT pose significant financial challenges [4]

Compliance Challenges
- The industry faces increasing regulatory scrutiny, including enforcement against platforms serving harmful content to minors. Over 2,700 non-compliant AI agents were removed, and 820,000 pieces of inappropriate content were taken down [5][6]
- In response to regulatory pressure, platforms such as Xingye and Cat Box have implemented stricter content controls, including raising character age limits and strengthening youth-protection measures, which may hurt user experience [6][7]
AI Companion Battle Royale: Xingye Delists Agents; Is Cyber Love on the Verge of Collapse?
21 Shi Ji Jing Ji Bao Dao · 2025-11-29 02:33
Core Viewpoint
- The recent shutdown of AI companion apps, particularly "Xingye," has triggered a collective emotional fallout among users, highlighting the deep attachments formed to these AI entities, which were perceived as more than mere chatbots [2][3].

Group 1: Industry Overview
- The AI companionship market has grown rapidly, with significant user engagement and investment; Character.AI ranks second globally in monthly active users, behind only ChatGPT, and is valued at over $1 billion [3].
- In China, leading players include Xingye, with 6.64 million monthly active users, and Cat Box with 5.37 million, indicating a competitive landscape [3].
- Xingye's overseas version, Talkie, generated annual revenue of $70 million, becoming a key revenue source for its parent company, MiniMax [3].

Group 2: Business Model and Challenges
- AI social products rely mainly on a "subscription + value-added services" monetization strategy, with annual membership fees in the hundreds of yuan, but face cash-flow pressure from high model computation costs [4].
- The industry is under mounting commercial and regulatory pressure, competing both with similar products and with major players like ChatGPT [4].

Group 3: Regulatory Environment
- Compliance issues have intensified: Character.AI faces lawsuits for allegedly serving harmful content to minors, and a nationwide crackdown on AI misuse has led to the removal of over 2,700 non-compliant AI agents [5].
- Platforms such as Xingye and Cat Box have tightened content rules, including raising character ages to 25 and strengthening protections for minors, which may hurt user experience [5].
- The AI companionship market is undergoing a difficult transition from rapid growth to compliance, raising questions about sustainable business models under stringent regulation [6].
When Virtual Lovers Fall Silent: The Compliance Dilemma of the AI Companionship Business
36Kr · 2025-10-23 07:41
Core Insights
- The article discusses tightening content regulation of AI companionship applications in China, which has left users experiencing sudden "cyber breakups" as virtual characters are removed without notice [1][3][6].

Group 1: Regulatory Impact
- Tencent's "Dream Island" app drew scrutiny from the Shanghai Cyberspace Administration over inappropriate content, resulting in the mass removal of AI chatbots [3].
- MiniMax's "Xingye" app underwent similar adjustments, raising concerns that it might exit the market under stringent regulation [3].
- Mounting compliance and commercialization pressure creates a difficult environment for AI companionship apps, likened to an "impossible triangle" of balancing investment returns, regulatory compliance, and user satisfaction [6][14].

Group 2: User and Creator Experiences
- Creators of AI characters, referred to as "cai ma," report a significant decline in visibility and engagement due to the platforms' expanded content review, often applied without prior notification [7][8].
- Users report declining quality of interactions with AI companions, describing responses as overly simplistic and less relevant under increased content restrictions [9][13].
- Many users have migrated to less-regulated platforms or abandoned the genre altogether in search of better experiences [20].

Group 3: Market Dynamics
- User traffic in the AI companionship market is fragmenting, with some smaller, less-regulated applications gaining popularity as users seek more fulfilling interactions [20].
- A significant number of AI companionship applications ceased operations in 2025, indicating a challenging market landscape [16].
- The potential market for AI companionship is projected to grow substantially, with estimates rising from $30 million to between $70 billion and $150 billion by 2030, highlighting vast opportunity despite current challenges [27].