A Post-2000s PhD Student Drops Out and Pioneers a "Didi for Computing Power"
Hu Xiu APP · 2025-08-21 10:08
Core Viewpoint
- The article discusses the emergence of a new business model in the AI industry, focusing on the success of a startup, Gongji Technology, which has effectively commercialized the concept of shared computing power, addressing the growing demand for AI inference capabilities [4][6].

Group 1: Market Demand and Trends
- The year 2025 is anticipated to be a pivotal year for AI Agents, leading to a significant increase in demand for AI inference computing power, with IDC predicting that the workload share of inference servers will rise from 51.5% in 2020 to 62.2% by 2026 [4].
- The demand for AI services has led to the emergence of various AI companies, including those focused on video generation and companion robots, which face challenges related to high long-term server rental costs from cloud providers [5][6].

Group 2: Business Model and Innovation
- Gongji Technology offers a flexible computing power service that allows users to pay based on actual usage, addressing the need for stable and elastic computing resources [6][30].
- The company has developed a model that treats the city as a large distributed data center, in which idle computing resources from individual users are matched with AI companies that need processing power (a toy sketch of this matching idea follows the summary) [7][30].
- The startup has achieved a service stability rate of over 99.9%, allowing tasks to transition seamlessly between idle computers [7][23].

Group 3: Company Background and Growth
- Gongji Technology was founded by a team of young entrepreneurs, including a dropout from a PhD program, who identified a significant market need for shared computing resources after experiencing challenges in their own research [8][9].
- The company has successfully transitioned from a consumer-focused model to serving B2B clients, generating over 20 million yuan in revenue in the first half of 2023, primarily from business clients [8][70][71].

Group 4: Challenges and Solutions
- The company faced skepticism from investors due to the historical challenges of monetizing shared computing power, which had not been successfully achieved in the past [16][18].
- Gongji Technology has navigated technical challenges by iterating its product over 80 times to ensure ease of use for individuals sharing their computing power [7][30].

Group 5: Future Outlook
- The company is positioned to participate in national-level infrastructure projects, indicating a growing recognition of the importance of shared computing resources in the AI landscape [22][35].
- The founders believe that the future of computing power will increasingly rely on flexible, on-demand services rather than traditional long-term rentals, which are often inefficient for fluctuating AI workloads [29][30].
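The article does not disclose how Gongji's scheduling actually works. As a rough illustration of the "city as one large distributed data center" idea above, the sketch below (Python; all names, parameters, and logic are hypothetical) matches inference jobs to registered idle machines and retries on another machine when one drops offline, which is the kind of failover a "99.9% stability" figure implies.

```python
import heapq
import random
import time
from dataclasses import dataclass, field


@dataclass
class Node:
    """An individual's idle machine registered with the platform (hypothetical)."""
    node_id: str
    gpu_mem_gb: int
    online: bool = True


@dataclass(order=True)
class Job:
    """An inference task from an AI company, billed only for the time it actually runs."""
    priority: int
    job_id: str = field(compare=False)
    gpu_mem_gb: int = field(compare=False)


class Scheduler:
    """Match queued jobs to idle machines; fail over to another machine if one drops."""

    def __init__(self, nodes):
        self.nodes = {n.node_id: n for n in nodes}
        self.queue = []                      # priority queue of pending jobs

    def submit(self, job):
        heapq.heappush(self.queue, job)

    def _pick_node(self, job):
        fits = [n for n in self.nodes.values()
                if n.online and n.gpu_mem_gb >= job.gpu_mem_gb]
        return random.choice(fits) if fits else None

    def run_once(self):
        if not self.queue:
            return
        job = heapq.heappop(self.queue)
        for _ in range(3):                   # retry on up to 3 different machines
            node = self._pick_node(job)
            if node is None:                 # no capacity right now: requeue the job
                heapq.heappush(self.queue, job)
                return
            started = time.time()
            if self._execute(job, node):     # success: bill only the seconds consumed
                print(f"{job.job_id} ran on {node.node_id}, "
                      f"billed {time.time() - started:.2f}s")
                return
            node.online = False              # machine dropped mid-task: mark it offline, fail over
        heapq.heappush(self.queue, job)      # all attempts failed: put the job back

    def _execute(self, job, node):
        # Placeholder for dispatching a containerized inference task to the machine.
        return random.random() > 0.1         # pretend ~10% of attempts fail


if __name__ == "__main__":
    sched = Scheduler([Node("home-pc-1", 8), Node("home-pc-2", 24)])
    sched.submit(Job(priority=1, job_id="video-gen-42", gpu_mem_gb=8))
    sched.run_once()
```

A production platform would add sandboxed execution, heartbeat detection, and checkpointing so a dropped task can resume rather than restart, but the core loop is the same match-and-retry idea.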
"AI 原生 100" 是虎嗅科技组推出针对 AI 原生创新栏目,这是本系列的第「 14 」篇文章。 共享算力,这件事放在今天,并不新鲜。早在20世纪80年代就有人尝试,甚至有人用它寻找外星文明,但却没人在共享算力上赚到钱。 一直到AI Agent的大爆发,这一模式第一次获得了商业上的成功。首次将共享算力这事儿,在全球范围内能跑通,并实现盈利,竟让一个00后团队做到了。 2025年7月,在世界人工智能大会上,付智十分忙碌,他是共绩科技的CEO。这一次,他在大会上对接了40多条线索、转化了20多条潜在商机。 2025年,被业界视为"AI Agent"爆发的元年,这直接导致了AI推理的算力需求暴增。根据第三方分析机构IDC的预测,推理服务器的工作负载占比从2020年 的51.5%,增加到了2026年的62.2%。 这些推理的需求,投射到终端,就是一系列AI视频生成公司、AI陪伴机器人公司、AI模型公司的爆发。但这些AI公司的痛点在于,每次都需要支付一大笔 长租服务器的费用给到云厂商,这造成的问题是,当需求较少时,他们需要为算力服务,承担闲置算力资源的成本,当需求增多时,他们又要让用户排队。 这对于成本敏感、精细化 ...