Robin Li in TIME Interview: AGI May Not Exist, and China's Models Are Not Far Behind
Feng Huang Wang· 2026-01-27 04:39
Phoenix Tech News, January 27 (Beijing time): Baidu CEO Robin Li sat down for an interview with the American magazine TIME to discuss Baidu's journey in AI. Li said he does not even believe that so-called artificial general intelligence (AGI) exists: no single model can "do everything well", not OpenAI's and not Google's. On model development, China is not that far behind.

On a wall in the entrance hall of Baidu's sprawling Beijing headquarters hangs a small wooden plaque inlaid with the golden number "1417". The number comes from a hotel room opposite Peking University; it was there that Li founded the now $50 billion company in 2000.

Q: When you founded Baidu in 2000, did you foresee AI playing such an important role today?

Robin Li: No. When I founded Baidu, I realized the internet was going to be a big deal in China and that search technology was very important to the development of China's internet. But at the time I did not connect AI with search engines. Around 2010, we realized that machine learning, a branch of AI, was starting to play a role in ranking search results. We began investing in AI around then, in order to study how many people would click on a given link. Then in 2012, we realized that deep learning would become very important: it recognized images far more accurately than the previous generation of technology. Baidu's substantial, large-scale investment in AI began around 2012.

Q: You once mentioned that last year, in integrating AI into every sector of society and the economy ...
Big News: A Quick Rundown of Optimus's Core Suppliers
Robot猎场备忘录· 2026-01-27 04:02
Optimus V3 is about to enter the factory, yet despite positives such as order awards and supplier designations, Tesla-chain stocks have hit a downturn.

First, two exaggerated rumors (one new, one old): 1) with (R) and its Tier-2 names all falling sharply, rumors are flying that its products have problems; 2) with (T) falling sharply, bribery rumors have resurfaced.

Despite a raft of positives over the weekend (V3 entering the factory, a new round of factory audits, good news for core names), Monday (January 26) brought a sharp decline, with several core names hitting limit-down; the operations WeChat group was bombarded with information all day (as it is every day, with first-hand news pouring in, only today it was more opinion-sharing).

Since January, Tesla-chain companies have begun their North America trips, and with Tesla awarding orders, the core names are becoming clearer. But lacking an official "blockbuster" catalyst, the action has mostly been in core names with poor persistence; although the overall trend has been weak, the sector has kept trending upward. This month's biggest catalyst is Tesla's Q4 earnings call (January 28 in North America), so Monday's sharp drop caught people off guard ...
[Interview] Liu Tie-Yan: Riding the AI Wave, How Can China Become a Leader?
Sou Hu Cai Jing· 2026-01-27 03:13
Core Viewpoint
- The ongoing AI investment boom may evolve into a bubble if it fails to deliver on productivity promises, with 2026 being a critical year for validating or refuting AI's potential [1][6]

Group 1: AI Investment and Market Dynamics
- Historical trends show that every major tech wave, including the internet and cloud computing, has been accompanied by bubbles, and AI is no exception [1][6]
- Investors and entrepreneurs should approach AI with caution, as the high barriers to entry in foundational technology development make blind investments risky [1][6]
- The AI sector is transitioning from a scale-driven "infrastructure race" to a research-driven "innovation marathon," emphasizing the need for sustainable industrial capabilities rather than just rapid growth [6][7]

Group 2: Regional Development and Innovation
- Different regions should focus on their unique industrial foundations and application scenarios to avoid homogenized competition in AI [2][13]
- The emphasis should be on application innovation driven by real-world needs, rather than merely increasing computational power or building platforms [2][13]

Group 3: Talent and Research Environment
- To lead in the new technological wave, a supportive research environment, high-end talent, and original innovative outcomes are essential [5][9]
- China has made significant contributions to AI, with a leading number of researchers and publications, and is moving from a "follower" to a "runner" in AI technology [8][9]

Group 4: AI's Impact on Industries
- AI is expected to revolutionize various sectors, significantly enhancing efficiency; for instance, AI can improve manufacturing efficiency by 15-20% and logistics procurement efficiency by over 30% [14][15]
- The transformative potential of AI extends to redefining scientific discovery processes, drastically reducing timelines in fields like energy and materials science [15][16]

Group 5: Employment and Workforce Dynamics
- The rise of AI may lead to job displacement but will also create new opportunities and roles, similar to past industrial revolutions [17][18]
- Workers must adapt by updating their skills and knowledge to remain competitive in an evolving job market shaped by AI [18][19]
The Power to Endure Across Cycles: The 2025 China Entrepreneurs Annual List
Sou Hu Cai Jing· 2026-01-26 15:59
Core Insights
- The article highlights the recognition of 3 "Special Contribution Entrepreneurs" and 20 "2025 Entrepreneurs" who exemplify long-termism and innovation across various industries in China, including liquor, manufacturing, energy, agriculture, internet, AI, and new consumption [1][2]

Group 1: Special Contribution Entrepreneurs
- Ji Keliang, former chairman of Kweichow Moutai Group, transformed traditional brewing techniques into scientific data over 60 years, emphasizing quality over speed, which laid the foundation for Moutai's billion-dollar brand value [4][10][12]
- Zhang Ruimin, founder of Haier Group, is known for his continuous self-disruption and innovation, leading Haier from a struggling factory to a global leader in home appliances with over 400 billion yuan in revenue [18][20][21]
- Jiang Baoquan, founder of Nanjing Gold Foil Holdings, turned a failing workshop into the world's largest gold foil producer, emphasizing resilience and innovative management theories [25][27][29]

Group 2: 2025 Entrepreneurs
- Ma Huateng, chairman of Tencent, focuses on "technology for good," committing to social responsibility and innovation in digital technology to drive high-quality economic development [31][34][41]
- Wang Ning, founder of Pop Mart, capitalizes on emotional value and consumer psychology, creating a successful business model around collectible toys that resonate with young consumers [43][45][46]
- Wang Xingxing, founder of Yushutech, leads advancements in humanoid robotics, achieving significant market presence and profitability while promoting technological innovation [48][49][51]
- Fang Hongbo, chairman of Midea Group, has successfully transformed Midea into a global technology group through strategic restructuring and a focus on efficiency and innovation [54][56]
- Liu Yonghao, chairman of New Hope Group, maintains a long-term vision in agriculture, achieving growth even during economic downturns by embracing new technologies [67][69][70]
- Liu Qiangdong, founder of JD.com, integrates the concept of "common prosperity" into business practices, ensuring fair profit distribution among stakeholders while enhancing supply chain efficiency [73][75][78]
- Li Dongsheng, founder of TCL, exemplifies global leadership in semiconductor display and photovoltaic sectors, driving innovation and sustainable growth through strategic partnerships [110][111]
X @The Economist
The Economist· 2026-01-26 15:00
In the year since DeepSeek shocked the world with a whizzy new AI model, China's clout in the tech has only grown. Turning a profit, however, is proving difficult: https://t.co/h7P8yLG6hi
Illustration: Simon Bailly https://t.co/Wx9Ejk5T50 ...
X @Bloomberg
Bloomberg· 2026-01-26 11:44
A year ago, the Chinese startup DeepSeek freaked out the stock market with the idea that developing AI was much easier and cheaper than everyone imagined. But that’s turned out to be largely a mirage. https://t.co/1BDW4jNKPr ...
"DeepSeek-V3 Is Built on Our Architecture": The CEO of "Europe's OpenAI" Draws Fire for an Outrageous Claim
36Kr· 2026-01-26 07:44
Core Viewpoint
- The discussion centers on the competitive landscape in AI, particularly the contrasting approaches of Mistral and DeepSeek to sparse mixture-of-experts (MoE) models, with Mistral's CEO acknowledging China's strong position in AI and the significance of open-source models [1][4]

Group 1: Company Perspectives
- Mistral's CEO, Arthur Mensch, claims that open-source models are a strategy for progress rather than competition, highlighting the company's early release of open-source models [1]
- Mensch claims that the recently released DeepSeek-V3 is built on an architecture Mistral proposed, framing AI development as a collaborative yet competitive environment [1][4]
- There is skepticism among the audience regarding Mistral's claims, with some suggesting that Mistral's own recent models may have borrowed heavily from DeepSeek's architecture [4][13]

Group 2: Technical Comparisons
- Both DeepSeek's models and Mistral's Mixtral use sparse MoE systems, aiming to reduce computational cost while enhancing model capability, but they differ fundamentally in approach [9]
- Mixtral emphasizes engineering, showcasing the effectiveness of a robust base model combined with mature MoE technology, while DeepSeek focuses on algorithmic innovation to address issues in traditional MoE systems [9][12]
- DeepSeek introduces a fine-grained expert segmentation approach, allowing more flexible combinations of experts, in contrast with Mixtral's flat distribution of knowledge among a few large experts [11][12]

Group 3: Community Reactions
- The community has reacted critically to Mistral's statements, with some users expressing disbelief and pointing out the similarities between Mistral's and DeepSeek's architectures [2][17]
- There is a sentiment that Mistral, once a pioneer in the open-source AI space, is now perceived as having lost its innovative edge, with DeepSeek gaining more influence in sparse MoE and MLA technologies [14][17]
- The competitive race for foundational models is expected to continue, with DeepSeek reportedly targeting significant releases in the near future [19]
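For readers unfamiliar with sparse MoE, the mechanism both companies build on can be sketched in a few lines: a router scores every expert per token, only the top-k experts actually run, and their outputs are mixed with renormalized gate weights. This is a generic toy illustration under assumed shapes, not Mixtral's or DeepSeek's actual code; "fine-grained segmentation" here is modeled simply as splitting the same total capacity into more, smaller experts activated with a larger k.

```python
import numpy as np

def topk_moe(x, experts, router_w, k=2):
    # Standard sparse-MoE gating (toy version): route each token to its
    # top-k experts and mix their outputs with renormalized gate weights.
    logits = x @ router_w                      # (tokens, n_experts)
    top = np.argsort(logits, axis=1)[:, -k:]   # indices of the k best experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        g = np.exp(logits[t, top[t]])
        g = g / g.sum()                        # softmax over the chosen k only
        for w, e in zip(g, top[t]):
            out[t] += w * experts[e](x[t])     # only k experts run per token
    return out

rng = np.random.default_rng(0)
d, n_coarse, n_fine = 16, 4, 16
x = rng.normal(size=(8, d))

# Coarse vs fine-grained: same idea, but the fine setup splits capacity
# into more, smaller experts so each token picks a more specific mix.
coarse = [lambda v, W=rng.normal(size=(d, d)) / d: v @ W for _ in range(n_coarse)]
fine = [lambda v, W=rng.normal(size=(d, d)) / d: v @ W for _ in range(n_fine)]

y_coarse = topk_moe(x, coarse, rng.normal(size=(d, n_coarse)), k=2)
y_fine = topk_moe(x, fine, rng.normal(size=(d, n_fine)), k=8)
print(y_coarse.shape, y_fine.shape)
```

Either way, each token pays the compute cost of only k experts rather than all of them; the fine-grained variant simply offers many more possible expert combinations at the same active parameter count.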
DeepSeek's Latest Paper Explained: How Does mHC Train Stronger Models with Less Money? (Investment Notes No. 243)
36Kr· 2026-01-26 07:38
Core Insights
- DeepSeek has released a significant paper on Manifold-Constrained Hyper-Connections (mHC), focusing on the fundamental issue of how information flows stably through ultra-deep networks in large models, rather than on model parameters, data volume, or computational power [2]

Group 1: Residual Connections and Their Limitations
- The residual connection, introduced by Kaiming He's team in 2015, is a milestone in AI development: by addressing the vanishing-gradient problem, it made much deeper neural networks trainable [3]
- Before residual connections, neural networks were limited to depths of 20-30 layers, because gradients decayed exponentially with depth and hindered effective feature learning [3][4]
- Residual connections introduced a "shortcut" for signal transmission, raising the depth of trainable networks from tens of layers to hundreds or thousands, and forming the structural foundation of modern deep learning [4]

Group 2: Introduction of Hyper-Connections
- Hyper-Connections emerged as a response to the limitations of residual connections, allowing multiple pathways for information transfer within a model, akin to a relay race with multiple runners [6][7]
- This approach distributes information across multiple parallel channels whose weights are allocated dynamically during training, enhancing the model's ability to handle complex, multi-source information [6][7]

Group 3: Challenges with Hyper-Connections
- Hyper-Connections have a critical flaw: instability caused by excessive freedom in information flow, which can unbalance the model's internal signal propagation [9]
- Training with Hyper-Connections can exhibit high volatility and loss divergence, indicating a lack of stability in information transmission [9]

Group 4: The Solution (mHC)
- mHC, or Manifold-Constrained Hyper-Connections, adds a crucial constraint to Hyper-Connections by requiring the mixing matrix to be doubly stochastic, ensuring that information is redistributed without amplification [11]
- This constraint prevents both signal explosion and signal decay, maintaining a stable flow of information throughout the network [13]
- mHC improves training stability and performance at the cost of only a 6.7% increase in training time, which is negligible compared to the savings in computational resources and debugging time [13][14]

Group 5: Implications for Future AI Development
- mHC strikes a new balance between stability and efficiency, reducing computational costs by approximately 30% and shortening product iteration cycles [14]
- It supports the development of larger models, addressing the stability bottleneck in scaling to hundreds of billions or trillions of parameters [16]
- The mHC framework demonstrates that "constrained freedom" is more valuable than "complete freedom," suggesting a shift in AI architecture design from experience-driven to theory-driven approaches [16]
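The doubly stochastic constraint described in the paper summary can be illustrated with a toy sketch: a matrix whose rows and columns each sum to 1 redistributes signal across parallel streams without amplifying or shrinking the total. The sketch below is an illustration of that general property only, not DeepSeek's implementation; the use of Sinkhorn-Knopp normalization and all shapes and names here are assumptions.

```python
import numpy as np

def sinkhorn(logits, n_iters=50):
    # Project a matrix onto (approximately) the set of doubly stochastic
    # matrices by alternately normalizing rows and columns (Sinkhorn-Knopp).
    m = np.exp(logits)                         # make all entries positive
    for _ in range(n_iters):
        m = m / m.sum(axis=1, keepdims=True)   # rows sum to 1
        m = m / m.sum(axis=0, keepdims=True)   # columns sum to 1
    return m

rng = np.random.default_rng(0)
mix = sinkhorn(rng.normal(size=(4, 4)))        # learned-style mixing matrix

streams = rng.normal(size=(4, 8))              # 4 parallel hidden streams
mixed = mix.T @ streams                        # redistribute across streams

# Rows and columns both sum to ~1, so mixing shuffles signal between
# streams but neither amplifies (explosion) nor shrinks (decay) the total.
print(np.allclose(mix.sum(axis=1), 1, atol=1e-6))
print(np.allclose(mix.sum(axis=0), 1, atol=1e-6))
print(np.isclose(streams.sum(), mixed.sum()))
```

An unconstrained mixing matrix would let the total signal grow or shrink at every layer, compounding over hundreds of layers; the doubly stochastic constraint caps that compounding, which matches the stability argument in the summary above.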
It's 2026, and Big Tech Is Still Throwing Money at AI
Di Yi Cai Jing· 2026-01-26 05:28
The outcome of this battle is almost predictable: over the Spring Festival, the cash-splashing AI apps will see a beautiful, steep download curve, and daily active users will hit record highs. But as time passes, these pulses of traffic, these users who came for the red envelopes, will quickly ebb away.

The old internet logic of "burning cash for users" was essentially "buying time with money": using capital to purchase a shortcut to network effects and habit formation. Why doesn't this playbook fully work on the AI track?

Tencent's Yuanbao has splashed out 1 billion yuan, Baidu followed with 500 million, and the familiar flavor is back. The script for this tactic is almost unchanged. In 2015, WeChat Pay made its name with the Spring Festival Gala "Shake" campaign, dubbed a "Pearl Harbor sneak attack," binding hundreds of millions of users into its ecosystem and completing a lightning transformation of user habits. Today the "ammunition" is still real money, but the target of the charge has shifted from the payment entry points and short-video traffic of the past to artificial intelligence.

The "red envelope artillery" certainly works, and the value of Big Tech's cash-splashing cannot be entirely dismissed. The Spring Festival window in particular may be the only moment when the "family reunion" setting lets technology reach everyone and cut across demographic circles, stuffing AI apps into hundreds of millions of phones and completing a nationwide AI initiation. The cost looks steep, but it may also be the most efficient approach. More importantly, with no "killer app" yet in AI, nobody dares fall behind; using red envelopes to stay visible and stake out the Spring Festival traffic pool is a giant's instinctive reaction. Besides ...