ClaudeCode
Why Did ClawdBot Make the Mac mini Hot? Ye Tianqi on the Agent Computer | 100 AI Entrepreneurs
晚点LatePost· 2026-02-05 14:35
The following article is from 晚点AI, by the 晚点 team. This may be the computer of the next era. Text by Zhu Yingli; edited by Song Wei. Ye Tianqi finds the popularity of ClawdBot (since renamed Openclaw, and before that Moltbot; "ClawdBot" is used below) baffling. In his view, it is just a glued-together grab-bag of a package, much like the early Android era, when everyone shipped piles of pre-installed apps. Still, he has to concede that the greatest value of this new open-source AI product, which has spread virally in both China and the US, lies in lowering the barrier to interacting with Agents. But similar solutions existed long before; in fact, a few months earlier Ye Tianqi had rejected this very approach. At the time, he felt that building an "Agent Layer" ultimately meant lying on "the big players' train tracks"; the interaction polish and memory engineering ClawdBot does now are short-term technical speculation, and in the long run the big players will do it better, as with Anthropic's continuous optimization of ClaudeCode. Having abandoned the Agent layer, Ye Tianqi believes what is needed most is to give the Agent a Runtime (a runtime environment) and a physical device. So entrepreneur Ye Tianqi built a ClaudeCode-equipped micro ...
AI's Bottleneck Isn't Compute, It's…
36Ke· 2026-01-17 08:18
Core Insights
- The discussion around AI has established a narrative framework where computing power determines limits, models dictate capabilities, and data defines intelligence levels. However, the real challenge lies in organizational adaptation to AI, which is often linear compared to the exponential growth of AI capabilities [1]

Group 1: AI Implementation and Organizational Change
- A seemingly reasonable figure, such as 30% of code being generated by AI, may mask a more conservative reality. If the potential was close to 100%, then 30% indicates organizational restraint rather than efficiency issues [2]
- A practical experiment revealed that when organizational boundaries were removed, nearly all code could be generated by AI, highlighting the importance of organizational willingness to change [2][12]
- Traditional organizational structures, rooted in the industrial era, create high collaboration costs that can hinder AI's potential [3][4]

Group 2: New Collaborative Models
- The shift towards AI-native workflows resembles 3D printing rather than traditional bricklaying, allowing for more integrated and efficient collaboration [4]
- As AI raises the baseline for delivery standards, the value of human input shifts from execution to defining what excellence looks like and taking responsibility for it [5][12]

Group 3: Organizational Transformation Initiatives
- The company transformed management meetings into "AI promotion meetings," focusing on how AI can create value rather than merely reviewing performance metrics [6]
- A training and certification program named "ABC+" was introduced to empower non-technical staff to utilize AI tools, identifying potential future leaders within the organization [7][8]
- A hackathon for non-technical employees resulted in a project that streamlined communication between sales and development, reducing organizational friction and enhancing efficiency [9][10]

Group 4: Leadership and Organizational Structure
- As AI capabilities are integrated into workflows, the minimum deliverable unit within the organization shrinks, leading to a reduced need for coordination and a shift in the role of middle management [10][11]
- AI serves as a consensus tool for driving long-term organizational change, making it a compelling reason for CEOs to advocate for transformation [11]

Group 5: The Bottleneck of AI Adoption
- The true bottleneck for AI is not technological but rather the readiness of people and organizations to embrace change and redesign themselves [12][13]
[Rally Explained] AI Applications: Tech Giants' New Models Reignite the AI Application Rally; GEO + AI Programming in Focus; AI's Reshaping of Traffic Entry Points Could Open a Hundred-Billion-Yuan Market
Xuan Gu Bao· 2026-01-12 03:19
Group 1: AI Application Stocks Performance
- On January 12, AI application concept stocks experienced a collective surge, with companies like Liou Co., BoRui ChuanBo, and others achieving significant gains, including multiple stocks hitting the 20% limit up [1]
- Notable performers included BlueFocus, Kunlun Wanwei, and others, all rising over 10% [1]

Group 2: AI Marketing and Programming Developments
- Elon Musk announced the open-sourcing of X platform's content recommendation algorithm, interpreted as a move into GEO (Generative AI Optimization) [3]
- DeepSeek plans to release its next-generation V4 model in mid-February, focusing on enhanced programming capabilities, with initial tests showing superior performance compared to mainstream models [3]
- AI programming is becoming a core application area, with leading model companies emphasizing code capabilities, as seen in the advancements of models like Claude and GPT [4][5]

Group 3: Market Insights and Future Projections
- The global GEO market is projected to reach $11.2 billion by 2025 and potentially $100 billion by 2030, indicating a significant growth trajectory [6]
- The rise of GEO as a new paradigm in marketing, evolving from SEO, is expected to transform advertising agencies' business models toward subscription or performance-based payment structures, enhancing profitability [6]
Karpathy's 2025 Ultimate AI Awakening: We Haven't Tapped Even 10% of LLMs' Potential
36Ke· 2025-12-22 00:29
Core Insights
- 2025 is anticipated to be a pivotal year in the history of artificial intelligence, marking a transition from "impressive" in 2023 to "confusion" in 2024, and finally to "awakening" in 2025 [1][3]

Group 1: RLVR Revolution
- The traditional training process for large language models (LLMs) involves three stages: pre-training, supervised fine-tuning, and reinforcement learning from human feedback (RLHF) [4][6]
- RLHF has been criticized for training models to "appear to reason" rather than genuinely reason, leading to issues like "sycophancy," where models produce plausible but incorrect outputs [6][7]
- The emergence of RLVR (Reinforcement Learning from Verifiable Rewards) represents a new phase where models are trained on objective results rather than human feedback, allowing for a more robust learning process [7][12]
- RLVR enables models to explore multiple reasoning paths and self-verify their outputs, leading to the development of reasoning capabilities without explicit instruction [18][19]
- The shift in focus from training to inference time allows models to enhance their intelligence by spending more time on complex problems, akin to a student taking longer to solve difficult questions [21][23]

Group 2: Philosophical Divide
- A philosophical debate is presented regarding whether AI is creating new "animals" or "ghosts," with the latter referring to LLMs that lack continuous consciousness and are instead statistical constructs of human language [24][32]
- Rich Sutton's "Bitter Lesson" suggests that methods leveraging unlimited computational power will ultimately outperform those relying on human knowledge, emphasizing the supremacy of computational approaches [27][28]
- Current AI models are seen as "ghosts" that lack a continuous self and instead reflect human language, leading to an "uncanny valley" effect in interactions [33][35]

Group 3: Vibe Coding
- Vibe Coding represents a shift in programming paradigms where developers focus on intent rather than code details, allowing AI to generate code based on natural language descriptions [40][44]
- The emergence of tools like MenuGen demonstrates the potential of Vibe Coding, where even experienced programmers can create applications without writing traditional code [44][45]
- The competition between AI programming tools, such as Cursor and ClaudeCode, highlights the evolving landscape of AI-assisted development, with each offering different levels of integration and autonomy [45][46]

Group 4: Paradigm Shift
- The introduction of Google's Gemini Nano Banana signifies a major paradigm shift in computing, suggesting that LLMs will redefine user interface experiences beyond traditional text-based interactions [47][49]
- The preference for visual and spatial information over text indicates a need for LLMs to evolve in how they communicate with users, moving toward more engaging formats [49][50]
- The "jagged" intelligence of AI, where it excels in certain areas while failing in others, reflects the uneven distribution of training data and highlights the complexities of AI capabilities [51][52]

Group 5: Future Outlook
- The year 2025 is positioned as an exciting yet unpredictable time for LLMs, with the potential for significant advancements and untapped capabilities still remaining [53][55]
- The belief in rapid development alongside the need for further work suggests a dynamic and evolving landscape in AI research and application [57][58]
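The RLVR idea summarized above (sample several reasoning paths, score each with an objective verifier, keep the verified ones as training signal) can be sketched in a few lines. This is a toy illustration under stated assumptions, not any real system: `toy_model_sample` stands in for an LLM sampling a reasoning path, and the "update" is simple rejection sampling rather than an actual policy-gradient step.

```python
import random

def verify(answer, expected):
    """Objective, verifiable reward: 1.0 if the final answer checks out."""
    return 1.0 if answer == expected else 0.0

def toy_model_sample(problem):
    """Stand-in for sampling one reasoning path from an LLM.
    Most samples reach the right answer; some paths go wrong."""
    a, b = problem
    return a + b if random.random() < 0.7 else a + b + random.choice([-1, 1])

def rlvr_step(problem, expected, n_samples=8):
    """Sample several paths, score each with the verifier, and keep
    only the verified ones as positive training signal. In a real
    system the verified samples would drive a fine-tuning or
    policy-gradient update; here we just collect them."""
    samples = [toy_model_sample(problem) for _ in range(n_samples)]
    rewards = [verify(s, expected) for s in samples]
    verified = [s for s, r in zip(samples, rewards) if r == 1.0]
    return verified, sum(rewards) / n_samples

random.seed(0)
verified, acc = rlvr_step((2, 3), 5)
print(f"verified {len(verified)}/8 samples, mean reward {acc:.2f}")
```

The key contrast with RLHF is visible even in the toy: the reward comes from checking the answer, not from a model of human preference, so there is nothing for the policy to flatter.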
X @𝘁𝗮𝗿𝗲𝘀𝗸𝘆
AI Model Performance
- CodeX outperforms ClaudeCode in specific data retrieval tasks [1]
- CodeX successfully retrieved pure contract data that ClaudeCode failed to obtain [1]

Cost Optimization
- The company is reducing costs by unsubscribing from ClaudeCode (200U) [1]
- The company is subscribing to CodeX (20U), potentially indicating a cost-effective alternative [1]
X @𝘁𝗮𝗿𝗲𝘀𝗸𝘆
My #AI伙伴 (AI partner) set up AWS; the first time it used ClaudeCode, it sent me the URL, I logged in once, and it was unlocked. In theory, all ClaudeCode-related requests originate from AWS, with no connection at all to its local (mainland China) machine. That solves the IP problem. Just grab any card with a US BIN to pay and you're done. I don't quite understand what all the fuss over those expensive, unstable, privacy-leaking relay services is about 🤔 ...
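The setup the post describes (run Claude Code on a cloud instance so all API traffic originates from an AWS IP) boils down to roughly the following. This is a sketch of the idea, not a vetted recipe; the hostname is a placeholder, and the install command is the official npm package for Claude Code.

```shell
# Sketch of the setup described above; <ec2-host> is a placeholder.
ssh ubuntu@<ec2-host>                    # shell into the AWS instance

# On the remote machine: install and launch Claude Code.
npm install -g @anthropic-ai/claude-code
claude   # prints a login URL; open it in a local browser to authorize once

# From here on, ClaudeCode API requests leave from the AWS IP,
# never from the local machine.
```

Since only the login URL travels back to the local browser, the geographic origin the API sees is the EC2 instance, which is the IP-side point the post is making.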
One Chart: How to Use AI to Rebuild an Enterprise's Product Growth Curve
AI前线· 2025-06-19 08:10
Core Insights
- The AICon Beijing event on June 27-28 will focus on cutting-edge AI technology breakthroughs and industry applications, discussing topics such as AI Agent construction, multimodal applications, large model inference optimization, data intelligence practices, and AI product innovation [1]

Group 1
- OpenAI is experiencing significant talent poaching, with reports of substantial signing bonuses, indicating a competitive landscape for AI talent [1]
- The performance of DeepSeek R1 in programming tests has surpassed Opus 4, suggesting advancements in AI model capabilities [1]
- There are concerns regarding the use of AI in governance, highlighted by the leak of Trump's AI plan on GitHub, which has drawn criticism from the public [1]

Group 2
- The departure of executives from Jieyue Xingchen to JD.com reflects ongoing talent movement within the AI sector [1]
- Baidu is aggressively recruiting top AI talent, with job openings increasing by over 60%, indicating a strong demand for skilled professionals [1]
- Alibaba has acknowledged pressure from competitors like DeepSeek, suggesting a highly competitive environment in the AI industry [1]

Group 3
- Employees are reportedly willing to spend $1,000 daily on ClaudeCode, indicating high demand for advanced AI tools despite their cost [1]
Engineering Challenges Across the Full Chain of Inference, Training, and Data: Who Is Building China's Foundational AI Capabilities? | AICon Beijing
AI前线· 2025-06-16 07:37
Core Viewpoint
- The rapid evolution of large models has shifted the focus from the models themselves to systemic issues such as slow inference, unstable training, and data migration challenges, which are critical for the scalable implementation of technology [1]

Group 1: Key Issues in Domestic AI
- Domestic AI faces challenges including computing power adaptation, system fault tolerance, and data compliance, which are essential for its practical application [1]
- The AICon conference will address seven key topics focusing on the infrastructure of domestic AI, including native adaptation of domestic chips for inference and cloud-native evolution of AI data foundations [1]

Group 2: Presentations Overview
- The "Chitu Inference Engine" by Qingcheng Jizhi aims to efficiently deploy FP8 precision models on domestic chips, overcoming reliance on NVIDIA's Hopper architecture [4]
- Huawei's "DeepSeek" architecture session will discuss performance optimization strategies for running large models on domestic computing platforms [5][6]
- JD Retail's presentation will cover the technical challenges and optimization practices for high throughput and low latency in large language models used in retail applications [7]
- Alibaba's session will explore the design and future development of reinforcement learning systems, emphasizing the complexity of algorithms and system requirements [8]
- The "SGLang Inference Engine" session will present an efficient open-source deployment solution that integrates advanced technologies to reduce inference costs [9]
- Ant Group will share insights on stability practices in large model training, focusing on distributed training fault tolerance and performance analysis tools [10]
- Zilliz will discuss the evolution of data infrastructure for AI, including vector data migration tools and cloud-native data platforms [11]
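As a side note on what the FP8 deployment mentioned above entails: the common E4M3 format keeps only 3 mantissa bits and saturates at 448, so weights are coarsely rounded and clamped. A crude pure-Python simulation of that rounding (ignoring subnormals, and not tied to Chitu's or anyone's actual kernels):

```python
import math

def quantize_fp8_e4m3(x):
    """Crudely simulate FP8 E4M3 rounding: 4 significant binary digits
    (1 implicit + 3 mantissa bits), values clamped to +-448, the E4M3
    maximum finite value. Subnormals are ignored for simplicity."""
    if x == 0.0:
        return 0.0
    sign = math.copysign(1.0, x)
    m, e = math.frexp(abs(x))      # abs(x) == m * 2**e, with m in [0.5, 1)
    m = round(m * 16) / 16         # keep 4 significant binary digits
    y = sign * math.ldexp(m, e)
    return max(-448.0, min(448.0, y))

# Relative rounding error stays within ~6%; out-of-range values saturate.
print(quantize_fp8_e4m3(3.14159))   # -> 3.25
print(quantize_fp8_e4m3(1000.0))    # -> 448.0
```

The engineering work in an inference engine is then making matrix multiplies run natively on such narrow values, with per-tensor or per-block scales chosen so activations stay inside that small dynamic range.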