OpenClaw Orange Book: From Beginner to Mastery
From Beginner to Mastery · 2026-03-12 01:30
Investment Rating
- The report does not explicitly provide an investment rating for the industry or company.

Core Insights
- OpenClaw is an open-source, self-hosted AI agent system that transforms AI from a "chat tool" into a "digital employee capable of executing tasks" [13][14].
- The platform connects to over 20 messaging channels and can autonomously manage schedules, handle emails, and perform a variety of other tasks [13].
- OpenClaw has gained popularity rapidly, passing 278,932 GitHub stars within a short period and overtaking React as the fastest-growing open-source project [19][24].

Summary by Sections

Part 1: Introduction
- OpenClaw is described as a personal AI operating system that runs on a user's own server and interacts through various instant-messaging tools [13][16].
- The mascot is a lobster, and the community refers to using OpenClaw as "养虾" ("raising lobsters") [16][26].

Part 2: Technical Architecture
- OpenClaw employs a three-layer architecture of Gateway, Node, and Channel, using WebSocket for communication [30].
- The Gateway manages sessions and routes messages, while Nodes execute tasks locally [30][34].

Part 3: Deployment Solutions
- OpenClaw supports multiple deployment methods, including local installation, Docker, and cloud services, catering to different user needs and levels of technical expertise [64][66].
- Various cloud platforms offer one-click deployment options, with pricing ranging from 9.9 yuan/month to higher tiers depending on the service [81][86].

Part 4: Skills System
- The Skills system allows users to create and integrate various functionalities, with over 13,729 skills available in the ClawHub marketplace [19][28].
- The report highlights the importance of skill quality, noting that over 50% of skills are deemed low quality or malicious [28].

Part 5: Security and Cost
- OpenClaw emphasizes security through its design, focusing on local data control and minimal external exposure [30][32].
- Cost management is crucial, as model usage can lead to significant expenses; the report identifies model fees as the primary ongoing cost [66].

Part 6: Ecosystem and Community
- The community around OpenClaw has grown significantly, with a dedicated social platform called Moltbook where AI agents interact [27].
- The cultural phenomenon of "养虾" has contributed to the project's viral growth and community engagement [26][27].

Part 7: Conclusion
- OpenClaw's rapid ascent in the open-source community is attributed to its unique capabilities, community-driven development, and effective marketing strategies [19][24].
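For the Docker deployment path mentioned in Part 3, a self-hosted setup typically boils down to a short compose file. The fragment below is a generic sketch, not OpenClaw's documented configuration: the image name, port, volume path, and environment variable are all illustrative placeholders.

```yaml
# Hypothetical docker-compose sketch for a self-hosted agent deployment.
# Image name, ports, and paths are illustrative, not from OpenClaw's docs.
services:
  agent:
    image: example/openclaw:latest      # placeholder image name
    restart: unless-stopped
    ports:
      - "8080:8080"                     # assumed gateway WebSocket/API port
    volumes:
      - ./data:/app/data                # keep state on the host (local data control)
    environment:
      - MODEL_API_KEY=${MODEL_API_KEY}  # model credentials stay on your server
```

Keeping state in a host-mounted volume and exposing a single port is one way to realize the report's point about local data control and minimal external exposure.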
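The Gateway/Node split described in Part 2 can be illustrated with a minimal in-memory sketch. This is not OpenClaw's actual API: the class names, message shape, and routing logic below are hypothetical, and the in-process calls stand in for the WebSocket transport the report mentions. The sketch only shows the pattern of a gateway that tracks sessions and routes channel messages to nodes that execute tasks locally.

```python
# Hypothetical sketch of a Gateway -> Node routing pattern.
# None of these names are taken from OpenClaw itself.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Message:
    channel: str      # e.g. "telegram", "email" (illustrative channel names)
    session_id: str   # conversation the message belongs to
    text: str


class Node:
    """Executes tasks locally; here it just records what it handled."""

    def __init__(self, name: str):
        self.name = name
        self.handled: List[Message] = []

    def execute(self, msg: Message) -> str:
        self.handled.append(msg)
        return f"{self.name} handled: {msg.text}"


class Gateway:
    """Manages sessions and routes messages to nodes (an in-memory
    stand-in for the WebSocket layer the report describes)."""

    def __init__(self):
        self.sessions: Dict[str, Node] = {}

    def register(self, session_id: str, node: Node) -> None:
        self.sessions[session_id] = node

    def route(self, msg: Message) -> str:
        node = self.sessions.get(msg.session_id)
        if node is None:
            raise KeyError(f"no node registered for session {msg.session_id}")
        return node.execute(msg)
```

A session is registered once and every subsequent message on it is dispatched to the same node, which matches the report's description of the Gateway as the session/routing layer and the Node as the local execution layer.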
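Part 5 identifies model fees as the primary ongoing cost. A back-of-envelope estimator makes the point concrete; the function below is a generic sketch, and the per-1k-token prices in the example are made-up placeholders, not real rates for any provider.

```python
# Back-of-envelope model-cost estimator for a self-hosted agent.
# Prices used in examples are placeholders, not real provider rates.

def monthly_model_cost(requests_per_day: float,
                       avg_input_tokens: float,
                       avg_output_tokens: float,
                       price_in_per_1k: float,
                       price_out_per_1k: float,
                       days: int = 30) -> float:
    """Estimated monthly spend, in the same currency as the prices."""
    per_request = ((avg_input_tokens / 1000.0) * price_in_per_1k
                   + (avg_output_tokens / 1000.0) * price_out_per_1k)
    return per_request * requests_per_day * days

# e.g. 200 requests/day, 2000 input / 500 output tokens per request,
# at placeholder rates of 0.002 / 0.006 per 1k tokens -> 42.0 per month
```

Because an always-on agent fires many small requests, even modest per-token prices compound over a month, which is why the report treats cost management as crucial.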
GLM-5 Ignites the Market! Zhipu Surges 28%
第一财经 (Yicai) · 2026-02-12 04:15
Core Viewpoint
- The article highlights the successful launch of Zhipu's new model GLM-5, which has received positive market feedback, evidenced by a 28.68% increase in the stock price on its first trading day [5].

Group 1: Model Features and Updates
- GLM-5 has enhanced programming and agent capabilities, with pre-training data increased from 23 trillion to 28.5 trillion [6].
- The model introduces a new "Slime" framework to support larger model scales and complex reinforcement-learning tasks, along with an asynchronous reinforcement-learning algorithm for continuous learning from long-term interactions [6].
- GLM-5 has achieved state-of-the-art (SOTA) performance in coding and agent capabilities, closely matching the user experience of Claude Opus 4.5 in real programming scenarios [6].

Group 2: Applications and Integrations
- Typical applications of GLM-5 in agent engineering include end-to-end application development, general agent assistants, and direct output of office documents [7].
- The model can be integrated into the popular open-source AI agent system OpenClaw, giving users a smart intern for tasks such as web searching and programming [7].
- Zhipu has also launched an AutoGLM version of OpenClaw, enabling seamless integration with Feishu robots [7].

Group 3: Industry Trends
- Multiple model updates are occurring across the industry, focused on inference efficiency, long context, multimodality, and cost reduction [7].
- Other models released around the same time include Step 3.5 Flash, Qwen3-Coder-Next, and MiniMax-M2.5, all emphasizing similar advancements [7].
- DeepSeek's recent updates increased context-length support to 1 million tokens, a significant improvement from the previous 128,000 tokens [8].