Claude Code
Approaches for Managing Agent Memory
LangChain· 2025-12-18 17:53
Hey, this is Lance. I want to talk a bit about memory patterns for agents, focusing specifically on deep agents. Now, you might think about memory in two different ways: explicit updating of agent memory and implicit updating of agent memory. Let me talk about the first one. For explicit updating of agent memory, you can see a great example right here. This is the Claude Code changelog, from just yesterday. Claude Code recently removed the # (pound) shortcut, which you could previously use for updating memory ...
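The distinction between explicit and implicit memory updates can be illustrated with a small sketch. This is illustrative Python only, not how Claude Code or the deepagents package actually persists memory; the AGENT_MEMORY.md file, the append_memory tool, and the auto_update_memory hook are hypothetical names.

```python
from pathlib import Path

MEMORY_FILE = Path("AGENT_MEMORY.md")  # hypothetical memory file


def append_memory(note: str) -> str:
    """Explicit update: the user or the agent deliberately asks for a note to be saved."""
    with MEMORY_FILE.open("a", encoding="utf-8") as f:
        f.write(f"- {note}\n")
    return f"Saved note to {MEMORY_FILE}"


def auto_update_memory(messages: list[dict], summarize) -> None:
    """Implicit update: the harness distills the conversation into memory on its own,
    without an explicit request."""
    summary = summarize(messages)  # e.g., an LLM call that extracts durable facts
    with MEMORY_FILE.open("a", encoding="utf-8") as f:
        f.write(f"- {summary}\n")


if __name__ == "__main__":
    # Explicit path: triggered by a shortcut or a tool call.
    print(append_memory("User prefers concise answers."))

    # Implicit path: triggered by the harness, e.g. at the end of a session.
    fake_summarizer = lambda msgs: f"Session covered {len(msgs)} message(s) about memory patterns."
    auto_update_memory([{"role": "user", "content": "hi"}], fake_summarizer)
```

The only real difference between the two paths is who initiates the write: a deliberate user or agent action versus the harness acting in the background.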
How Agents Use Context Engineering
LangChain· 2025-11-12 16:36
Context Engineering Principles for AI Agents
- The industry recognizes the increasing task length AI agents can perform, with task length doubling approximately every seven months [2]
- The industry faces challenges related to context rot, where performance degrades with longer context lengths, impacting cost and latency [3][4]
- Context engineering, involving offloading, reducing, and isolating context, is crucial for managing context rot in AI agents [8][9][10]

Context Offloading
- Giving agents access to a file system is beneficial for saving and recalling information during long-running tasks and across different agent invocations [11][15][18] (see the first sketch after this summary)
- Offloading actions from tools to scripts in a file system expands the agent's action space while minimizing the number of tools and instructions [19][22]
- Progressive disclosure of actions, such as with Claude skills, saves tokens by selectively loading skill information only when needed [26][30]

Context Reduction
- Compaction, summarization, and filtering are techniques used to reduce context size and prevent excessively large tool results from being passed to the language model [32][33][39]
- Manus compacts old tool results by saving them to a file and referencing the file in the message history [34]
- The deep agents package applies summarization after a threshold of 170,000 tokens [38] (see the second sketch after this summary)

Context Isolation
- Context isolation, using separate context windows or sub-agents for individual tasks, helps manage context and improve performance [10][39][40] (see the third sketch after this summary)
- Sub-agents can share context with the parent agent, such as access to the same file system [42]

Tool Usage
- Agent harnesses often employ a minimal number of general, atomic tools to save tokens and minimize decision-making complexity [44]
- Claude Code uses around a dozen tools, Manus uses fewer than 20, and the deep agents CLI uses 11 [24][25][44]
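As a rough illustration of the file-system offloading described above, here is a minimal Python sketch of the kind of tools a harness might expose; write_file, read_file, the agent_workspace directory, and the schema layout are hypothetical and are not the actual Claude Code or deepagents tool definitions.

```python
import json
from pathlib import Path

WORKSPACE = Path("agent_workspace")  # hypothetical scratch directory shared across turns
WORKSPACE.mkdir(exist_ok=True)


def write_file(path: str, content: str) -> str:
    """Tool: save intermediate results so they can leave the context window."""
    target = WORKSPACE / path
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(content, encoding="utf-8")
    return f"Wrote {len(content)} characters to {path}"


def read_file(path: str) -> str:
    """Tool: recall previously offloaded information on demand."""
    return (WORKSPACE / path).read_text(encoding="utf-8")


# Minimal tool descriptions the harness would hand to the model (illustrative only).
TOOLS = [
    {"name": "write_file", "description": "Save text to the agent workspace.",
     "parameters": {"path": "string", "content": "string"}},
    {"name": "read_file", "description": "Read text back from the agent workspace.",
     "parameters": {"path": "string"}},
]

if __name__ == "__main__":
    write_file("notes/findings.md", "Large tool output goes here instead of the prompt.")
    print(read_file("notes/findings.md"))
    print(json.dumps(TOOLS, indent=2))
```

Keeping the tools this general (read and write over arbitrary paths) is what lets two tools cover notes, intermediate results, and offloaded scripts, in line with the minimal-tool-count point above.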
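The compaction and threshold-based summarization points can be sketched as follows. The 170,000-token figure is the threshold cited above; estimate_tokens, compact_old_tool_results, and reduce_context are hypothetical helpers, not the Manus or deepagents implementations.

```python
from pathlib import Path

TOKEN_THRESHOLD = 170_000  # threshold cited in the summary above
ARCHIVE = Path("agent_workspace/tool_results")
ARCHIVE.mkdir(parents=True, exist_ok=True)


def estimate_tokens(messages: list[dict]) -> int:
    # Crude heuristic: ~4 characters per token; a real harness would use a tokenizer.
    return sum(len(m.get("content", "")) for m in messages) // 4


def compact_old_tool_results(messages: list[dict], keep_last: int = 5) -> list[dict]:
    """Compaction in the style described above: older tool results are written to
    files and replaced in the message history by a short reference to the file."""
    tool_count = sum(1 for m in messages if m.get("role") == "tool")
    cutoff = tool_count - keep_last
    compacted, seen = [], 0
    for m in messages:
        if m.get("role") == "tool":
            seen += 1
            if seen <= cutoff:
                path = ARCHIVE / f"result_{seen}.txt"
                path.write_text(m["content"], encoding="utf-8")
                compacted.append({"role": "tool", "content": f"[full result saved to {path}]"})
                continue
        compacted.append(m)
    return compacted


def reduce_context(messages: list[dict], summarize) -> list[dict]:
    """Compact first; if the history is still over the threshold, summarize older turns."""
    messages = compact_old_tool_results(messages)
    if estimate_tokens(messages) > TOKEN_THRESHOLD:
        summary = summarize(messages[:-10])  # e.g., an LLM call over the older turns
        messages = [{"role": "system", "content": f"Summary of earlier turns: {summary}"}] + messages[-10:]
    return messages
```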
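Finally, a minimal sketch of context isolation: a sub-agent gets a fresh message history (its own context window) but shares the parent's workspace on disk, so only the distilled result flows back into the parent's context. run_subagent and call_llm are hypothetical stand-ins for whatever the harness actually provides.

```python
from pathlib import Path


def run_subagent(task: str, shared_workspace: Path, call_llm) -> str:
    """Run one task in its own context window while sharing the parent's file system."""
    sub_history = [
        {"role": "system", "content": f"Solve exactly one task. Workspace: {shared_workspace}"},
        {"role": "user", "content": task},
    ]
    result = call_llm(sub_history)  # isolated context, separate from the parent agent
    # Only the distilled result re-enters the parent's context; the full transcript does not.
    (shared_workspace / "subagent_result.md").write_text(result, encoding="utf-8")
    return result


if __name__ == "__main__":
    ws = Path("agent_workspace")
    ws.mkdir(exist_ok=True)
    fake_llm = lambda history: "Top findings: ..."
    print(run_subagent("Research context engineering posts", ws, fake_llm))
```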
Global AI Application Expert Exchange
2025-10-30 15:21
Summary of Key Points from Conference Call

Industry and Company Overview
- The conference discusses advancements in the AI application industry, particularly the Claude Code tool developed by Anthropic, which has significantly improved programming efficiency and boosted the company's valuation, now estimated at $170-180 billion [1][2][3].

Core Insights and Arguments
- **Claude Code Tool**: The tool improves programming efficiency through context engineering, using a virtual-machine-like approach to context management and sandbox technology to optimize the user experience. It also leverages three years of accumulated user data to improve product performance [1][3][4].
- **Cost Efficiency**: AI applications, particularly tools like Claude Code, let teams complete tasks at a fraction of the traditional cost; one example cited was building a company website for $35 in one hour [1][5].
- **AIGC Applications**: The most active area in AI-generated content (AIGC) is text processing, while image-generation growth has slowed. Multimedia generation, driven by models like Google Gemini 2.5, is expanding rapidly, especially in e-commerce and live streaming [1][8][9].
- **AI App Market**: The AI app market is growing quickly but remains in its infancy, with no dominant app yet. The business model is shifting from traditional subscriptions to usage-based billing, with an emphasis on high-quality data over ad revenue [1][10].
- **Context Management**: Scene intelligence addresses the limitations of large models in context management, improving the precision of information services such as advanced meeting-record systems [1][11][12].
- **Industry-Specific AI Apps**: Despite the capabilities of large models like ChatGPT, specialized industry AI apps remain necessary because high-quality prompt writing and context management are complex [1][6].
- **Development Stages of AI Apps**: Most AI apps are currently at the third stage of development, indicating maturity in cloud infrastructure and context management, with some companies exploring more advanced paradigms [1][7].

Additional Important Insights
- **AIGC Forms**: AIGC primarily appears in four forms: pure text, images, video, and audio. Text applications are the most competitive, while demand for image generation has declined [1][8][9].
- **User Data Utilization**: The extensive user data collected allows Claude Code to better understand user intent, further improving product performance [4].
- **Market Trends**: The AI app market lacks leading apps, leaving significant room for new entrants. The shift to usage-based pricing reflects a broader industry trend [1][10].
- **Challenges in Multimedia**: The multimedia segment faces challenges such as copyright issues and model alignment, but it remains one of the fastest-growing areas [1][9].
- **AI in Document Processing**: AI tools significantly improve document-processing efficiency, converting unstructured documents into structured formats with greater speed and accuracy [1][22].
- **Future Outlook**: The next two to three years are expected to see a rise in agent-enabled apps, similar to the mobile-internet boom of the early 2010s, with substantial investment interest [1][26].

This summary captures the key points from the conference call, highlighting advancements and trends in the AI application industry, particularly the impact of the Claude Code tool and the evolving market dynamics.