Workflow
Context Engineering
From Concept Hype to Implementation Hurdles: The Real Progress in the Year of the Agent
Sou Hu Cai Jing· 2025-10-17 13:03
Core Insights
- The article highlights the growing trend of large tech companies and emerging startups actively developing Agent products, which are increasingly being integrated into industries such as financial services, manufacturing, and education [2][3]
- OpenAI has launched a new toolset called AgentKit to help developers and enterprises build, deploy, and optimize Agents [3]
- The competitive focus in the Agent sector is shifting from model parameters to platform engineering and enterprise implementation capabilities, making the ability to provide comprehensive, scalable infrastructure increasingly crucial [4]

Industry Trends
- The Agent sector is undergoing a transformation in which the emphasis is now on platform capabilities rather than model intelligence alone [4]
- A recent Baidu conference confirmed that while enterprise interest in Agents is rising, practical implementation still faces significant challenges, including technology maturity and scenario fit [5][7]
- Key challenges include the mismatch between model capabilities and task requirements, the high cost of multi-turn calls, complex system integration, and security concerns [7][10]

Company Developments
- Baidu's upgraded Qianfan platform integrates large models, tool components, and Agent development into a unified enterprise toolchain, expanding its role from a cloud service platform to a comprehensive Agent development platform [5][10]
- The Qianfan platform features a flexible Agent orchestration architecture and improved performance, compatibility, and stability to meet diverse enterprise needs [12]
- Baidu has introduced a range of self-developed components and third-party tools to build a rich ecosystem, significantly enhancing Agents' knowledge acquisition and execution capabilities [14]

Future Outlook
- Agents are expected to become more deeply integrated into business processes, driven by continuous model evolution and better understanding of business data [15][16]
- Specialized Agents are anticipated to emerge across industries, requiring platforms to strengthen their tools and interfaces to support the creation of high-value Agents [17]
- The balance among model capabilities, platform ecosystems, market demand, and the policy environment is approaching a point where innovation can scale effectively [17]
Elastic (NYSE:ESTC) Analyst Day Transcript
2025-10-09 19:02
Summary of Elastic (NYSE:ESTC) Analyst Day - October 09, 2025

Company Overview
- **Company**: Elastic (NYSE:ESTC)
- **Event**: Financial Analyst Day
- **Date**: October 09, 2025
- **Key Speakers**: Ash Kulkarni (CEO), Eric Prengel (Global VP of Elastic and Head of Investor Relations)

Core Industry and Company Insights
- **Industry**: Data management and analytics, focusing on unstructured data
- **Company's Role**: Elastic is recognized as the world's most popular data platform for unstructured data, with over 5.5 billion downloads of its software, averaging more than three downloads per second over 15 years [6][7][8]
- **Competitive Advantage**: Elastic's ability to handle unstructured data is its greatest competitive advantage, with over 30 petabytes of new data ingested daily into paid clusters globally [7][9]

Key Points and Arguments
1. **Unstructured Data Growth**: The company emphasizes the increasing importance of unstructured data, particularly in the context of AI and large language models (LLMs) [9][10]
2. **AI Integration**: Elastic's platform is positioned as a natural choice for AI applications because of its capabilities in managing unstructured data, which is crucial for training AI models [11][12]
3. **Product Announcements**: Six new product capabilities were announced, including:
   - **Agent Builder**: A tool for building AI agents directly on top of data [17]
   - **Elastic Inference Service**: A GPU-accelerated service for embedding and retrieval models [17]
   - **Acquisition of Jina AI**: Enhances Elastic's capabilities in multilingual and multimodal models [18]
4. **Customer Use Cases**: Notable customers include:
   - **DocuSign**: Chose Elastic for its intelligent agreement management platform, which needs to search billions of documents [20]
   - **Legora**: An AI-native company that uses Elastic for legal research and drafting [21]
   - **National Health Service (NHS)**: Uses Elastic for patient record management, emphasizing data privacy and relevance [21]
5. **Observability and Security**: Elastic's observability platform is built to handle messy data, with over 90% of Elastic Cloud Observability customers using it for log analytics [28][30]
6. **Market Position**: Elastic is recognized as a leader in its field by analysts, with over 50% of Fortune 500 companies as customers, indicating significant growth potential [37]

Additional Important Insights
- **Context Engineering**: Context engineering is highlighted as vital for AI applications, ensuring that LLMs have the right data and context to function effectively [55]
- **Developer Community**: Elastic has a strong developer community, with 17% of professional developers and 19% of AI developers using Elasticsearch, underscoring its popularity and trust [56][57]
- **Performance Improvements**: Recent enhancements include a new data lake architecture that maintains high performance while providing scalability and efficiency [47]

Conclusion
- **Future Outlook**: Elastic is well positioned to capitalize on the growing demand for unstructured data management and AI integration, with a strong product lineup and a diverse customer base [38][39]
Elastic (NYSE:ESTC) Earnings Call Presentation
2025-10-09 18:00
Business Overview and Growth
- Elasticsearch is the world's most popular open-source data platform for unstructured data, evidenced by over 5.5 billion downloads and its ranking as the #1 search engine and VectorDB [10, 11]
- Elastic's total revenue has grown consistently, reaching $1,483 million in FY25, a year-over-year increase of 17% [16]
- The company is targeting a $296 billion total addressable market (TAM) by 2029, driven by Search, Security, Observability, and GenAI [52]

AI and Technology
- Elastic has a strong foundation for AI, with 15 years of development behind its native vector search and AI workloads [22, 23]
- The Elasticsearch platform ingests 30 petabytes of raw data per day and handles 30 billion queries per day on Elastic Cloud [13]
- Elastic Cloud has over 2,200 customers using AI [32]

Customer Adoption and Expansion
- Elastic has over 21,550 total customers, with over 1,550 customers spending more than $100K in ACV [56]
- 50% of Fortune 500 companies are paid customers of Elastic [56]
- 90% of Elastic Cloud observability customers use log analytics, and 35% use it beyond log analytics [38]
- 95% of Elastic Cloud security customers use Elastic as a SIEM, and 20% use it beyond SIEM for use cases such as XDR [45]

Financial Performance and Targets
- Elastic targets medium-term sales-led subscription revenue growth of 20%, comprising 15% base growth and 5% GenAI tailwinds [265, 267]
- The company is targeting a non-GAAP operating margin of over 20% and an adjusted free cash flow margin of over 20% in the medium term [267]
- Elastic's adjusted free cash flow reached $286 million in FY25, representing a 19% margin [271]
Context Engineering & Coding Agents with Cursor
OpenAI· 2025-10-08 17:00
AI Coding Evolution
- Software development is undergoing a rapid evolution from the terminal to graphical interfaces, and now to AI-assisted coding [1][2][3][4]
- Cursor aims to automate the coding workflow with AI, with a focus on models and human-computer interaction [46]
- Cursor's goal is to let engineers focus on solving hard problems, designing systems, and building valuable products [47][49]

Context Engineering & Coding Agents
- Context Engineering is about supplying the model with high-quality, targeted context rather than relying on prompt tricks alone [16][17]
- Semantic Search automatically indexes the codebase and creates embeddings, improving the accuracy and efficiency of code search [19][20]
- Semantic Search shifts the compute-intensive work to an offline indexing phase, yielding faster, cheaper responses at runtime (a toy sketch of this pattern follows below) [22]
- Cursor has found that users get the best results by combining grep-style search with Semantic Search [22]

Cursor's Products & Features
- The Tab feature handles more than 400 million requests per day and uses online reinforcement learning to optimize code suggestions [7]
- Cursor is exploring multiple management interfaces for coding Agents, including parallel runs and model competition [38][39][42][43]
- Cursor is exploring giving Agents access to a computer so they can run code, execute tests, and verify correctness [44]
- Cursor lets users share prompts and context through custom commands and rules, enabling team collaboration [32][33]
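To make the offline-indexing idea concrete, here is a toy sketch of the general pattern described above: pay the embedding cost once when indexing the codebase, then answer queries at runtime with cheap vector comparisons. This is not Cursor's implementation; the `embed` function is a throwaway placeholder for a real embedding model, and the code chunks and query are invented for illustration.

```python
# Toy sketch of "index offline, query cheaply at runtime" semantic code search.
# embed() is a placeholder for a real embedding model, not Cursor's API.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: hashes character trigrams into a fixed-size vector."""
    vec = np.zeros(256)
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# --- Offline indexing: embed every code chunk once and store the vectors. ---
code_chunks = {
    "auth.py:login": "def login(user, password): ...",
    "auth.py:logout": "def logout(session): ...",
    "billing.py:charge": "def charge(card, amount): ...",
}
index = {path: embed(src) for path, src in code_chunks.items()}

# --- Runtime query: one embedding plus cheap dot products, no re-indexing. ---
def semantic_search(query: str, top_k: int = 2) -> list[str]:
    q = embed(query)
    scored = sorted(index.items(), key=lambda kv: -float(q @ kv[1]))
    return [path for path, _ in scored[:top_k]]

print(semantic_search("where do we handle user sign-in?"))
```

In a real system the index would be refreshed incrementally as files change, which is what keeps the runtime path fast and cheap.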
X @Anthropic
Anthropic· 2025-09-30 18:52
AI Agent Development
- The Anthropic Engineering Blog introduces "context engineering" for maximizing AI agent performance, going beyond traditional prompt engineering [1]
- The blog post explains how context engineering works [1]
We Combed Through the Context Engineering Playbooks of the Strongest AI Teams and Distilled These 5 Methods
Founder Park· 2025-09-28 12:58
Core Insights
- The article discusses the emerging field of "context engineering" in AI agent development, emphasizing its importance in managing the vast amounts of context generated during tool calls and long-horizon reasoning [4][8][20]
- It outlines five key strategies for effective context management: Offload, Reduce, Retrieve, Isolate, and Cache, which are essential for improving the performance and efficiency of AI agents [5][20][21]

Group 1: Context Engineering Overview
- Context engineering aims to provide the right information at the right time to AI agents, addressing the challenges posed by extensive context management [5][8]
- The concept was popularized by Karpathy, who framed it as filling a language model's context window with exactly the information it needs to perform well [8][10]

Group 2: Importance of Context Engineering
- Context management is identified as a critical bottleneck in the efficient operation of AI agents, with many developers finding the process more complex than anticipated [8][11]
- A typical task may require around 50 tool calls, leading to significant token consumption and real cost implications if not optimized [11][14]

Group 3: Strategies for Context Management (a minimal sketch of two of these strategies follows this summary)
- **Offload**: Move context to external storage, such as a file system, rather than sending the complete context back to the model, optimizing resource utilization [21][23][26]
- **Reduce**: Summarize or prune context to eliminate irrelevant information, while guarding against information loss [32][35][38]
- **Retrieve**: Pull relevant information from external resources into the model's context; retrieval has become a vital part of context engineering [45][46][48]
- **Isolate**: Separate context across different agents to prevent interference, particularly in multi-agent architectures [55][59][62]
- **Cache**: Store previously computed results for reuse, significantly reducing cost and improving efficiency [67][68][70]

Group 4: The Bitter Lesson
- The article references "The Bitter Lesson," which observes that algorithms leveraging large amounts of data and computation tend to outperform those built on manual feature design, suggesting a shift toward more flexible, less hand-structured approaches in AI development [71][72][74]
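As a concrete illustration of two of the five strategies, the sketch below applies Offload and Cache to a single tool call. It is not code from the article: the function names, the 2,000-character inline threshold, and the directory layout are assumptions made for the example, and a real agent framework would wire this into its own tool-execution loop.

```python
# Minimal sketch of the Offload and Cache strategies for agent tool calls.
# Names, thresholds, and paths are illustrative assumptions, not from the article.
import hashlib
import json
from pathlib import Path

OFFLOAD_DIR = Path("agent_workspace")  # external storage for large tool outputs
CACHE_DIR = Path("tool_cache")         # reuse results of identical tool calls
MAX_INLINE_CHARS = 2_000               # above this, output is offloaded to disk

def cache_key(tool_name: str, args: dict) -> str:
    """Deterministic key for a tool call, so repeated calls hit the cache."""
    payload = json.dumps({"tool": tool_name, "args": args}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def run_tool(tool_name: str, args: dict, tool_fn) -> str:
    """Run a tool call and return the string to place in the agent's context:
    either the full output (small results) or a file pointer plus a short
    preview (large results that were offloaded)."""
    OFFLOAD_DIR.mkdir(exist_ok=True)
    CACHE_DIR.mkdir(exist_ok=True)

    # Cache: serve an identical call from a previously stored result.
    key = cache_key(tool_name, args)
    cache_file = CACHE_DIR / f"{key}.txt"
    if cache_file.exists():
        output = cache_file.read_text()
    else:
        output = tool_fn(**args)          # tool_fn is assumed to return a string
        cache_file.write_text(output)

    # Offload: keep small outputs inline; move large ones to a file and hand
    # the model only a path and a preview it can choose to read later.
    if len(output) <= MAX_INLINE_CHARS:
        return output
    offload_file = OFFLOAD_DIR / f"{tool_name}_{key[:8]}.txt"
    offload_file.write_text(output)
    return (
        f"[output offloaded to {offload_file} ({len(output)} chars)]\n"
        f"--- preview ---\n{output[:500]}"
    )
```

With this wiring, a large result costs the agent tokens proportional to the preview rather than the full output, and repeating the same call is served from the cache instead of re-executing the tool.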
RAG Is a Terrible Concept That Distracts Everyone from the Most Critical Problem in Building Applications
Founder Park· 2025-09-14 04:43
Core Viewpoint
- The article emphasizes the importance of Context Engineering in AI development, criticizing the current framing of RAG (Retrieval-Augmented Generation) as a misleading concept that oversimplifies complex processes [5][6][7]

Group 1: Context Engineering
- Context Engineering is considered crucial for AI startups, as it focuses on effectively managing the information placed in the context window during model generation [4][9]
- Context Rot, where a model's performance deteriorates as the number of tokens grows, highlights the need for better context management [8][12]
- Effective Context Engineering involves two loops: an inner loop that selects relevant content for the current context and an outer loop that learns over time to select information better [7][9]

Group 2: Critique of RAG
- RAG is described as a confusing amalgamation of retrieval, generation, and combination, which leads to misunderstandings in the AI community [5][6]
- The article argues that RAG has been marketed as little more than vector search over embeddings, which the author sees as a shallow interpretation [5][7]
- The author expresses a strong aversion to the term RAG, suggesting that it distracts from more meaningful discussions about AI development [6][7]

Group 3: Future Directions in AI
- Two promising directions for future AI systems are continuous retrieval and staying within the embedding space, both of which could improve performance and efficiency [47][48]
- The possibility of models learning to retrieve information dynamically during generation is highlighted as an exciting research area [41][42]
- The article suggests that retrieval systems may evolve toward a more integrated approach in which models generate and retrieve information simultaneously [41][48]

Group 4: Chroma's Role (a minimal usage sketch follows this summary)
- Chroma is positioned as a leading open-source vector database aimed at making AI applications easier to build by providing robust search infrastructure [70][72]
- The company emphasizes developer experience, aiming for a seamless integration process that lets users deploy and use the database quickly [78][82]
- Chroma's architecture is designed to be modern and efficient, using distributed systems and a serverless model to optimize performance and cost [75][86]
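Because the article leans on Chroma as its reference vector store, a minimal sketch of the library's basic add/query flow may help ground the retrieval discussion. The documents, IDs, and query below are invented for illustration, and the snippet relies on Chroma's default embedding function (which downloads a small local model on first use); it shows the core API, not the article's setup.

```python
# pip install chromadb
import chromadb

# In-memory client; use chromadb.PersistentClient(path="./chroma_db") to persist.
client = chromadb.Client()

# Create (or fetch) a collection; Chroma's default embedding function is applied.
collection = client.get_or_create_collection(name="context_docs")

# Index a few illustrative documents that an agent might later pull into context.
collection.add(
    ids=["doc-1", "doc-2", "doc-3"],
    documents=[
        "Context rot: model quality degrades as the context window fills up.",
        "Offloading tool outputs to files keeps the context window small.",
        "Continuous retrieval interleaves retrieval steps with generation.",
    ],
    metadatas=[{"topic": "context-rot"}, {"topic": "offload"}, {"topic": "retrieval"}],
)

# At generation time, retrieve only the most relevant snippets for the query.
results = collection.query(
    query_texts=["Why does performance drop with very long contexts?"],
    n_results=2,
)
for doc, dist in zip(results["documents"][0], results["distances"][0]):
    print(f"{dist:.3f}  {doc}")
```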
X @Avi Chawla
Avi Chawla· 2025-09-11 19:53
Context Engineering Workflow
- The post walks through building a context engineering workflow step by step [1]
- It highlights the importance of context engineering [1]
X @Avi Chawla
Avi Chawla· 2025-09-11 06:33
That's a wrap! If you found it insightful, reshare it with your network. Find me → @_avichawla

Every day, I share tutorials and insights on DS, ML, LLMs, and RAGs. https://t.co/XSbmekHfM6

Avi Chawla (@_avichawla): Let's build a context engineering workflow, step by step: ...
X @Avi Chawla
Avi Chawla· 2025-09-11 06:30
Workflow Construction
- The post focuses on building a context engineering workflow step by step [1]