Meituan Enters the AI Browser Race: Guangnian Zhiwai (光年之外) Releases Tabbit
In the traditional internet era, the browser was the user's "window" onto the online world. Amid today's information overload and fragmented SaaS applications, however, a traditional browser that serves only as a passive display of information can no longer meet users' needs for handling complex tasks. Tabbit, the AI browser released by Meituan's Guangnian Zhiwai (光年之外) team, attempts to break this boundary by unifying web browsing, web-wide search, AI conversation, and complex task execution.

According to the official introduction, Tabbit's core breakthrough is the introduction of an intelligent "Agent" concept. Users simply describe a need in natural language, such as "scrape data across multiple business platforms and generate an analytics report" or "automatically fill in and route forms across several SaaS systems," and Tabbit completes it automatically. Task execution is fully visualized and runs in an independent tab group, so it does not interfere with the user's everyday browsing.

Reporters learned from Meituan that, to lower the barrier to automation, Tabbit also introduces "Skill" and "Script" features. The former lets users save frequently used prompt formats, conventions, and workflows as one-click reusable shortcuts, so that valuable working knowledge is preserved; the latter lets users generate automation scripts from natural language, enabling personalized customization such as intelligent page cleanup and batch resource extraction.

Tabbit has also substantially reworked the interaction experience. Its "omni input box" removes the separation between the address bar, search box, and AI chat box, supporting one-click …
After Leaving Meta, LeCun Isn't Stopping at One Startup: Betting on a Route Different from Large Models, He Joins the Board of a Silicon Valley Startup
36Kr · 2026-01-30 08:11
Core Viewpoint
- Yann LeCun has left Meta and is diversifying his efforts in AI: he has founded his own startup, AMI, and joined Logical Intelligence as founding chair of its technical research committee; the latter is pursuing a technological route different from mainstream large language models (LLMs) [1][3][12]

Company Overview
- Logical Intelligence is an AI company that emerged in January and is developing an Energy-Based Reasoning Model (EBM) [13][25]
- The company aims to build a model that excels at learning, reasoning, and self-correction, distinguishing itself from traditional LLMs [4][25]

Technology and Model Performance
- The EBM developed by Logical Intelligence scores candidate solutions against constraints and optimizes them toward the lowest "energy" state, which represents the most consistent and stable solution [15][18]
- The company's first model, Kona, has shown superior performance to leading LLMs at solving Sudoku puzzles, completing them in under 1 second while competitors took over 100 seconds [27][28][33]
- Kona is designed for complex real-world problems, such as optimizing energy distribution networks and automating precision manufacturing processes, tasks the company emphasizes are not language-dependent [33][34]

Unique Features and Advantages
- An EBM's training data can be of any type, allowing models tailored to individual business needs, in contrast to the one-size-fits-all approach of traditional LLMs [34]
- Kona's architecture is built to extract complete data from sparse datasets, enhancing its training efficiency [34]

Future Considerations
- Kona is currently a closed-source model, but there are plans to consider open-sourcing certain aspects in the future, with a focus on understanding its safety and potential before public release [35][36]
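The article gives no implementation details for Kona, but the core energy-based idea it describes (assign every candidate solution an "energy" equal to how badly it violates the constraints, then search for the lowest-energy state) can be illustrated on a toy constraint problem. The sketch below is purely illustrative, assuming a hypothetical N-queens setting rather than anything from Logical Intelligence; a real EBM would learn the energy function and minimize it with gradient- or sampling-based optimization instead of brute force.

```python
from itertools import permutations

def energy(cols):
    """Energy = number of violated constraints (attacking queen pairs).

    One queen per row, and `cols` is a permutation, so rows and columns
    are conflict-free by construction; only shared diagonals add energy.
    """
    n = len(cols)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if abs(cols[i] - cols[j]) == j - i  # same diagonal
    )

def solve(n=4):
    """Score every candidate and return the lowest-energy one.

    Exhaustive search stands in for the optimization step; the point is
    only the framing: "most consistent solution" = "lowest energy".
    """
    return min(permutations(range(n)), key=energy)

best = solve(4)
print(best, energy(best))  # a zero-energy (fully consistent) placement
```

A zero-energy state here corresponds to a board with no constraint violations, mirroring the article's description of the lowest-energy state as the most consistent, stable solution.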
Global Generative AI Consumer Spending to Surge, Approaching $700 Billion by 2030, but Whether It Can Meet Investor Expectations Remains in Doubt
Counterpoint Research · 2025-12-25 06:14
Core Insights
- The report from Counterpoint Research highlights that consumer spending in the generative AI sector is rapidly reshaping the global technology landscape, with significant growth expected in both AI software and the hardware required to run it [5][6]
- AI hardware is projected to retain the largest share of overall consumer spending, driven by the integration of AI functionality into personal devices [5][6]
- Global generative AI smartphone shipments are expected to grow at a compound annual growth rate (CAGR) of 26% from 2023 to 2030, with corresponding revenue growth at a 16% CAGR [5][7]

AI Software Market
- The consumer AI software market is anticipated to expand significantly, driven primarily by a substantial increase in the user base: monthly active users of AI dialogue platforms are expected to approach 5 billion over the forecast period [5][7]
- Competition among large language model (LLM) providers is intensifying, and notable shifts in market dynamics and shares are expected throughout the forecast period [7]

Hardware and Software Dynamics
- While AI hardware spending is expected to remain strong in the coming years, the success of the next generation of the AI ecosystem will depend largely on growth in software spending [6]
- High-end smartphones will continue to carry revenue through 2030, but shipment growth is increasingly driven by mid-range models, signaling a trend toward broader adoption of AI capabilities [6]

Market Growth Projections
- Global generative AI consumer spending is projected to rise from $225 billion in 2023 to $699 billion by 2030, an overall CAGR of 21% [7]
- AI dialogue platforms are identified as the fastest-growing segment, with personal-assistant AI and content-generation tools also expected to expand significantly [7]

Industry Challenges
- The key question facing the industry is whether the rapid market growth justifies the sector's unprecedented capital investments [8]
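The report's growth figures all rest on the standard CAGR formula, CAGR = (end / start)^(1/years) − 1. A small check of the headline numbers is below; note that the stated 21% is consistent with compounding over a six-year window from the $225B figure (the report's exact base-year convention is not given in the summary, so the six-year window is an assumption on my part).

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# Endpoints from the report: $225B -> $699B by 2030.
# Compounded over six years, the implied rate rounds to the stated 21%.
print(f"{cagr(225, 699, 6):.1%}")  # ~20.8%
```

The same helper reproduces any of the report's other growth claims given their endpoints and compounding window.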
After 15 Years and 1,400 Brain Scans, She Found the "Biological ChatGPT" Hidden in the Human Brain
36Kr · 2025-12-08 03:04
Core Insights
- The research by Ev Fedorenko identifies a specialized language network in the human brain, likened to a "biological version of ChatGPT," which focuses solely on mapping words to meanings and constructing sentences, separate from thought and emotion [1][3][30]
- Fedorenko's findings suggest that language serves as an interface for thought rather than being synonymous with it, challenging the common perception that finding words is an integral part of thinking [2][34]

Group 1: Language Network Characteristics
- The language network is described as a compact system, roughly the size of a strawberry, that connects external inputs (such as auditory and visual stimuli) to other brain regions responsible for representing meaning [6][11]
- The network is distinct from other language-associated brain areas such as Broca's area, which is more involved in planning speech actions than in processing language itself [17][18]
- Fedorenko's research has established a probability map of the language network's location in the brain, showing consistent patterns across approximately 1,400 scanned individuals [16][11]

Group 2: Functionality and Implications
- The language network operates as a "mapping warehouse," continuously updating the relationships between forms and meanings to enable both expression and comprehension of language [21][22]
- Damage to this network can lead to aphasia, in which complex thoughts remain unexpressed, highlighting the network's critical role in communication [8][7]
- The network responds similarly to meaningful and nonsensical sentences, suggesting its function is relatively superficial and focused on structure rather than deep semantic understanding [28][29]

Group 3: Research Background and Development
- Fedorenko has spent 15 years gathering biological evidence for the language network, culminating in a comprehensive review published in 2024 [5][11]
- Her academic journey began with a focus on linguistics and psychology, leading to her exploration of the brain's language-processing capabilities [10]
- The research emphasizes the specialization of the language network, which operates independently of higher-level cognitive functions, reinforcing the idea that language processing is a distinct, specialized function [36][30]