Local Large Models
Quantitative Market Watch Series No. 11: Tokens Too Expensive? Let the Lobster Use a Local Large Model
Huachuang Securities· 2026-03-29 14:48
- LM Studio is a cross-platform desktop application for running large language models (LLMs) locally. Built on llama.cpp, it can run models such as Llama, DeepSeek, Qwen, and Mistral fully offline, without relying on cloud APIs, which keeps data private[13][16][46]
- LM Studio acts as the "model engine," loading local models in GGUF/MLX format and executing inference, while OpenClaw serves as the "intelligent agent brain," handling task planning, tool invocation, and multi-agent collaboration[2][46][8]
- OpenClaw and LM Studio connect over an OpenAI-compatible API protocol: LM Studio exposes a local HTTP interface that OpenClaw calls for model invocation, allowing seamless switching between models from lightweight 7B to professional-grade 70B[2][32][46]
- LM Studio supports two model formats: GGUF for general cross-platform use, and MLX, optimized for speed and efficiency on Apple Silicon Macs[23][22][46]
- Apple Silicon Macs use a Unified Memory Architecture (UMA) in which the CPU and GPU share one memory pool, eliminating data-copying overhead and making them well suited to local AI development and model deployment[18][20][46]
- OpenClaw's multi-agent collaboration framework lets users create specialized AI agents with distinct workspaces, memory systems, and skill permissions, enabling efficient parallel execution with context isolation[9][8][46]
- OpenClaw's task execution forms a complete loop: receive a natural-language instruction, standardize it, submit it to an agent, invoke tools, and return the result[9][46][8]
- LM Studio also provides an OpenAI-compatible local API service, integrated model search via Hugging Face, and RAG (retrieval-augmented generation) for offline document interaction[21][22][46]
- The recommended deployment runs OpenClaw's gateway service and LM Studio on the same device to exploit the Mac's hardware advantages, and configures cloud models as primary with local models as fallback for high-availability scenarios[47][46][8]
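The OpenAI-compatible bridge between OpenClaw and LM Studio can be sketched in plain Python. LM Studio's local server speaks the standard `/v1/chat/completions` protocol (port 1234 is its default); the model name below is a placeholder for whatever model is actually loaded locally.

```python
import json
import urllib.request


def build_chat_request(prompt: str,
                       model: str = "qwen2.5-7b-instruct",  # placeholder model name
                       base_url: str = "http://localhost:1234/v1"):
    """Build an OpenAI-style chat-completions request for a local LM Studio server.

    Returns the target URL and the JSON payload separately; sending is left to
    the caller, so the exact same payload also works against a cloud endpoint.
    """
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return url, payload


def send(url: str, payload: dict) -> dict:
    """POST the payload to any OpenAI-compatible endpoint and parse the reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because the payload shape is identical for every OpenAI-compatible backend, an agent front end only needs to swap `base_url` (and the model name) to move between a lightweight 7B and a 70B model, or between local and cloud.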
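The multi-agent loop described above (instruction in, standardized task, agent with its own workspace and permitted tools, result out) can be illustrated with a toy dispatcher. All names here are hypothetical, chosen for illustration; the source does not describe OpenClaw's real API.

```python
from dataclasses import dataclass, field

# Toy "tools" that agents may be granted permission to call.
TOOLS = {
    "echo": lambda arg: arg,
    "upper": lambda arg: arg.upper(),
}


@dataclass
class Agent:
    name: str
    allowed_tools: set                               # skill permissions
    workspace: dict = field(default_factory=dict)    # isolated per-agent memory


    def run(self, tool: str, arg: str) -> str:
        # Enforce skill permissions before invoking anything.
        if tool not in self.allowed_tools:
            raise PermissionError(f"{self.name} may not use {tool}")
        result = TOOLS[tool](arg)
        self.workspace[tool] = result  # context stays inside this agent
        return result


def dispatch(instruction: str, agents: dict) -> str:
    """Standardize an 'agent:tool:argument' instruction and route it."""
    agent_name, tool, arg = instruction.split(":", 2)
    return agents[agent_name].run(tool, arg)
```

Each agent touches only its own `workspace`, which is the context-isolation property the report attributes to OpenClaw: parallel agents cannot trample each other's memory, and a tool outside an agent's permission set is rejected before it runs.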
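The high-availability routing in the last bullet (cloud model as primary, local model as fallback) reduces to a thin wrapper over two completion functions. This is a minimal sketch under that assumption; in a real deployment the two callables would wrap the cloud API and the local LM Studio endpoint.

```python
from typing import Callable


def with_fallback(primary: Callable[[str], str],
                  fallback: Callable[[str], str]) -> Callable[[str], str]:
    """Return a completion function that tries the primary (cloud) model first
    and falls back to the local model if the primary raises any error."""
    def complete(prompt: str) -> str:
        try:
            return primary(prompt)
        except Exception:
            # Cloud unreachable, rate-limited, etc. -> serve from the local model.
            return fallback(prompt)
    return complete
```

The caller sees a single `complete(prompt)` function and never needs to know which backend answered, which is what makes the cloud/local switch transparent to the agent layer above it.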
QQ Music, You've Changed: Now You Can Create an Original Song, 《大东北》, for Free on an AI PC
量子位· 2025-12-16 11:52
Core Viewpoint
- QQ Music has introduced an AI songwriting feature that allows users to create original songs for free, significantly lowering the barrier to music creation [1][2][4].

Group 1: AI Songwriting Feature
- Users input their song idea and select a genre, and a complete song is generated in minutes [3][4].
- The feature is unique to the AI PC platform, using local large models for real-time generation, making it accessible to both amateurs and professionals [5][6].

Group 2: AI PC Capabilities
- The AI PC is transforming creative processes across applications, including video editing, image processing, and report writing, by integrating strong on-device AI capability [7][8].
- The introduction of AI PCs is redefining personal computing and breaking down the barriers between professional and amateur creators [10][11].

Group 3: Technical Innovations
- Intel's Core Ultra AI PC processor integrates a dedicated NPU, marking a shift from traditional CPU-and-GPU architectures to a new heterogeneous computing model [28][30].
- The new architecture enhances performance, reduces power consumption, and handles continuous AI workloads efficiently, improving the user experience [33][40].

Group 4: Future of AI PCs
- The upcoming Panther Lake processor is expected to further elevate AI PC capabilities, underscoring the importance of a robust AI ecosystem for future competition [41][43].
- Intel's innovations aim to meet diverse user needs, positioning the Core Ultra as a critical advancement in productivity and creativity [44].