With only 0.27B parameters, Google open-sources the smallest Gemma 3 yet: it runs on a phone, and 25 conversations drain less than 1% of the battery
36Kr · 2025-08-15 10:15
Core Insights
- Google has launched Gemma 3 270M, its smallest open-source model to date: 270 million parameters, designed for fine-tuning on specific tasks, with strong instruction-following and text-processing capabilities [2][5].

Model Performance
- In instruction-following benchmarks, Gemma 3 270M outperformed larger models such as Qwen2.5 0.5B Instruct and matched the performance of Llama 3.2 1B [1].
- Fine-tuned for a specific task, the model reaches performance comparable to much larger models, making it well suited to offline and web-based creative applications [3].

Model Architecture
- The model pairs a lightweight yet capable architecture with a large 256k-token vocabulary: of its 270 million parameters, roughly 170 million sit in the embedding layer and 100 million in the Transformer blocks [4].
- It is designed for low power consumption: on the Pixel 9 Pro SoC, 25 conversations consumed only 0.75% of the battery, making it Google's most energy-efficient Gemma model to date [4].

Instruction Following and Deployment
- The model has excellent instruction-following capabilities, and a pre-trained checkpoint is provided that responds to general instructions "out of the box" [4].
- Quantization-aware training (QAT) checkpoints allow the model to run at INT4 precision with minimal performance loss, which is crucial for deployment on resource-constrained devices [4].

Target Use Cases
- Gemma 3 270M is aimed at users with high-volume, well-defined tasks who need cost-effective, rapid iteration and deployment, or who have privacy constraints [5].
- The launch pushes back on the misconception that larger parameter counts always mean better performance, demonstrating how effective small models can be at instruction adherence and fine-tuning [5].
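The ~170M/~100M parameter split above is consistent with the stated 256k-token vocabulary. A quick sanity check, assuming an embedding width of 640 (the width is not stated in the article, so treat it as a hypothetical value for illustration):

```python
vocab_size = 256 * 1024   # 256k-token vocabulary, as stated in the article
hidden_dim = 640          # assumed embedding width (not given in the article)

# Each vocabulary entry gets one embedding vector of hidden_dim floats.
embedding_params = vocab_size * hidden_dim

print(f"{embedding_params / 1e6:.0f}M")  # → 168M, close to the ~170M headline figure
```

This shows why a large vocabulary dominates the parameter budget at this scale: the embedding table alone accounts for well over half of the 270M total.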
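The INT4 deployment point above rests on weight quantization. As a rough illustration only (not Google's actual QAT recipe, which simulates quantization during training rather than applying it afterwards), a minimal symmetric round-trip into the signed 4-bit range [-8, 7] can be sketched as:

```python
def quantize_int4(weights):
    # Symmetric per-tensor quantization: map the largest |w| to 7 so that
    # every weight lands in the signed 4-bit integer range [-8, 7].
    scale = max(abs(w) for w in weights) / 7.0
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights from the 4-bit codes.
    return [v * scale for v in q]

weights = [0.12, -0.07, 0.03, -0.14, 0.005]   # toy weight values
q, scale = quantize_int4(weights)
approx = dequantize(q, scale)

# Rounding error is bounded by half a quantization step (scale / 2).
max_err = max(abs(a - b) for a, b in zip(weights, approx))
```

Each weight now needs only 4 bits instead of 16 or 32, which is what makes on-device deployment of even a 270M-parameter model practical; QAT's contribution is teaching the model during training to tolerate exactly this rounding.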