Core Insights
- Cognition has launched SWE-1.5, a new high-speed AI coding model built for software engineering tasks; it is now available in the Windsurf code editor, which Cognition acquired in July [1][2]
- SWE-1.5 runs at up to 950 tokens per second, roughly 13 times faster than Anthropic's Claude Sonnet 4.5, cutting typical task completion times from 20 seconds to 5 seconds [2][4]

Model Performance
- SWE-1.5 is a frontier-scale model with hundreds of billions of parameters, designed to deliver top-tier performance without sacrificing speed [2]
- The model scored 40.08% on the SWE-Bench Pro benchmark, just below Anthropic's Claude Sonnet 4.5 at 43.60% [4]

Technical Infrastructure
- The model was trained on a cluster of thousands of NVIDIA GB200 chips in the NVL72 rack configuration, which can deliver up to 30x the performance of NVIDIA H100 GPUs while cutting cost and energy consumption by as much as 25x [8]
- SWE-1.5 uses the custom Cascade agent framework for end-to-end reinforcement learning, with Cognition emphasizing that high-quality coding environments are critical to downstream model performance [9]

Development Strategy
- SWE-1.5 is part of a broader strategy of tight integration with the Windsurf IDE, aiming to create a unified system that combines speed and intelligence [10]
- Cognition plans to keep iterating on model training, framework optimization, and tool development to further improve speed and accuracy [11]

Market Positioning
- The launch of SWE-1.5 coincides with the release of Cursor's Composer model, signaling a strategic convergence in the AI developer tools market, with both companies betting on proprietary models and low-latency developer experiences [13]
- At 950 tokens per second, SWE-1.5 is nearly four times as fast as Composer's 250 tokens per second, underscoring its competitive edge; the arithmetic is worked through in the sketch below [14]
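The throughput claims above can be cross-checked with back-of-the-envelope arithmetic. The Python sketch below is illustrative only (it is not from Cognition or the cited sources, and the variable names are invented); it simply reproduces the ratios implied by the figures quoted in this digest.

```python
# Back-of-the-envelope check of the speed figures quoted in this digest.
# All numbers come from the bullets above; nothing here is measured.

SWE_15_TPS = 950          # reported peak throughput of SWE-1.5 (tokens/s)
COMPOSER_TPS = 250        # reported throughput of Cursor's Composer (tokens/s)
SONNET_45_SPEEDUP = 13    # SWE-1.5 is described as 13x faster than Claude Sonnet 4.5

# Throughput implied for Claude Sonnet 4.5 if the 13x claim is taken at face value.
implied_sonnet_tps = SWE_15_TPS / SONNET_45_SPEEDUP      # ~73 tokens/s

# Ratio against Composer: 950 / 250 = 3.8, i.e. "nearly four times" as fast.
composer_ratio = SWE_15_TPS / COMPOSER_TPS

# Task-time improvement quoted in Core Insights: 20 s down to 5 s is a 4x reduction.
task_time_speedup = 20 / 5

print(f"Implied Claude Sonnet 4.5 throughput: ~{implied_sonnet_tps:.0f} tokens/s")
print(f"SWE-1.5 vs Composer: {composer_ratio:.1f}x")
print(f"Task-completion speedup: {task_time_speedup:.0f}x")
```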
4x the speed and trouncing Cursor's new model: SWE-1.5, built on thousands of NVIDIA GB200 chips, fulfills Devin's dream, yet real-world tests have reportedly exposed a performance "Waterloo"?