Alibaba's Qwen Open-Sources the Qwen3-Coder-Next Model
Core Insights
- Alibaba's Qwen announced the open-source release of Qwen3-Coder-Next, a high-efficiency mixture-of-experts (MoE) model designed for programming agents and local development, featuring 80 billion total parameters with only 3 billion activated per inference [1]

Group 1
- The Qwen3-Coder-Next model is available in two versions, Qwen3-Coder-Next (Base) and Qwen3-Coder-Next (Instruct), supporting research, evaluation, and commercial applications [1]