Alibaba Releases New-Generation Foundation Model Qwen 3.5
Xin Lang Cai Jing· 2026-02-16 09:53
On February 16, Chinese New Year's Eve, Alibaba open-sourced its new-generation large model Qwen3.5-Plus. The newly released Qwen3.5-Plus has 397 billion total parameters, with only 17 billion activated; despite the smaller active size it outperforms the trillion-parameter Qwen3-Max model. Deployment GPU memory usage drops by 60%, inference efficiency improves substantially, and maximum inference throughput can increase by up to 19x. The Qwen3.5-Plus API is priced as low as 0.8 yuan per million tokens, just 1/18 that of Gemini 3 Pro. The Qwen app and PC client have already integrated the Qwen3.5-Plus model. Developers can download the new model from ModelScope and HuggingFace, or access the API directly through Alibaba Cloud Bailian. (Source: Zhitong Caijing) ...
Qwen3.5-Plus Tops the Charts as the World's Strongest Open-Source Model
Xin Lang Cai Jing· 2026-02-16 09:53
Core Viewpoint
- Alibaba Cloud has launched the new generation open-source model Qwen 3.5-Plus, which is claimed to be the strongest open-source model globally, marking a significant advancement from a pure text model to a native multimodal model [1]

Group 1
- Qwen 3.5 has transitioned from pre-training on pure text tokens to pre-training on a mix of visual and text tokens, enhancing its capabilities [1]
- The model has significantly increased its dataset, incorporating multilingual, STEM, and reasoning data, which allows it to acquire more comprehensive world knowledge and reasoning logic [1]
- Qwen 3.5 achieves top-tier performance with less than 40% of the parameter count of the Qwen 3-Max base model, which has over one trillion parameters, excelling in various benchmark evaluations including reasoning, programming, and agent intelligence [1]
Qwen3.5-Plus API Priced at 0.8 Yuan per Million Tokens
Jin Rong Jie· 2026-02-16 09:48
Core Insights
- Alibaba has released the Qwen3.5-Plus model on New Year's Eve, featuring significant architectural innovations [1]

Group 1: Model Specifications
- The new version has a total of 397 billion parameters and 17 billion activated parameters, surpassing the performance of Qwen3-Max [1]
- Memory usage has decreased by 60%, while inference throughput has increased by up to 19 times [1]

Group 2: Pricing and Accessibility
- The API price for Qwen3.5-Plus is set at 0.8 yuan per million tokens [1]
- The Qwen app and PC version have already integrated this new model [1]
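The reported specifications invite some back-of-the-envelope arithmetic: the activated-parameter fraction implied by the 397B/17B split, the Gemini 3 Pro price implied by the 1/18 ratio, and what a typical workload would cost at the quoted rate. A minimal sketch, taking all figures as the articles state them (claims, not independently verified benchmarks):

```python
# Figures as reported in the coverage above.
TOTAL_PARAMS_B = 397      # total parameters, billions
ACTIVE_PARAMS_B = 17      # parameters activated per forward pass, billions
PRICE_PER_M_TOKENS = 0.8  # yuan per million tokens (Qwen3.5-Plus API)
GEMINI_RATIO = 18         # reported price ratio vs. Gemini 3 Pro

# Fraction of the network that actually runs on each token (MoE sparsity).
active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B

# Gemini 3 Pro price implied by "1/18 of Gemini 3 Pro".
implied_gemini_price = PRICE_PER_M_TOKENS * GEMINI_RATIO

# Cost of a hypothetical 50,000-token workload at the quoted rate.
cost_50k = 50_000 / 1_000_000 * PRICE_PER_M_TOKENS

print(f"active fraction: {active_fraction:.1%}")        # about 4.3%
print(f"implied Gemini 3 Pro price: {implied_gemini_price:.1f} yuan / M tokens")
print(f"50k-token workload: {cost_50k:.2f} yuan")
```

Only about 4% of the weights are exercised per token, which is the mechanism behind the "small beats large" framing in these reports.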
Alibaba Officially Releases New-Generation Large Model Qwen3.5
Mei Ri Jing Ji Xin Wen· 2026-02-16 09:36
Core Viewpoint
- Alibaba's Qwen officially released Qwen3.5, introducing the first model in the Qwen3.5 series, Qwen3.5-397B-A17B, with an open-weight version [1]

Group 1: Model Features
- The model utilizes an innovative hybrid architecture combining Gated Delta Networks (linear attention) and Mixture of Experts (MoE) [1]
- It achieves excellent inference efficiency with a total parameter count of 397 billion, activating only 17 billion parameters during each forward pass [1]
- The design optimizes speed and cost while maintaining performance capabilities [1]
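The MoE half of that hybrid is what lets 397 billion total parameters run as if the model were far smaller: a router sends each token to only a few "expert" sub-networks, so the rest of the weights sit idle for that token. A toy top-k MoE layer sketches the idea; the sizes, routing scheme, and expert shapes here are illustrative only, not Qwen3.5's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

D, N_EXPERTS, TOP_K = 8, 16, 2  # hidden size, expert count, experts used per token

router_w = rng.normal(size=(D, N_EXPERTS))          # routing projection
experts = rng.normal(size=(N_EXPERTS, D, D)) * 0.1  # one weight matrix per expert

def moe_forward(x):
    """Route token x to its top-k experts and mix their outputs."""
    logits = x @ router_w                   # one score per expert
    top = np.argsort(logits)[-TOP_K:]       # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                # softmax over the selected experts
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

x = rng.normal(size=D)
y = moe_forward(x)

# Only TOP_K of N_EXPERTS expert matrices touch this token:
print(f"active expert fraction: {TOP_K / N_EXPERTS:.1%}")  # 12.5%
```

Scaled up, the same routing principle is how a 397B-parameter model can activate only 17B parameters per forward pass.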
Alibaba Open-Sources New-Generation Foundation Model Qwen 3.5, Free to Try via the Qwen App
Cai Jing Wang· 2026-02-16 09:31
On February 16, Chinese New Year's Eve, Alibaba open-sourced its new-generation large model Qwen3.5-Plus, whose performance rivals Gemini 3 Pro, topping the charts as the world's strongest open-source model. Qwen 3.5 features a complete overhaul of the underlying model architecture. The newly released Qwen3.5-Plus has 397 billion total parameters, with only 17 billion activated; despite the smaller active size it outperforms the trillion-parameter Qwen3-Max model. Deployment GPU memory usage drops by 60%, inference efficiency improves substantially, and maximum inference throughput can increase by up to 19x. The Qwen3.5-Plus API is priced as low as 0.8 yuan per million tokens, just 1/18 that of Gemini 3 Pro. ...
Alibaba Officially Releases New-Generation Foundation Model Qwen 3.5
Di Yi Cai Jing· 2026-02-16 09:24
On February 16, Alibaba open-sourced its new-generation large model Qwen3.5-Plus. Reporters learned that Qwen 3.5 introduces innovations in the underlying model architecture. The newly released Qwen3.5-Plus has 397 billion total parameters with only 17 billion activated, outperforming the trillion-parameter Qwen3-Max model; deployment GPU memory usage drops by 60% and maximum inference throughput can increase by up to 19x. The Qwen3.5-Plus API is priced as low as 0.8 yuan per million tokens. The Qwen app and PC client have already integrated the Qwen3.5-Plus model. Developers can download the new model from ModelScope and HuggingFace, or access the API directly through Alibaba Cloud Bailian. This afternoon, Alibaba quietly launched two new models, Qwen3.5-Plus and Qwen3.5-397B-A17B, on the chat.qwen.ai page. Qwen3.5-Plus is positioned as the latest large language model in the Qwen3.5 series, while Qwen3.5-397B-A17B is positioned as the flagship large language model of the Qwen3.5 open-source series. Both models support text and multimodal tasks. WeChat editor | Su Xiao ...
Alibaba Releases Qwen 3.5, Rivaling Gemini 3 in Performance at 1/18 the Token Price
Hua Er Jie Jian Wen· 2026-02-16 09:07
Core Insights
- Alibaba has launched a new generation of open-source model, Qwen3.5-Plus, which is claimed to outperform Gemini 3 Pro, making it the strongest open-source model globally [1]

Model Performance
- Qwen3.5-Plus features a total of 397 billion parameters, with only 17 billion activated, achieving superior performance compared to the trillion-parameter Qwen3-Max model [1]
- The deployment memory usage has been reduced by 60%, and inference efficiency has significantly improved, with maximum inference throughput increased by up to 19 times [1]

Cost Efficiency
- The API pricing for Qwen3.5-Plus is set at 0.8 yuan per million tokens, which is only 1/18th of the cost of Gemini 3 Pro [1]

Technological Advancements
- Qwen3.5 has achieved breakthroughs in native multimodal capabilities through pre-training on mixed text and visual data, demonstrating excellent performance across various benchmarks in reasoning, programming, and agent intelligence [1]
- The model has also excelled in authoritative evaluations of visual understanding, winning several performance accolades [1]

Accessibility
- The Qwen3.5-Plus model is now integrated into the Qwen app and PC platform, with developers able to download it from the ModelScope community and HuggingFace, or access API services directly through Alibaba Cloud Bailian [1]
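For developers weighing the API route mentioned above, a chat request to a hosted model of this kind typically takes an OpenAI-style JSON shape. A minimal sketch, only constructing the payload rather than sending it; the endpoint URL and model identifier below are placeholders, not confirmed values from the coverage, so check the provider's documentation for the real ones:

```python
import json

# Placeholder endpoint - NOT a real URL; substitute the provider's actual one.
BASE_URL = "https://example.invalid/v1/chat/completions"

payload = {
    "model": "qwen3.5-plus",  # hypothetical model identifier for illustration
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the Qwen3.5 release."},
    ],
    "max_tokens": 256,
}

# Serialize to the JSON body an HTTP client would POST to BASE_URL.
body = json.dumps(payload)
print(body[:60])
```

At the quoted 0.8 yuan per million tokens, a request like this would cost a fraction of a fen.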
Doubao App Suspends Video Call Feature
Xin Lang Cai Jing· 2026-02-16 08:57
Sina Tech, February 16 afternoon - Sina Tech's hands-on testing today found that the Doubao app has suspended its video call feature, with the page displaying "High Spring Festival traffic; this feature is temporarily unavailable," likely due to compute capacity pressure. As of press time, ByteDance had not responded. During the China Media Group Spring Festival Gala on New Year's Eve, Doubao will give away more than 100,000 tech gifts powered by the Doubao large model to viewers nationwide, including Unitree robots, usage rights for new energy vehicles, and cash red envelopes. Editor in charge: Wei Yihan ...
Big Move! Alibaba Releases New-Generation Qwen Large Model on New Year's Eve
程序员的那些事· 2026-02-16 06:52
Core Insights
- Alibaba officially released Tongyi Qwen 3.5 on the evening of February 16, featuring two core versions: a lightweight model for 2B applications that can run locally on mobile and edge devices, and a flagship model, 35B-A3B MoE, which activates only about 3 billion parameters during inference, significantly reducing computational costs while matching the performance of top-tier closed-source models [1]
- Qwen 3.5 showcases architectural innovations with enhanced native multimodal capabilities, including improved image recognition, text comprehension, and logical reasoning, allowing smaller parameter models to achieve high performance [1]

Licensing and Accessibility
- The model is released under the Apache 2.0 open-source license, allowing for direct commercial use and secondary development, with global platform availability for immediate deployment [2]