Meituan today officially released and open-sourced LongCat-Flash-Chat
Mei Ri Jing Ji Xin Wen· 2025-09-01 02:53
Core Insights
- Meituan has officially released and open-sourced LongCat-Flash-Chat, which uses an innovative Mixture-of-Experts (MoE) architecture with 560 billion total parameters [2]
- The model activates between 18.6 billion and 31.3 billion parameters per token (about 27 billion on average), balancing computational efficiency with performance (see the illustrative sketch below) [2]
- While activating only a small fraction of its parameters, LongCat-Flash-Chat delivers performance comparable to leading mainstream models, and it is particularly strong on agent tasks [2]
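To make the "activates only a fraction of its parameters" idea concrete, below is a minimal sketch of top-k expert routing in a generic MoE layer. This is not the LongCat-Flash-Chat implementation; the expert count, layer sizes, and top-k rule here are illustrative assumptions only, showing how a router can run just a few experts per token so that only a subset of the total parameters is used for any given input.

```python
# Minimal sketch of dynamic expert activation in a Mixture-of-Experts (MoE) layer.
# NOT the LongCat-Flash architecture; dimensions, expert count, and k are
# illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (batch, d_model)
        logits = self.router(x)                 # (batch, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # normalize over the k chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so the parameters
        # actually activated per token are a small fraction of the total.
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * self.experts[e](x[mask])
        return out

if __name__ == "__main__":
    layer = TopKMoE()
    y = layer(torch.randn(4, 64))
    print(y.shape)  # torch.Size([4, 64])
```

In this toy setup each token uses only 2 of the 8 experts, which is the same principle that lets a 560-billion-parameter MoE model activate only tens of billions of parameters per token while keeping the rest idle.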