Meituan Officially Releases and Open-Sources LongCat-Flash-Chat
Core Insights
- Meituan has officially released and open-sourced LongCat-Flash-Chat, which uses an innovative Mixture-of-Experts (MoE) architecture with 560 billion total parameters [2]
- The model activates between 18.6 billion and 31.3 billion parameters per inference (27 billion on average), balancing computational efficiency and performance [2]
- While activating only a small fraction of its parameters, LongCat-Flash-Chat delivers performance comparable to leading mainstream models, excelling particularly at agent tasks [2]
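LongCat-Flash's actual routing mechanism is not described in this summary. Purely as an illustration of how an MoE model can activate a *variable* number of parameters per token (rather than a fixed top-k), here is a minimal, hypothetical sketch of a threshold-based router: every token always gets its top `k_min` experts, plus any additional experts whose gate score clears a threshold, capped at `k_max`. All names and parameters here are assumptions for illustration, not LongCat's implementation.

```python
import numpy as np

def route_tokens(gate_logits: np.ndarray, k_min: int, k_max: int,
                 threshold: float) -> list[list[int]]:
    """Hypothetical variable-k MoE router (illustrative only).

    For each token, select the top k_min experts unconditionally,
    then add further experts whose softmax gate score is >= threshold,
    never exceeding k_max experts in total.
    """
    # Softmax over the expert dimension (numerically stabilized).
    shifted = gate_logits - gate_logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=-1, keepdims=True)

    selections = []
    for p in probs:
        order = np.argsort(p)[::-1]       # experts sorted by descending score
        chosen = list(order[:k_min])      # always keep the strongest k_min
        for idx in order[k_min:k_max]:    # optionally add more, up to k_max
            if p[idx] >= threshold:
                chosen.append(int(idx))
        selections.append([int(i) for i in chosen])
    return selections

# Toy usage: 8 tokens routed over 16 experts; each token ends up with
# between k_min=2 and k_max=6 active experts depending on its gate scores.
rng = np.random.default_rng(0)
sels = route_tokens(rng.normal(size=(8, 16)), k_min=2, k_max=6, threshold=0.08)
avg_active = sum(len(s) for s in sels) / len(sels)
```

Under this kind of scheme, the *average* number of active experts (and hence active parameters) settles between the lower and upper bounds, which mirrors the reported 18.6B–31.3B activated range averaging 27B.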