Hybrid Thinking Models
Zhipu: GLM-4.7-Flash tops one million downloads 14 days after open-sourcing
Di Yi Cai Jing· 2026-02-04 05:03
(Article source: Di Yi Cai Jing) According to Zhipu, the hybrid thinking model GLM-4.7-Flash surpassed 1 million downloads on Hugging Face within two weeks of its release. ...
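Since the weights are distributed through Hugging Face, the snippet below sketches how such an open model could be pulled and queried with the standard transformers workflow. This is a minimal sketch, not drawn from the articles: the repository id zai-org/GLM-4.7-Flash, the chat-template usage, and the generation settings are assumptions for illustration.

```python
# Minimal sketch of loading an open-weight GLM-style model from Hugging Face.
# Assumption: the repo id "zai-org/GLM-4.7-Flash" is hypothetical; only the
# generic transformers workflow is shown.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zai-org/GLM-4.7-Flash"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place the 30B-total / 3B-active MoE across available devices
    torch_dtype="auto",
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "Summarize what a hybrid thinking model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```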
Zhipu releases and open-sources GLM-4.7-Flash
Mei Ri Jing Ji Xin Wen· 2026-01-20 01:02
Core Viewpoint
- The release of GLM-4.7-Flash marks a significant advance in hybrid thinking models, offering a new option that balances performance and efficiency for lightweight deployment [2]

Group 1: Product Details
- GLM-4.7-Flash has 30 billion total parameters and 3 billion active parameters, positioning it as a state-of-the-art (SOTA) model in its class [2]
- GLM-4.7-Flash replaces the previous GLM-4.5-Flash on the Zhipu open platform BigModel.cn and is available for free access effective immediately [2]
Zhipu to open-source GLM-4.7-Flash, with free API calls
Xin Lang Cai Jing· 2026-01-20 00:56
Core Insights
- The company is set to release and open-source GLM-4.7-Flash today, a hybrid thinking model with 30 billion total parameters and 3 billion active parameters, balancing performance and efficiency for lightweight deployment [1][3]

Group 1
- GLM-4.7-Flash will replace the previous GLM-4.5-Flash model [1][3]
- The model will be available for free access on the Zhipu open platform BigModel.cn [1][3]
GLM-4.7-Flash open-sourced, API free of charge
Zhi Tong Cai Jing Wang· 2026-01-20 00:36
Core Insights
- GLM-4.7-Flash has been officially released and open-sourced, providing a new option for lightweight deployment that balances performance and efficiency [1]
- The model has 30 billion total parameters and 3 billion active parameters, positioning it as a state-of-the-art (SOTA) model in its class [1]
- GLM-4.7-Flash replaces GLM-4.5-Flash and is available for free on the Zhipu open platform BigModel.cn effective immediately [1]
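The announcements above say the model can be called for free on the Zhipu open platform BigModel.cn. The sketch below shows what such a call typically looks like, assuming the platform exposes an OpenAI-compatible chat-completions endpoint; the base URL, the model name "glm-4.7-flash", and the BIGMODEL_API_KEY environment variable are assumptions, not details confirmed by the articles.

```python
# Minimal sketch of calling GLM-4.7-Flash via the BigModel.cn open platform.
# Assumptions: the OpenAI-compatible base URL, the model identifier, and the
# environment variable name are illustrative only.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["BIGMODEL_API_KEY"],            # key issued by the platform (assumed)
    base_url="https://open.bigmodel.cn/api/paas/v4/",  # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="glm-4.7-flash",  # assumed model identifier on the platform
    messages=[
        {
            "role": "user",
            "content": "Explain the difference between total and active parameters in an MoE model.",
        },
    ],
)
print(response.choices[0].message.content)
```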