Zhipu releases and open-sources GLM-4.7-Flash

Core Viewpoint - The release of GLM-4.7-Flash marks a significant advance in hybrid reasoning ("thinking") models, offering a new option that balances performance and efficiency for lightweight deployment [2]

Group 1: Product Details - GLM-4.7-Flash has 30 billion total parameters, of which 3 billion are active, positioning it as state-of-the-art (SOTA) in its size class [2] - GLM-4.7-Flash replaces the previous version, GLM-4.5-Flash, on Zhipu's open platform BigModel.cn, and is available for free access effective immediately [2]
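Since the model is announced as freely accessible on the BigModel.cn open platform, a typical way to try it would be through an OpenAI-style chat-completions request. The sketch below only constructs the request body; the endpoint URL and the model id `glm-4.7-flash` are assumptions for illustration, not details confirmed by the announcement — check the BigModel.cn console for the actual values.

```python
import json

# Assumed values; verify both in the BigModel.cn documentation/console.
BASE_URL = "https://open.bigmodel.cn/api/paas/v4/chat/completions"  # assumed endpoint
MODEL_ID = "glm-4.7-flash"  # assumed model id

def build_request(prompt: str) -> dict:
    """Build the JSON body for a chat-completions style request."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_request("Summarize the GLM-4.7-Flash release in one sentence.")
print(json.dumps(body, ensure_ascii=False))
```

In practice this body would be POSTed to the endpoint with an API key in the `Authorization` header; the payload shape above follows the common OpenAI-compatible convention.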

Source: Reportify (Knowledge Atlas)