Zhipu Officially Releases and Open-Sources GLM-4.7-Flash, Available to Call for Free

Core Viewpoint
- The release of GLM-4.7-Flash marks a significant advance in lightweight model deployment, balancing performance and efficiency; the model is now available for free on the BigModel.cn platform [2]

Group 1: Model Performance
- GLM-4.7-Flash is positioned as the strongest model in the 30B class, surpassing competitors such as gpt-oss-20b and Qwen3-30B-A3B-Thinking-2507 on major benchmarks including SWE-bench Verified and τ²-Bench [2]
- The model achieves state-of-the-art (SOTA) scores among open-source models of comparable size [2]

Group 2: Application Scenarios
- GLM-4.7-Flash performs strongly on both front-end and back-end tasks in internal programming tests [2]
- Beyond programming, the model is applicable to general scenarios including Chinese writing, translation, long-text generation, and emotional/role-playing tasks [2]