Group 1
- The core of the news is Zhipu's announcement of its new flagship model GLM-5, which brings significant increases in parameters and pre-training data and signals a strong focus on AI programming and advanced agent capabilities [1][2]
- GLM-5's parameter count has grown from 355 billion (32 billion activated) to 744 billion (40 billion activated), and its pre-training data has grown from 23 trillion to 28.5 trillion tokens, raising the model's general intelligence level [1]
- The model introduces a new "slime" framework that supports larger model scales and more complex reinforcement learning tasks, improving the efficiency of post-training [1]

Group 2
- Zhipu has seen strong growth in market demand for its GLM Coding Plan, with rising user numbers and call volume prompting the company to invest more in computing power and model optimization [2]
- The company has decided to adjust the GLM Coding Plan's pricing, with overall increases starting from 30%, while existing subscribers keep their current prices [2]
- The industry is shifting from "Vibe Coding" to "Agentic Engineering"; GLM-5 is a product of this transformation, achieving technical leadership in programming and agent capabilities [2]
Zhipu announces the open-source release of its new-generation flagship model GLM-5, along with a price increase for the GLM Coding Plan