M2.7 Model
Unknown institution: Multiple AI Model Vendors Have Raised Their API Pricing - 20260323
Unknown institution · 2026-03-23 02:15
Summary of Conference Call Records

Industry Overview
- Multiple AI model vendors have raised their API pricing, reflecting the high and rising costs of computing, memory, and electricity, alongside rapidly growing inference demand driven by agents such as OpenClaw [1][2]
- U.S. API pricing remains roughly six times higher than China's, indicating tight supply of computing resources and previously unsustainable low pricing levels in China [1][2]

Key Points and Arguments
- The API price increases are driven by the expensive and tight supply of computing and memory resources; many U.S. and Chinese AI vendors have adjusted their model API pricing in response to soaring costs [1][2]
- Companies such as Anthropic, Google, and OpenAI have raised average U.S. API prices by 17% to 67%, memory prices have surged 3 to 5 times, and next-generation AI servers and GPUs are becoming costlier and more power-hungry [2]
- Although inference demand keeps growing, the rapid rise in API pricing may help temper that demand, as most AI vendors face pressure to raise their API prices [2]

Company-Specific Insights
- In China, independent AI model vendors may face greater margin pressure: five AI vendors have raised their model API pricing while two, including Grok and Alibaba, have lowered it [3]
- MiniMax cut the price of its M2.7 model by 50% in October 2025, making it the second-cheapest AI model after DeepSeek [3]
- Alibaba Cloud has raised its third-party computing/storage pricing by 5% to 34% while cutting its model API pricing by 42%, likely to enhance competitiveness; this implies margin pressure for independent AI vendors renting computing/storage from Alibaba Cloud [3]

Investment Risks and Opportunities
- The value of AI is flowing primarily to upstream hardware manufacturers, presenting investment-return risks [4]
- AI model vendors must invest heavily in computing to enhance model performance and support growing inference demand, so current investment opportunities are concentrated mainly in upstream hardware suppliers such as CPU/GPU, memory, optical communication, and data centers [4]
- The potential for investment returns remains a significant risk in the global AI development landscape [4]
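As a rough illustration of how the cited percentage moves translate into prices, the sketch below applies them to a hypothetical base price. The $10-per-million-token figure is an assumption for illustration only; the source gives the percentage changes (a 17% to 67% increase, a 42% cut) but not the underlying prices.

```python
def apply_change(price: float, pct: float) -> float:
    """Return price after a percentage change (pct=17 means +17%, pct=-42 means -42%)."""
    return price * (1 + pct / 100.0)

# Hypothetical base API price, in $ per million tokens (assumption, not from the source).
base = 10.0

# U.S. vendors raised average API prices by 17% to 67% [2].
low, high = apply_change(base, 17), apply_change(base, 67)
print(f"After a 17-67% increase: ${low:.2f} to ${high:.2f} per million tokens")

# Alibaba Cloud cut its model API pricing by 42% [3].
print(f"After a 42% cut: ${apply_change(base, -42):.2f} per million tokens")
```

The same helper covers MiniMax's 50% cut (`apply_change(base, -50)` halves the price), which is what the source means by the M2.7 model becoming the second-cheapest after DeepSeek.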