DeepSeek Releases Model Update Supporting Context Lengths of Up to 1 Million Tokens

Core Viewpoint
- DeepSeek has released a version update that supports a maximum context length of 1 million tokens, but multimodal capabilities are not yet enabled [1][2].

Group 1: Version Update
- The recent update to DeepSeek's web and app platforms extends the supported context length to 1 million tokens [1][2].
- As of now, the updated version does not support multimodal capabilities [1][2].

Group 2: Future Developments
- Reports suggest that a minor update to the V3 series model is expected around the Spring Festival [1][2].
- DeepSeek's next flagship model is anticipated to be a trillion-parameter foundational model, but the significant increase in scale has slowed training and delayed its release [1][2].
