DeepSeek Updates Model to Process Ultra-Long Texts in a Single Pass

Core Insights

- DeepSeek has updated its web and app versions to support a maximum context length of 1 million tokens, significantly enhancing its ability to process long texts [1][2]
- The previous version, DeepSeek V3.1, had a context length of 128,000 tokens, so the update represents roughly an eightfold increase [1]
- DeepSeek successfully processed a document of over 240,000 tokens, demonstrating its ability to recognize and handle extensive content [2]
- A minor update to the V3 series was reportedly expected around the Spring Festival, but the major advancements are still forthcoming [2]
- DeepSeek's next flagship model is anticipated to be a trillion-parameter foundational model, although the larger scale has slowed training and delayed the release timeline [2]
