Core Viewpoint
- The release of DeepSeek-V3.2 by DeepSeek and its rapid adaptation by Cambricon signal close collaboration between leading firms in China's AI industry, aimed at improving the efficiency of long-text training and inference [2][3][4].

Group 1: Model Release and Features
- DeepSeek launched the experimental version DeepSeek-V3.2-Exp, which introduces a sparse attention mechanism to optimize long-text training and inference (see the illustrative sketch after this list) [2].
- The new model is roughly 671GB, which takes approximately 8-10 hours to download under ideal bandwidth conditions (a figure that implies a sustained throughput of roughly 150-190 Mbps) [3].

Group 2: Collaboration and Industry Impact
- Cambricon's quick adaptation to DeepSeek-V3.2-Exp suggests prior collaboration and communication between the two companies, reflecting a trend of low-profile yet effective partnerships in the tech industry [3].
- Collaboration between leading companies in the AI model and chip sectors is expected to significantly reduce training and inference costs for users, helping new AI applications emerge [4].
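The digest does not describe DeepSeek's actual sparse attention design, so the snippet below is only a minimal sketch of the general idea: each query attends to a restricted subset of key positions (here, a sliding window of recent tokens) instead of the full sequence. The function name, window size, and sliding-window pattern are assumptions for illustration, not DeepSeek-V3.2's mechanism, and this toy version still builds the full score matrix, so it shows the masking pattern rather than the compute savings a real sparse kernel would deliver.

```python
import torch

def sliding_window_attention(q, k, v, window: int = 128):
    """Toy sparse attention: each query position attends only to the
    `window` most recent key positions (causal sliding window).
    Shapes: q, k, v are (seq_len, d); returns (seq_len, d).
    Illustrative only; not DeepSeek's actual sparse attention."""
    seq_len, d = q.shape
    scores = q @ k.T / d ** 0.5                # full score matrix, then masked
    pos = torch.arange(seq_len)
    dist = pos[:, None] - pos[None, :]         # query index minus key index
    # Mask future keys and keys further back than `window`.
    mask = (dist < 0) | (dist >= window)
    scores = scores.masked_fill(mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

# Quick check on random data.
q = torch.randn(512, 64)
k = torch.randn(512, 64)
v = torch.randn(512, 64)
out = sliding_window_attention(q, k, v, window=128)
print(out.shape)  # torch.Size([512, 64])
```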
A powerful pairing! DeepSeek and Cambricon simultaneously release the DeepSeek-V3.2 model architecture and vLLM-based model-adaptation source code
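The headline mentions vLLM-based adaptation source code. As an illustration only, a minimal offline-inference call through vLLM's public Python API might look like the sketch below; the model identifier, parallelism setting, and sampling parameters are assumptions and are not taken from the released adaptation code. A checkpoint of this size would in practice require a multi-GPU node sized for the full weights.

```python
from vllm import LLM, SamplingParams

# Hypothetical settings: the repo name and tensor_parallel_size are placeholders.
llm = LLM(
    model="deepseek-ai/DeepSeek-V3.2-Exp",  # assumed HF-style model identifier
    tensor_parallel_size=8,
    trust_remote_code=True,
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Summarize the benefits of sparse attention."], params)
print(outputs[0].outputs[0].text)
```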