Investment Rating
- The report maintains a "Leading Market-A" rating for the industry, indicating that the sector's return is expected to outperform the CSI 300 Index by more than 10% over the next six months [1]

Core Views
- Domestic MoE (Mixture of Experts) models demonstrate significant advantages, and AI vertical applications are poised for growth [1]
- DeepSeek-V2, the second-generation MoE model released by DeepSeek, has 236 billion total parameters; its overall Chinese-language capability surpasses GPT-4 and is comparable to GPT-4-Turbo and Wenxin 4.0 [1]
- Its overall English-language capability is on par with LLaMA3-70B and exceeds that of the open-source MoE model Mixtral 8x22B [1]
- The model is trained on a high-quality, multi-source pre-training corpus of 8.1T tokens, with a higher proportion of Chinese data than the previous generation [1]
- DeepSeek-V2 introduces the MLA (Multi-head Latent Attention) architecture and a self-developed sparse structure, DeepSeekMoE, which together significantly reduce computation and inference memory, cutting per-token cost substantially [1]
- API pricing for DeepSeek-V2 is 1 yuan per million input tokens and 2 yuan per million output tokens (32K context), nearly one-hundredth of the cost of GPT-4-Turbo [1]
- An MoE model is composed of multiple sub-models (experts); a gating network routes each input to the most suitable experts, reducing interference between different types of samples [1]
- The Transformer architecture, used mainly for sequence-to-sequence tasks, has no recurrent structure, so training large AI models on it demands substantial compute and time [1]
- The trend is shifting toward sustainable new models built on the MoE architecture, which are energy-efficient and effective in both training and inference [1]
- The domestic AI product market shows significant growth in certain segments, with AI search taking two of the top five spots on the AI product growth chart [1]
- China's AI search market is expected to reach 32.935 billion yuan by 2027, a compound annual growth rate (CAGR) of approximately 32.93% [1]

Industry Performance
- Relative returns: -7.87% over 1 month, 0.38% over 3 months, and -19.66% over 12 months [1]
- Absolute returns: -4.11% over 1 month, 9.33% over 3 months, and -28.64% over 12 months [1]

Investment Recommendations
- Updates to MoE models may set new trends in large models, and the development of AI products is expected to enrich the overall ecosystem of vertical fields [1]
- Recommended stocks to watch: Kunlun Wanwei, Jiecheng Shares, Visual China, Zhongguang Tianze, CITIC Press, Wanxing Technology, Insai Group, BlueFocus, Yuanlong Yatu, Tianyu Digital, Tom Cat, and Zhongyuan Media [1]
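The gating mechanism behind MoE models can be illustrated in a few lines. The sketch below is a generic top-k routed MoE layer in NumPy, not DeepSeekMoE itself (which additionally uses fine-grained and shared experts); all sizes, names, and the tanh expert function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_experts, k, n_tokens = 8, 4, 2, 5

# Each "expert" is a small feed-forward map; the gate is a linear scorer.
experts = [rng.standard_normal((dim, dim)) / np.sqrt(dim) for _ in range(n_experts)]
gate_w = rng.standard_normal((dim, n_experts)) / np.sqrt(dim)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x):
    scores = x @ gate_w                        # (tokens, n_experts) gating scores
    top = np.argsort(scores, axis=-1)[:, -k:]  # top-k expert indices per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        w = softmax(scores[t, top[t]])         # renormalise over the selected experts
        for weight, e in zip(w, top[t]):
            # Only k of n_experts run per token: this sparsity is what cuts compute.
            out[t] += weight * np.tanh(x[t] @ experts[e])
    return out

x = rng.standard_normal((n_tokens, dim))
y = moe_forward(x)
```

Because each token activates only k of the n_experts sub-networks, total parameters can grow while per-token compute stays roughly constant — the property the report credits for DeepSeek-V2's low inference cost.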
Media Industry Flash Report: Domestic MoE Models Show Significant Advantages; AI Vertical Applications Poised for Growth
Huajin Securities·2024-05-10 14:00
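As a quick sanity check on the market-size projection cited above (32.935 billion yuan by 2027 at a ~32.93% CAGR), the implied base-year figure can be back-computed; the 2023 base year and four-year horizon are assumptions, since the report excerpt does not state them.

```python
# Back out the implied base from the report's 2027 target and CAGR.
target_2027 = 32.935   # billion yuan (from the report)
cagr = 0.3293          # compound annual growth rate
years = 4              # assumed horizon: 2023 -> 2027
implied_base = target_2027 / (1 + cagr) ** years  # ~10.5 billion yuan
```

An implied base of roughly 10.5 billion yuan is internally consistent with the stated target and growth rate under these assumptions.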