Commentary on Recent Overseas AI Applications: AI Applications Are Gradually Being Implemented, Inference Computing Power Demand Is Accelerating
Guoxin Securities·2024-12-10 10:10

Industry Investment Rating
- The report maintains an "Outperform" rating for the industry [1][2]

Core Views
- AI applications are gradually being implemented, driving rapid growth in inference computing power demand [2][8]
- OpenAI's GPT-O1 enhances model reasoning capabilities through chain-of-thought, significantly increasing inference computing power requirements [9]
- The proportion of inference computing power in China is expected to rise to 67.7% in 2024, up 26.4 percentage points year-on-year [2][14]

C-Side (Consumer) AI Applications
- ChatGPT's weekly visits reached 899 million, up 0.5% week-on-week [3]
- ChatGPT's MAU in November was 287.25 million, up 11.27% month-on-month [3]
- Domestic AI applications such as Doubao, Wenxiaoyan, Kimi, and Zhipu Qingyan showed significant MAU growth in November [3]

B-Side (Enterprise) AI Applications
- AppLovin's Q3 revenue increased by 39% year-on-year, with net profit up 300% [7]
- Palantir's Q3 revenue grew by 30%, with its US commercial business up 54% [7]
- Companies such as Salesforce, DocuSign, and Asana benefited from AI-driven performance improvements [7]

Inference Computing Power Demand
- A single inference pass involves tokenization, embedding, positional encoding, Transformer layers, and Softmax, with most of the computing power consumed in the Transformer decoding layers (an illustrative compute sketch is appended after the summary) [8]
- The implementation of AI applications increases how often models are called for inference, driving rapid growth in inference computing power demand [8]

GPT-O1 and Chain-of-Thought
- GPT-O1 improves reasoning capabilities by generating internal chains-of-thought, breaking complex problems down into simpler steps [9]
- Chain-of-thought requires multi-step reasoning, significantly increasing inference computing power demand (see the chain-of-thought sketch appended below) [9]
- Chain-of-thought is effective only for models with over 100 billion parameters, which raises the lower bound on model size and further increases inference computing power demand [9]

AI Chip Market in China
- China's AI chip market was approximately 103.88 billion yuan in 2023 and is expected to grow to 178 billion yuan by 2025, a CAGR of 30.9% (the arithmetic is cross-checked below) [14]
- Inference computing power is expected to account for 67.7% of total AI computing power in China in 2024, up 26.4 percentage points year-on-year [14]

Key Companies in AI Chips
- Broadcom provides custom AI ASIC chips for clients such as Google and Meta, offering high-complexity custom accelerator cards (XPU) and ASICs [17]
- Marvell offers custom ASIC SoC services, with major clients including Amazon [17]
- Hygon Information focuses on AI chips built on a GPGPU architecture, with its DCU chips performing well in inference [17]
- Cambricon's Siyuan 370 accelerator card performs well in inference workloads [17]
- Intellifusion's Deep Edge series inference cards are compatible with nearly ten mainstream large models, including Yuntian Shuji, Tongyi Qianwen, Baichuan Intelligence, and Llama 2/3 [20]

Investment Recommendation
- With AI applications gradually being implemented and techniques such as chain-of-thought coming into use, inference computing power demand is expected to rise rapidly; attention is suggested to domestic computing power chip companies such as Hygon Information [21]
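
The inference pipeline bullet above (tokenization, embedding, positional encoding, Transformer layers, Softmax) can be paired with the common rule of thumb that an autoregressive Transformer spends roughly 2 × parameter-count FLOPs per token in its decoding layers. The sketch below is only illustrative: the `inference_flops` helper, the 100B-parameter model size, and the token counts are assumptions made for this note, not figures from the report.

```python
# Rough estimate of inference compute for an autoregressive Transformer.
# Rule of thumb (an approximation that ignores attention-cache overhead):
# each token processed costs ~2 * n_params FLOPs in the decoding layers.

def inference_flops(n_params: float, prompt_tokens: int, generated_tokens: int) -> float:
    """Approximate total FLOPs for one request.

    Prompt tokens are processed once (prefill); generated tokens are
    produced one at a time (decode), which dominates for long outputs.
    """
    prefill = 2 * n_params * prompt_tokens
    decode = 2 * n_params * generated_tokens
    return prefill + decode

# Illustrative assumption: a 100B-parameter model answering one query.
n_params = 100e9
flops = inference_flops(n_params, prompt_tokens=500, generated_tokens=800)
print(f"~{flops / 1e12:.0f} TFLOPs per request")   # ~260 TFLOPs
```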
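
For the chain-of-thought point, the extra compute comes from the hidden reasoning tokens that must be decoded before the visible answer. The comparison below is a minimal sketch under assumed numbers (a 100B-parameter model and a 5x reasoning-token multiplier); it is not derived from the report's data.

```python
# Illustrative comparison: a direct answer vs. an answer preceded by an
# internal chain-of-thought. Reasoning tokens are decoded like any other
# tokens, so they add compute even if they are never shown to the user.
# Approximation: ~2 * n_params FLOPs per token processed.

n_params = 100e9            # assumed model size (100B parameters)
prompt, answer = 500, 200   # assumed token counts per query
reasoning = 5 * answer      # assumed: CoT emits ~5x extra hidden tokens

def flops(tokens_processed: float) -> float:
    return 2 * n_params * tokens_processed

direct = flops(prompt + answer)
with_cot = flops(prompt + reasoning + answer)

print(f"direct answer: ~{direct / 1e12:.0f} TFLOPs")
print(f"with CoT:      ~{with_cot / 1e12:.0f} TFLOPs ({with_cot / direct:.1f}x)")
```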
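
The AI chip market figures above can be cross-checked with the standard CAGR formula; the snippet below only re-derives the 30.9% growth rate from the 2023 and 2025 market sizes quoted in the report.

```python
# Cross-check of the reported CAGR for China's AI chip market.
start, end, years = 103.88, 178.0, 2     # billion yuan, 2023 -> 2025

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")       # ~30.9%

# Forward check: growing the 2023 figure at 30.9% for two years.
projected_2025 = start * (1 + 0.309) ** 2
print(f"projected 2025 market: {projected_2025:.0f} billion yuan")  # ~178
```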
