Convolutional Neural Networks
[GF Financial Engineering] Significant ETF Inflows (20250413)
GF Financial Engineering Research · 2025-04-13 06:41
Market Performance
- Over the most recent five trading days, the Sci-Tech 50 Index declined by 0.63%, the ChiNext Index dropped by 6.73%, and the large-cap value index fell by 2.61% [1]
- The agriculture, forestry, animal husbandry and fishery sector and retail trade performed well, while power equipment and telecommunications lagged behind [1]

Risk Premium Analysis
- The risk premium, measured as the inverse of the static PE of the CSI All Share Index minus the ten-year government bond yield, indicates that the implied return of equities relative to bonds is at historically high levels; it reached 4.17% on April 26, 2022 and 4.11% on January 19, 2024 [1]
- As of April 11, 2025, the risk premium stands at 4.09%, with the two-standard-deviation boundary at 4.73% [1]

Valuation Levels
- As of April 11, 2025, the PE TTM percentile of the CSI All Share Index is 45%, with the SSE 50 and CSI 300 at 56% and 43% respectively, while the ChiNext Index remains at a relatively low valuation compared to its history [2]
- A long-term view of the Shenzhen 100 Index suggests a roughly three-year cycle of bear and bull markets; the adjustment phase that began in Q1 2021 has had sufficient time and room for a potential new upward cycle [2]

Fund Flow and Trading Activity
- In the last five trading days, ETF inflows totaled 206.9 billion yuan, margin financing decreased by approximately 98.3 billion yuan, and average daily trading volume was 1.5742 trillion yuan [3]

AI and Machine Learning Insights
- Convolutional neural networks (CNN) have been used to model price and volume data, with features mapped to industry themes; as of April 11, 2025 the signal points to sectors such as securities (a minimal illustrative sketch follows below) [7][2]
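The report does not disclose its network design, so the sketch below is only a minimal illustration of the CNN-on-price/volume idea: a two-channel "chart" of normalized prices and volumes scored by a small convolutional network. The `PriceVolumeCNN` name, layer sizes, and input shape are all assumptions, not the report's actual model.

```python
# Minimal sketch (not the report's actual model): a small CNN that scores a
# price/volume "image", e.g. 60 trading days rendered as a 2-channel grid of
# normalized prices and volumes. All shapes are illustrative.
import torch
import torch.nn as nn

class PriceVolumeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1),  # 2 channels: price, volume
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                      # collapse to one vector per chart
        )
        self.head = nn.Linear(32, 1)                      # one score per chart

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)
        return self.head(z)

if __name__ == "__main__":
    model = PriceVolumeCNN()
    charts = torch.randn(8, 2, 32, 60)   # batch of 8 synthetic price/volume grids
    print(model(charts).shape)           # torch.Size([8, 1])
```

In practice such per-chart scores would still have to be aggregated to industry themes, which the summary mentions but does not detail.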
[GF Financial Engineering] AI Image Recognition Highlights Low-Volatility Dividend Stocks (20250330)
GF Financial Engineering Research · 2025-03-30 04:51
Market Performance
- Over the most recent five trading days, the Sci-Tech 50 Index declined by 1.29% and the ChiNext Index dropped by 1.12%, while the large-cap value index rose by 0.28% and the large-cap growth index gained 0.04% [1]
- The healthcare and agriculture sectors performed well, whereas computers and defense lagged behind [1]

Risk Premium Analysis
- The risk premium, defined as the inverse of the static PE of the CSI All Share Index (its earnings yield) minus the 10-year government bond yield, has historically reached extreme levels of two standard deviations above the mean at major market bottoms (a calculation sketch follows after this summary) [1]
- As of January 19, 2024, the indicator stood at 4.11%, the fifth time since 2016 that it exceeded 4% [1]

Valuation Levels
- As of March 28, 2025, the PE TTM percentile of the CSI All Share Index was 53%, with the SSE 50 and CSI 300 at 58% and 48% respectively, while the ChiNext Index was close to 14% [2]
- The ChiNext Index's valuation is relatively low compared to its historical average [2]

Long-term Market Trends
- The Shenzhen 100 Index has experienced a bear market roughly every three years, each followed by a bull market, with declines of about 40% to 45% [2]
- The current adjustment cycle, which began in Q1 2021, appears to have had sufficient time and room for a potential upward trend [2]

Fund Flow and Trading Activity
- In the last five trading days, ETF inflows totaled 16.2 billion yuan, while margin financing decreased by approximately 24.8 billion yuan [3]
- Average daily trading volume across the two exchanges was 1.2346 trillion yuan [3]

Thematic Investment Focus
- As of March 28, 2025, the recommended investment themes include construction materials and low-volatility dividend stocks [2][8]
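As a worked example of the risk-premium definition used in these reports (earnings yield, i.e. the inverse of the static PE, minus the 10-year government bond yield), here is a minimal sketch in Python. The PE and bond-yield inputs are placeholders chosen only so the output lands near the reported 4.11%; they are not figures taken from the report.

```python
# Risk premium as defined above: 1 / static_PE minus the 10-year bond yield.
# Inputs are illustrative placeholders, not the report's actual data.

def risk_premium(static_pe: float, bond_yield_10y: float) -> float:
    """Earnings yield of the index minus the 10-year government bond yield."""
    return 1.0 / static_pe - bond_yield_10y

if __name__ == "__main__":
    pe = 17.2        # hypothetical static PE of the CSI All Share Index
    y10 = 0.0170     # hypothetical 10-year government bond yield (1.70%)
    print(f"risk premium = {risk_premium(pe, y10):.2%}")   # ~4.11%
```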
Breaking: AlexNet Source Code Released as Open Source
半导体芯闻 (Semiconductor Chip News) · 2025-03-24 10:20
Core Points
- The article covers the release of the source code of AlexNet, the groundbreaking neural network from 2012 that has strongly shaped modern AI methods [1][18]
- AlexNet was created at the University of Toronto by Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton, and is used primarily for image recognition [2][15]

Group 1: Background of Deep Learning
- Geoffrey Hinton is recognized as one of the fathers of deep learning, the neural-network-based approach that underpins contemporary AI [4]
- The revival of neural network research in the 1980s was led by cognitive scientists who rediscovered the backpropagation algorithm, essential for training multilayer neural networks [5][6]

Group 2: ImageNet and GPU Development
- The ImageNet project, initiated by Stanford professor Fei-Fei Li, supplied the large dataset needed to train deep neural networks and contributed significantly to AlexNet's success [8][9]
- NVIDIA's work on making GPUs more versatile and programmable was essential to meeting the computational demands of training neural networks [9][12]

Group 3: Creation and Impact of AlexNet
- AlexNet combined deep neural networks, large datasets, and GPU computing to achieve breakthrough results in image recognition (see the architecture sketch below) [13]
- The 2012 AlexNet paper has been cited more than 172,000 times, marking a pivotal moment in AI research [17]
- The release of AlexNet's source code by the Computer History Museum (CHM) is regarded as a significant historical contribution to the field of artificial intelligence [18]
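For readers who want to see the network the article describes, the sketch below instantiates torchvision's modern PyTorch re-implementation of AlexNet. This is not the original 2012 CUDA code released by CHM, only an equivalent architecture for illustration; it assumes a recent torchvision (0.13+) for the `weights=` argument.

```python
# Not the original 2012 code released by CHM -- torchvision's modern
# re-implementation, instantiated only to show the classic conv + FC structure.
import torch
from torchvision.models import alexnet

model = alexnet(weights=None)          # random weights; architecture only
print(model)                           # 5 conv layers followed by 3 fully connected layers

# A single forward pass on a dummy 224x224 RGB image.
x = torch.randn(1, 3, 224, 224)
logits = model(x)
print(logits.shape)                    # torch.Size([1, 1000]) -- ImageNet classes
```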
[GF Financial Engineering] Neural Ordinary Differential Equations and Liquid Neural Networks
GF Financial Engineering Research · 2025-03-06 00:16
GF Securities Chief Financial Engineering Analyst: An Ningning, anningning@gf.com.cn
GF Securities Senior Financial Engineering Analyst: Chen Yuanwen, chenyuanwen@gf.com.cn
Contact: GF Securities Financial Engineering Researcher Lin Tao, gflintao@gf.com.cn
GF Financial Engineering, An Ningning & Chen Yuanwen team

Abstract

Neural ordinary differential equations: At NeurIPS 2018, a top international machine learning conference, the paper "Neural Ordinary Differential Equations" by Chen et al. won the best paper award. In brief, a typical ResNet is composed of stacked residual blocks of the form h_{t+1} = f(h_t, θ_t) + h_t. Conventional training fits the parameters of each residual block to the training data separately. The paper proposes that, when the residual blocks of a ResNet are stacked infinitely (the continuous limit), the parameters of every block can instead be obtained by solving a single ordinary differential equation.

Liquid neural networks: Building on this work, Ramin Hasani and colleagues at MIT innovatively described the evolution of a recurrent neural network's hidden state in the form of ordinary differential equations, proposing a class of models known as liquid neural networks; these results were published in top international journals such as Nature Machine Intelligence. Such models ...
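To make the ResNet-as-ODE correspondence above concrete, here is a minimal sketch: the residual update h_{t+1} = h_t + f(h_t, θ_t) is exactly one Euler step of dh/dt = f(h, θ), so stacking blocks with shared parameters approximates integrating that ODE. The layer sizes and step count are illustrative assumptions, and fixed-step Euler stands in for the adaptive solvers and adjoint method used in the NeurIPS 2018 paper.

```python
# Minimal sketch of the ResNet <-> ODE correspondence described above.
# One residual block h_{t+1} = h_t + f(h_t, theta) is one Euler step of
# dh/dt = f(h, theta); repeating the step with shared parameters approximates
# integrating that ODE over [0, t1].
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """f(h, theta): the shared vector field driving the hidden state."""
    def __init__(self, dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return self.net(h)

def euler_integrate(func: ODEFunc, h0: torch.Tensor, t1: float = 1.0, steps: int = 20) -> torch.Tensor:
    """Integrate dh/dt = f(h) from t = 0 to t = t1 with fixed-step Euler."""
    h, dt = h0, t1 / steps
    for _ in range(steps):
        h = h + dt * func(h)      # one "residual block" with shared parameters
    return h

if __name__ == "__main__":
    torch.manual_seed(0)
    func = ODEFunc(dim=16)
    h0 = torch.randn(4, 16)       # batch of 4 hidden states
    print(euler_integrate(func, h0).shape)   # torch.Size([4, 16])
```

Swapping the Euler loop for an adaptive solver with backpropagation through the solve (the adjoint method) recovers the setting of the original paper; the liquid-neural-network work applies the same continuous-time view to the hidden state of recurrent networks.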