Batch Normalization

A math challenge Terence Tao couldn't crack in 18 months was completed by this "AI Gauss" in three weeks
36Kr · 2025-09-14 05:16
Core Insights
- The new AI agent Gauss has demonstrated remarkable capabilities by solving, in just three weeks, a mathematical challenge on which renowned mathematicians had spent 18 months making only limited progress [2][4][6].

Company Overview
- Gauss is developed by a company called Math, which specializes in AI applications for formal verification in mathematics [4][6].
- Math's founder, Christian Szegedy, is a notable figure in the AI community, recognized for contributions including the influential Batch Normalization paper [13][15][17].

Technical Achievements
- Gauss generated approximately 25,000 lines of Lean code, encompassing over a thousand theorems and definitions; formal proofs at this scale typically take years to complete [7]. A minimal illustration of Lean formalization follows this summary.
- The largest previous formalization projects took up to a decade and involved significantly more code, underscoring Gauss's efficiency [7].
- The Math team partnered with Morph Labs to build the Trinity infrastructure, which lets Gauss run thousands of concurrent agents, each requiring substantial computational resources [8].

Future Prospects
- The Math team anticipates that Gauss will significantly reduce the time needed to complete large mathematical projects, with plans to increase the volume of formalized code by 100 to 1,000 times within the next 12 months [9].
- This advance is framed as a step toward "verifiable superintelligence" and a "generalist machine mathematician" [9].
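For readers unfamiliar with Lean, the block below is a minimal, hypothetical sketch of what formalized mathematics looks like; the theorem names and statements are illustrative and unrelated to what Gauss actually produced. In Lean 4, a `theorem` is accepted only if its proof type-checks, which is what makes the output machine-verifiable.

```lean
-- Hypothetical Lean 4 examples (not from the Gauss project).

-- A statement proved by citing an existing library lemma:
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- A statement proved by a tactic; `omega` decides linear arithmetic goals.
theorem double_eq (n : Nat) : n + n = 2 * n := by
  omega
```

At the scale the article describes, the project comprises over a thousand such machine-checked theorems and definitions, each verified by Lean's kernel rather than by human referees.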
A paper whose theory was later shown to be flawed has won the ICML 2025 Test of Time Award
QbitAI · 2025-07-15 08:31
Core Insights
- The Batch Normalization paper, published in 2015, has been awarded the Test of Time Award at ICML 2025, highlighting its significant impact on deep learning [1]
- With over 60,000 citations, the work is considered a milestone in the development of deep learning, having eased the training and deployment of deep neural networks [2][4]
- Batch Normalization is a key technology that enabled deep learning to move from small-scale experiments to large-scale practical applications [3]

Group 1
- In 2015, training deep neural networks was difficult: optimization was often unstable and highly sensitive to parameter initialization [5][6][7]
- Researchers Sergey Ioffe and Christian Szegedy attributed this to Internal Covariate Shift, the phenomenon in which the distribution of each layer's inputs changes as the preceding layers' parameters are updated during training [8][11]
- Their solution normalizes the activations at every layer, analogous to the standard practice of normalizing network inputs, which markedly improved training speed and stability [12] (a minimal sketch of the computation appears at the end of this summary)

Group 2
- The original paper demonstrated that Batch Normalization let a state-of-the-art image classification model reach the same accuracy in 1/14 of the training steps [13]
- Beyond accelerating training, Batch Normalization introduces a regularization effect that improves generalization [14][15]
- Following its introduction, Batch Normalization became a foundational component of mainstream convolutional networks such as ResNet and DenseNet [18]

Group 3
- In 2018, a paper from MIT challenged the core theory: after noise was injected to deliberately reintroduce covariate shift, models with Batch Normalization still trained faster than those without it, so the speedup cannot be explained by reduced Internal Covariate Shift [21][23]
- The research instead showed that Batch Normalization smooths the optimization landscape, making gradient behavior more predictable and stable [24]
- It also suggested that Batch Normalization acts as an unsupervised learning technique, allowing networks to adapt to the data's inherent structure early in training [25]

Group 4
- Recent studies have provided deeper insights into Batch Normalization from a geometric perspective [29]
- Both authors have continued their careers in AI: Szegedy joined xAI, and Ioffe followed suit [30][32]
- Szegedy has since moved to a new role at Morph Labs, focusing on "verifiable superintelligence" [34]
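As a concrete illustration of the mechanism described in Group 1, here is a minimal NumPy sketch of the batch-normalization forward pass for a fully connected layer in training mode. The function name and epsilon value are illustrative assumptions rather than the paper's reference code; the computation follows the paper's formulation: normalize each feature with its mini-batch mean and variance, then apply a learned scale (gamma) and shift (beta).

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization forward pass, training mode (illustrative sketch).

    x     : (N, D) mini-batch of N examples with D features
    gamma : (D,) learned per-feature scale
    beta  : (D,) learned per-feature shift
    """
    mu = x.mean(axis=0)   # per-feature mean over the mini-batch, shape (D,)
    var = x.var(axis=0)   # per-feature variance over the mini-batch, shape (D,)

    # Normalize each feature to zero mean and unit variance;
    # eps guards against division by zero.
    x_hat = (x - mu) / np.sqrt(var + eps)

    # Learned scale and shift, so the layer can still represent
    # the identity transform if that is what training prefers.
    return gamma * x_hat + beta

# Usage: a random batch of 4 examples with 3 features.
x = np.random.randn(4, 3) * 5.0 + 2.0
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # approximately 0 for each feature
print(out.std(axis=0))   # approximately 1 for each feature
```

At inference time the paper replaces the mini-batch statistics with running averages accumulated during training, so outputs no longer depend on the other examples in the batch; that dependence on batch composition during training is also the source of the regularization effect noted in Group 2.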