Mining Activation Functions Like Crypto? DeepMind Builds a "Compute Mining Farm" to Brute-Force Search for the Next-Generation ReLU
机器之心 · 2026-02-07 04:09

Core Insights

The article traces the evolution of activation functions in neural networks, from traditional choices such as Sigmoid and ReLU to newer ones like GELU and Swish, and emphasizes their impact on model performance [1][2].

Group 1: DeepMind's Innovation

- Google DeepMind is rethinking the search for activation functions with a new method called AlphaEvolve, which explores an open-ended space of Python functions rather than relying on predefined search spaces [2][4].
- The research paper, "Finding Generalizable Activation Functions," reports new activation functions discovered with this approach, including GELUSine and GELU-Sinc-Perturbation, which outperform traditional functions on certain tasks [4][30].

Group 2: Methodology

- AlphaEvolve uses a large language model (LLM) to generate and modify code, enabling a more flexible and expansive search for activation functions [8][11].
- The process relies on a "micro-laboratory" strategy: candidates are scored on synthetic data chosen to probe out-of-distribution (OOD) generalization, avoiding the high cost of searching directly on large datasets like ImageNet [14][18]; a toy sketch of this search loop appears at the end of this piece.

Group 3: Performance of New Functions

- The newly discovered functions excelled at algorithmic reasoning, with GELU-Sinc-Perturbation scoring 0.887 on the CLRS-30 benchmark, surpassing ReLU and GELU [34].
- On visual tasks, GELUSine and GELU-Sinc-Perturbation remained competitive on ImageNet, reaching approximately 74.5% Top-1 accuracy, comparable to GELU [34][35].

Group 4: Insights on Function Design

- The best-performing functions tend to follow a general formula that combines a standard activation function with a periodic term, suggesting that periodic structure can improve model generalization [25][35]; a minimal sketch of this family follows this summary.
- The study underscores the importance of understanding the inductive biases an activation function introduces, suggesting that periodic components help models capture structure in data beyond linear relationships [40][42].
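As a minimal illustration of the "standard activation plus periodic term" formula noted in Group 4, the sketch below adds a sinusoidal perturbation to GELU. This is only a hedged approximation of what a GELUSine-style function might look like: the coefficients `a` and `b` are illustrative placeholders, not the parameterization reported in the paper.

```python
import math

def gelu(x: float) -> float:
    # Tanh approximation of GELU (Hendrycks & Gimpel, 2016).
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

def gelu_sine(x: float, a: float = 0.1, b: float = 1.0) -> float:
    # Base activation plus a small periodic term. a and b are
    # illustrative placeholders, NOT the values the paper reports
    # for GELUSine.
    return gelu(x) + a * math.sin(b * x)
```

Keeping the amplitude `a` small preserves GELU's behavior for large |x| while injecting the periodic structure the authors associate with better generalization.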

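The "micro-laboratory" idea can also be made concrete with a toy, self-contained stand-in. In the sketch below, random mutation of two coefficients plays the role of AlphaEvolve's LLM-driven code edits, and a random-feature regression fit on a narrow synthetic range but scored on a wider out-of-distribution range plays the role of the paper's evaluation suite. Every concrete choice here (the sin target, the ranges, the random-feature readout, the mutation scheme) is an assumption for illustration, not DeepMind's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def gelu(x):
    # Tanh approximation of GELU, vectorized over numpy arrays.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def features(x, act, w, c):
    # Random-feature map: one column per hidden unit act(w_i * x + c_i).
    return act(np.outer(x, w) + c)

def ood_score(act, n_feat=64, target=np.sin):
    # "Micro-laboratory": fit a linear readout on a narrow training range,
    # then score negative mean-squared error on a wider, OOD range.
    w = rng.normal(size=n_feat)
    c = rng.normal(size=n_feat)
    x_tr = rng.uniform(-2.0, 2.0, 256)           # in-distribution
    x_te = rng.uniform(-6.0, 6.0, 256)           # out-of-distribution
    beta, *_ = np.linalg.lstsq(features(x_tr, act, w, c), target(x_tr), rcond=None)
    pred = features(x_te, act, w, c) @ beta
    return -np.mean((pred - target(x_te)) ** 2)  # higher is better

def evolve(generations=50, sigma=0.2):
    # Toy stand-in for AlphaEvolve's loop: random mutation of two
    # coefficients replaces the LLM's code edits, and the synthetic OOD
    # score replaces the paper's full evaluation suite.
    best = np.array([0.0, 1.0])                  # (a, b) in gelu(x) + a*sin(b*x)
    best_s = ood_score(lambda x: gelu(x) + best[0] * np.sin(best[1] * x))
    for _ in range(generations):
        cand = best + rng.normal(0.0, sigma, 2)
        s = ood_score(lambda x: gelu(x) + cand[0] * np.sin(cand[1] * x))
        if s > best_s:
            best, best_s = cand, s
    return best, best_s

if __name__ == "__main__":
    (a, b), score = evolve()
    print(f"best a={a:.3f}, b={b:.3f}, OOD score={score:.4f}")
```

Even this toy version shows the economics the article describes: each candidate costs milliseconds to score, so thousands of mutations can be tried for far less than a single ImageNet training run.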