Deng Mingyang's first-author paper rewrites the generative paradigm! Kaiming He is also a co-author
量子位 (QbitAI) · 2026-02-05 11:20

Core Viewpoint

- The article discusses a new generative model paradigm called Drifting Models, proposed by Kaiming He's team, which shifts the distribution-evolution process from the inference stage to the training stage, enabling one-step generation of high-quality samples [1][4][36].

Summary by Sections

Introduction of Drifting Models

- The Drifting Model represents a significant innovation in generative modeling by introducing the "drifting field" mechanism, which aligns the prior distribution with the real data distribution during training, eliminating the instabilities common in GANs and avoiding reliance on multi-step ODE/SDE solvers [5][12][19].

Mechanism of Drifting Models

- The core of the Drifting Model is a learned mapping function that transforms a simple prior distribution (such as Gaussian noise) into a pushforward distribution matching the real data [9][10].
- Unlike traditional models that require many iterations at inference time, the Drifting Model generates in a single step, leveraging the iterative nature of neural network training itself as the driving force for distribution evolution [14][18].

Training Process

- During training, a drift vector is computed for each sample from the distributions of positive and negative samples, guiding the model's output distribution toward the target distribution [21][26].
- The model's training trajectory is essentially the path of distribution evolution itself, so high-quality generation needs only a single forward pass at inference [18][36].

Experimental Results

- On the ImageNet 256x256 benchmark, the Drifting Model achieved one-step FID scores of 1.54 in latent space and 1.61 in pixel space, outperforming many traditional diffusion models that require hundreds of iterations [32][33].
- The model also demonstrated strong generalization in embodied-intelligence control tasks, matching or exceeding the decision quality of diffusion policies that require significantly more inference steps [34][35].

Conclusion

- The Drifting Model successfully transfers the generative pressure from the inference stage to the training stage, offering a new perspective in which the training process itself is reinterpreted as the mechanism of distribution evolution [36][37].
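The drift-vector idea summarized above can be illustrated with a toy particle simulation. The kernel-weighted attraction/repulsion form below is an assumption made for exposition, not the paper's exact formulation: samples are pulled toward positives (real data) and pushed apart from negatives (here, the current sample cloud itself), and iterating this update traces the kind of distribution evolution that Drifting Models move into the training stage.

```python
import numpy as np

def drift_field(x, pos, neg, sigma=1.0):
    """Toy drift vector: a kernel-weighted pull toward positive samples
    minus a kernel-weighted pull toward negative samples.
    (Illustrative form only; the paper's actual field may differ.)"""
    def pull(a, ref):
        d = ref[None, :, :] - a[:, None, :]            # (B, N, D), points a -> ref
        logits = -(d ** 2).sum(axis=-1) / (2 * sigma ** 2)
        w = np.exp(logits - logits.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)              # softmax over reference points
        return (w[..., None] * d).sum(axis=1)          # (B, D) weighted displacement
    return pull(x, pos) - pull(x, neg)

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=0.5, size=(256, 2))   # "real" samples
particles = rng.normal(size=(256, 2))                  # prior: Gaussian noise

# Iterating the drift update plays the role that training plays in the paper:
# the trajectory of the particle cloud is the distribution-evolution path.
for _ in range(200):
    particles = particles + 0.5 * drift_field(particles, data, particles)
```

After the loop the particle cloud has migrated from the prior onto the data distribution; in a Drifting Model this evolution happens to the network's pushforward distribution across training steps, not at inference time.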
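A one-step generator can then be fitted to follow such a drift, which is the shape of the training loop the summary describes: compute a drift vector per sample, regress the model's outputs toward the drifted targets. Everything below is a hedged sketch under stated assumptions: the affine generator `W, b`, the kernel drift, and the hand-written gradient step are illustrative stand-ins, not the paper's architecture or loss.

```python
import numpy as np

def drift_field(x, pos, neg, sigma=1.0):
    # Illustrative kernel drift (an assumption, not the paper's exact field):
    # pull toward positives (real data), push away from negatives (the batch).
    def pull(a, ref):
        d = ref[None, :, :] - a[:, None, :]
        logits = -(d ** 2).sum(axis=-1) / (2 * sigma ** 2)
        w = np.exp(logits - logits.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)
        return (w[..., None] * d).sum(axis=1)
    return pull(x, pos) - pull(x, neg)

rng = np.random.default_rng(1)
data_mean = np.array([3.0, -1.0])

# Toy one-step generator x = z @ W + b (a stand-in for a neural network).
W, b = np.eye(2), np.zeros(2)
lr = 0.2

for _ in range(500):
    z = rng.normal(size=(128, 2))                        # prior samples
    x = z @ W + b                                        # single forward pass
    pos = rng.normal(size=(128, 2)) * 0.5 + data_mean    # positive (real) samples
    drift = drift_field(x, pos, x)                       # per-sample drift vector
    # Hand-written gradient step on ||x - stop_grad(x + drift)||^2:
    b = b + lr * drift.mean(axis=0)
    W = W + lr * z.T @ drift / len(z)

samples = rng.normal(size=(256, 2)) @ W + b              # one-step generation
```

The generative pressure here lives entirely in the training loop; sampling afterward is a single affine map, mirroring the single forward pass the article attributes to the Drifting Model.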