Workflow
X @Avi Chawla · 2025-09-22 19:59

Dropout Mechanism
- During training, with neurons randomly dropped, the average input a neuron receives is significantly lower than at inference; this activation-scale misalignment can cause numerical instability [1]
- Dropout addresses this by multiplying the surviving inputs during training by a factor of 1/(1-p), where 'p' is the dropout rate [2]
- For example, with a dropout rate of 50%, an input of 50 is scaled to 100 (50 / (1 - 0.5) = 100) [2]
- This scaling keeps the activation scale consistent between the training and inference stages of the network (see the sketch after the next section) [2]

Training vs Inference
- Consider a layer with 100 neurons, each with an activation value of 1, and a weight of 1 from each neuron to neuron 'A' in the next layer [2]
- With a 50% dropout rate, approximately 50 neurons are active during training, so 'A' receives an input of about 50 [2]
- During inference, Dropout is disabled and all 100 neurons are active, so 'A' receives an input of 100; without the 1/(1-p) scaling, training and inference would operate at different scales [2]
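
A minimal NumPy sketch of the inverted-dropout scaling described above. The function name and signature are illustrative, not from the post; the key lines are the mask (keep each element with probability 1-p) and the 1/(1-p) rescaling of survivors:

```python
import numpy as np

def inverted_dropout(x, p, training=True, rng=None):
    """Zero each element with probability p during training and rescale
    the survivors by 1/(1-p); at inference, pass inputs through unchanged."""
    if not training or p == 0.0:
        return x  # inference: all neurons active, no scaling needed
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p    # keep each element with prob 1-p
    return x * mask / (1.0 - p)        # rescale so E[output] matches inference
```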
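
And a quick numeric check of the 100-neuron example, assuming a fixed seed for reproducibility; the exact survivor count varies around 50 from run to run:

```python
import numpy as np

# Reproduce the example: 100 upstream neurons, each with activation 1 and
# weight 1 into neuron 'A', so A's inference-time input is exactly 100.
rng = np.random.default_rng(0)
activations = np.ones(100)
p = 0.5

# Training without scaling: ~50 survivors, so A's input averages ~50.
mask = rng.random(100) >= p
print("unscaled training input:", (activations * mask).sum())            # ~50

# With the 1/(1-p) rescaling, A's training input averages ~100,
# matching the inference-time value.
print("scaled training input:  ", (activations * mask / (1 - p)).sum())  # ~100
print("inference input:        ", activations.sum())                     # 100
```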