Which technique directly reduces overfitting by randomly disabling neurons during training?
Correct: B
Dropout randomly sets each neuron's output to zero with probability p during training (it is disabled at inference), preventing units from co-adapting and forcing the network to learn redundant representations, which improves generalization.
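The mechanism can be sketched in a few lines. This is a minimal NumPy illustration of "inverted" dropout (the common variant, which rescales surviving activations by 1/(1−p) so the expected output is unchanged); the function name and signature are illustrative, not from any specific library.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training, and scale survivors by 1/(1-p) so E[output] == E[input]."""
    if not training or p == 0.0:
        return x  # dropout is a no-op at inference time
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

activations = np.ones(10_000)
dropped = dropout(activations, p=0.5, rng=np.random.default_rng(0))
# roughly half the entries are zeroed; the rest are scaled to 2.0,
# so the mean stays close to the original mean of 1.0
```

Because of the 1/(1−p) rescaling, no adjustment is needed at test time: the same weights are used with dropout simply turned off.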