Dropout is a regularization technique that, on each training forward pass, independently zeroes each neuron's output with some probability p. This prevents neurons from co-adapting too much and forces the network to learn redundant, robust features that don't depend on any single neuron.
Dropout can be viewed as training an exponentially large ensemble of "thinned" sub-networks that share parameters; at test time, using the full network with appropriately scaled activations approximates averaging the ensemble's predictions.
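As a minimal sketch of the idea, here is the common "inverted dropout" variant in NumPy, which rescales surviving activations by 1/(1−p) during training so the expected activation matches test time and no scaling is needed at inference. The function name and default p are illustrative, not from the original text:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=np.random.default_rng(0)):
    """Inverted dropout: zero each activation with probability p during training.

    Scaling the survivors by 1/(1-p) keeps the expected activation equal to
    the input, so the layer is a plain identity at inference time.
    """
    if not training or p == 0.0:
        return x                              # identity at test time
    mask = rng.random(x.shape) >= p           # keep each unit with prob 1-p
    return x * mask / (1.0 - p)               # rescale so E[output] == x

# Example: roughly half the activations are zeroed, the rest are doubled.
h = np.ones((2, 4))
print(dropout(h, p=0.5, training=True))
print(dropout(h, p=0.5, training=False))      # unchanged at inference
```

Deep learning frameworks ship this as a built-in layer (e.g. `torch.nn.Dropout` in PyTorch), which handles the training/inference switch automatically.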