
What is Dropout?

One of the main drawbacks of deep learning is that deep networks are more prone to overfitting than traditional machine learning models. Several techniques can be used to mitigate this risk, and Dropout is one of the most widely used.

Dropout refers to randomly turning off hidden units during training so that a smaller sub-network is trained on each pass through the dataset. Each node in the hidden layers is dropped with some probability, so across training iterations the data is fed through many different, simpler networks rather than the same full network every time. Averaging over these thinned networks lowers variance, so dropout approximates the regularizing effect of training an ensemble of networks at a fraction of the computational cost.
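To make the mechanism concrete, here is a minimal NumPy sketch of "inverted" dropout, the common convention in which surviving activations are rescaled during training so that no adjustment is needed at test time. The function name and parameters are illustrative, not part of the original answer.

```python
import numpy as np

def dropout_forward(activations, p_drop=0.5, training=True):
    """Inverted dropout: zero each unit with probability p_drop during
    training and rescale the survivors so expected activations are unchanged."""
    if not training or p_drop == 0.0:
        return activations  # dropout is disabled at inference time
    keep_prob = 1.0 - p_drop
    # Each hidden unit is independently kept with probability keep_prob
    mask = np.random.rand(*activations.shape) < keep_prob
    return activations * mask / keep_prob

# Example: a batch of 4 examples with 8 hidden units each
hidden = np.random.randn(4, 8)
print(dropout_forward(hidden, p_drop=0.5, training=True))   # roughly half the units zeroed
print(dropout_forward(hidden, training=False))              # unchanged at test time
```

In a deep learning framework this is typically handled by a built-in layer (e.g., a dropout layer with a keep/drop probability) that is active in training mode and automatically disabled during evaluation.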
