
AIML.com

Machine Learning Resources

What is Gradient Descent?


Gradient descent is an iterative optimization algorithm in which, on each iteration (or step), the parameters of the model are updated so that the loss function moves sequentially toward its minimum until a convergence criterion is met. The size of each step toward the optimum is controlled by the learning rate parameter. If the learning rate is large, the algorithm may converge faster, but it also risks overstepping the minimum and oscillating around it. A small learning rate, on the other hand, makes training slow, since many more iterations are needed to reach the minimum.
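The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the function names, the quadratic example loss, and the step-size convergence criterion are all choices made here for clarity.

```python
import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, tol=1e-6, max_iters=1000):
    """Repeatedly step against the gradient until the step size
    falls below tol (the convergence criterion) or max_iters is hit."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        step = learning_rate * grad(x)  # step size scales with the learning rate
        x = x - step                    # move toward the minimum
        if np.linalg.norm(step) < tol:  # converged: updates have become tiny
            break
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
```

For this example loss, the iterates converge to x = 3. Re-running with a much larger learning rate (here, anything above 1.0) makes the updates overshoot and diverge, while a very small one (say 0.001) needs far more iterations, illustrating the trade-off described above.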


