Channel: Machine Learning

How does gradient descent know which values to pick next?


Hi all. I'm a beginner, and I'm confused about how gradient descent (e.g. SGD) knows which parameter values to pick on the next iteration if convergence has not been achieved.

I'm aware of the update rule (the formula with the partial derivative and the learning rate), but I'm still confused about how it chooses the parameter value to plug into the formula.

My guess was that the cost function is plotted and it picks values along the curve, but I'm unsure about this. Thanks!
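In case a sketch helps: no plotting is involved. The next parameter value is computed directly from the current one by the update rule θ := θ − α·∂J/∂θ, starting from an arbitrary initial guess. Here is a minimal, hypothetical Python illustration on a toy cost J(θ) = (θ − 3)², with made-up names (`grad`, `gradient_descent`, `lr`) chosen just for this example:

```python
def grad(theta):
    # Derivative dJ/dtheta of the toy cost J(theta) = (theta - 3)^2.
    return 2 * (theta - 3)

def gradient_descent(theta0, lr=0.1, steps=100):
    theta = theta0  # start from an initial guess (often chosen randomly)
    for _ in range(steps):
        # Update rule: the next value is computed from the current one;
        # the curve is never plotted or sampled along its length.
        theta = theta - lr * grad(theta)
    return theta

print(gradient_descent(theta0=0.0))  # approaches the minimum at theta = 3
```

Each iteration plugs the *current* θ into the gradient formula, so the algorithm only ever needs the value it is standing on, not the whole curve.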

edit: thanks everyone for your answers!

submitted by mangaprincess
