I'm going through the ML class on Coursera covering logistic regression, and also the Manning book Machine Learning in Action; I'm trying to learn by implementing everything in Python. I'm not able to understand the difference between the cost function and the gradient. There are examples on the net where people compute the cost function, and then there are places where they don't and just run the gradient descent update w := w - alpha * grad f(w). What is the difference between the two? Or is there no difference? I'm not able to get my head around it. :/
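For context, here's roughly the kind of thing I mean, as a minimal NumPy sketch (the `cost`/`gradient` function names and the tiny data set are just my own, not from the course): the cost is a single number you can print to monitor progress, while the descent loop itself only ever needs the gradient.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, X, y):
    # cross-entropy cost J(w): one number measuring how bad w is
    h = sigmoid(X @ w)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient(w, X, y):
    # gradient of J(w) with respect to w: a vector pointing "uphill"
    h = sigmoid(X @ w)
    return X.T @ (h - y) / len(y)

# tiny made-up data set (first column is the intercept term)
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 3.0], [1.0, 4.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

w = np.zeros(2)
alpha = 0.1
for _ in range(1000):
    # the update itself uses only the gradient, never the cost
    w = w - alpha * gradient(w, X, y)

# the cost is only needed if you want to check that training is working
print(cost(w, X, y))
```

So in this sketch the loop would run fine even if `cost` were never defined, which seems to be what the gradient-only examples on the net are doing.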