This is the first regression algorithm I've ever written for practical purposes, so I'm sorry if this seems like child's play to you guys. I'm trying to write a program that models a 3D bump function:
z = f(x, y) = 1/(1 + x² + y²)
I've identified my cost function as:
f(P1...P5) = (∑_{i=1..n} (z_i − P1/(1 + P2(x_i² + P3) + P4(y_i² + P5)))²) / 2n
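For concreteness, here's how I'd evaluate that cost in code (a sketch in Python/NumPy; the array names x, y, z and the parameter ordering are my own):

```python
import numpy as np

def cost(P, x, y, z):
    """Half the mean squared error between the data z and the parametric bump model.

    P = (P1, P2, P3, P4, P5); x, y, z are equal-length arrays of sample points.
    """
    P1, P2, P3, P4, P5 = P
    model = P1 / (1 + P2 * (x**2 + P3) + P4 * (y**2 + P5))
    return np.sum((z - model) ** 2) / (2 * len(z))
```

With P = (1, 1, 0, 1, 0) the model reduces to the original bump 1/(1 + x² + y²), so the cost on noise-free data from that function is zero.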
Where I have a data set of size n, with a z value for each integer (x, y) location. I'm trying to minimize f(P1...P5) by finding the lowest point, where the partial of f(P1...P5) with respect to each P is 0. However, the partial with respect to P1, f_P1(P1...P5), evaluates to an expression that (as far as I can tell) can never be 0. I've tried moving P1 into the denominator of the bump function, but again the derivative involves a fraction and never seems to reach 0.
Am I just bad at calculus, or is there really no way to directly evaluate the function's absolute minimum? If the latter is true, how do I minimize the cost function? My math level is high school AP Calc plus some linear algebra and the basics of partial derivatives.
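To show what I mean by minimizing numerically: here's a plain gradient-descent sketch (NumPy; the learning rate and step count are guesses, and I approximate the gradient with central differences instead of hand-derived partials). The cost function matches the formula above.

```python
import numpy as np

def cost(P, x, y, z):
    # Half the mean squared error of the parametric bump model.
    P1, P2, P3, P4, P5 = P
    model = P1 / (1 + P2 * (x**2 + P3) + P4 * (y**2 + P5))
    return np.sum((z - model) ** 2) / (2 * len(z))

def numerical_grad(P, x, y, z, h=1e-6):
    # Central-difference approximation of each partial derivative of the cost.
    g = np.zeros_like(P)
    for j in range(len(P)):
        Pp, Pm = P.copy(), P.copy()
        Pp[j] += h
        Pm[j] -= h
        g[j] = (cost(Pp, x, y, z) - cost(Pm, x, y, z)) / (2 * h)
    return g

def gradient_descent(x, y, z, P0, lr=0.2, steps=10000):
    # Repeatedly step downhill along the (numerical) gradient.
    P = np.array(P0, dtype=float)
    for _ in range(steps):
        P -= lr * numerical_grad(P, x, y, z)
    return P
```

Libraries such as scipy.optimize also do this kind of nonlinear least-squares fit, so the loop above is just to illustrate the idea.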
oh god how did this get here i am not good with computer thank you for help