Channel: Machine Learning

Regression by Successive Orthogonalization


We're talking about linear regression. Consider the following algorithm:

  1. Initialize z_0 = x_0 = 1.
  2. For j = 1, 2, ..., p:
    • Regress x_j on z_0, z_1, ..., z_{j-1} to produce the coefficients g_{ij} = <z_i, x_j>/<z_i, z_i> for i = 0, ..., j-1, and the residual vector z_j = x_j - sum[i=0:j-1] g_{ij} z_i.
  3. Regress y on the residual z_p alone to give the estimate beta_p.
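The claim can at least be checked numerically. Here is a minimal NumPy sketch (variable names and the random test data are mine): it runs the Gram-Schmidt steps above, regresses y on the last residual z_p, and compares the result with the coefficient of x_p from an ordinary full least-squares fit.

```python
import numpy as np

# Numerical check of the successive-orthogonalization claim (a sketch;
# the data and names are illustrative). With p = 3 predictors plus an
# intercept column x_0 = 1, the coefficient from regressing y on the
# final residual z_p should match the multiple-regression coefficient
# of x_p.
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n)] + [rng.normal(size=n) for _ in range(p)])
y = rng.normal(size=n)

# Steps 1-2: successively orthogonalize the columns x_0, ..., x_p.
Z = np.zeros_like(X)
Z[:, 0] = X[:, 0]                                      # z_0 = x_0 = 1
for j in range(1, p + 1):
    zj = X[:, j].copy()
    for i in range(j):
        g = Z[:, i] @ X[:, j] / (Z[:, i] @ Z[:, i])    # g_{ij} = <z_i,x_j>/<z_i,z_i>
        zj -= g * Z[:, i]                              # subtract the projection
    Z[:, j] = zj

# Step 3: regress y on the final residual z_p alone.
beta_p = Z[:, p] @ y / (Z[:, p] @ Z[:, p])

# Compare with the last coefficient of the full least-squares fit.
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.isclose(beta_p, beta_full[-1]))  # True
```

On random well-conditioned data the two agree to machine precision, which is what point 3 asserts.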

I can't prove point 3 — that is, that beta_p = <y, z_p>/<z_p, z_p> equals the coefficient of x_p in the full least-squares regression of y on x_0, ..., x_p.

submitted by Kiuhnm
