Tuesday, December 27, 2016

More on Orthogonal Regression

Some time ago I wrote a post about orthogonal regression. This is where we fit a regression line so that we minimize the sum of the squares of the orthogonal (rather than vertical) distances from the data points to the regression line.
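For concreteness, here is a minimal sketch of that fit in the single-regressor case (my own illustration in Python, not code from the original post): the fitted line is forced through the centroid of the data, and its direction is the first principal component of the centred (x, y) pairs.

```python
# Sketch of orthogonal (total least squares) regression with one regressor.
# Assumed setup: NumPy only; variable names are illustrative.
import numpy as np

def orthogonal_regression(x, y):
    """Return (intercept, slope) of the line minimizing the sum of squared
    perpendicular distances from the data points to the line."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xbar, ybar = x.mean(), y.mean()
    # Each row of Z is one centred observation (x_i - xbar, y_i - ybar).
    Z = np.column_stack((x - xbar, y - ybar))
    # The line's direction is the leading right singular vector of Z,
    # i.e. the first principal component of the centred data.
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    vx, vy = Vt[0]
    slope = vy / vx
    intercept = ybar - slope * xbar
    return intercept, slope

# Illustrative use with made-up data:
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=200)
print(orthogonal_regression(x, y))
```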

Subsequently, I received the following email comment:
"Thanks for this blog post. I enjoyed reading it. I'm wondering how straightforward you think this would be to extend orthogonal regression to the case of two independent variables? Assume both independent variables are meaningfully measured in the same units."
Well, we don't have to make the latter assumption about units in order to answer this question. And we don't have to limit ourselves to just two regressors. Let's suppose that we have p of them.

In fact, I hinted at the answer to the question posed above towards the end of my earlier post, when I said, "Finally, it will come as no surprise to hear that there's a close connection between orthogonal least squares and principal components analysis."

What was I referring to, exactly?
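By way of a preview, here is a hedged sketch of the p-regressor case (again my own illustration with made-up data, resting on the standard total least squares result rather than on anything worked out in the post): the fitted hyperplane passes through the centroid of the data, and its normal vector is the principal component of the combined (X, y) data with the smallest variance.

```python
# Sketch of orthogonal regression with p regressors via the smallest
# principal component of the combined (X, y) data. Names are illustrative.
import numpy as np

def orthogonal_regression_p(X, y):
    """Return (intercept, coefs) of the hyperplane minimizing the sum of
    squared orthogonal distances in the (p + 1)-dimensional data space."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    Z = np.column_stack((X, y))
    Zc = Z - Z.mean(axis=0)              # centre every variable
    _, _, Vt = np.linalg.svd(Zc, full_matrices=False)
    normal = Vt[-1]                      # direction of smallest variance
    coefs = -normal[:-1] / normal[-1]    # rescale so the coefficient on y is -1
    intercept = y.mean() - X.mean(axis=0) @ coefs
    return intercept, coefs

# Illustrative use with made-up data (p = 2):
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = 0.5 + X @ np.array([1.5, -2.0]) + rng.normal(scale=0.3, size=300)
print(orthogonal_regression_p(X, y))
```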