Abstract

The method of ``Total Least Squares'' is proposed as a more natural way (than ordinary least squares) to approximate the data if both the matrix and the right-hand side are contaminated by ``errors''. In this tutorial note, we give an elementary unified view of ordinary and total least squares problems and their solution. As the geometry underlying the problem setting greatly contributes to the understanding of the solution, we introduce least squares problems and their generalization via interpretations in both column space and (the dual) row space, and we shall use both approaches to clarify the solution. After a study of the least squares approximation for simple regression, we introduce the notion of approximation in the sense of ``Total Least Squares (TLS)'' for this problem and deduce its solution in a natural way. Next we consider ordinary and total least squares approximations for multiple regression problems, and we study the solution of a general overdetermined system of equations in the TLS sense. In a final section we consider generalizations with multiple right-hand sides and with ``frozen'' columns. We remark that a TLS approximation need not exist in general; however, the line (or hyperplane) of best approximation in the TLS sense for a regression problem always exists.
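To make the comparison in the abstract concrete, the following is a minimal sketch of the classical SVD-based TLS solution of an overdetermined system Ax ≈ b, assuming NumPy; the function name tls and the generated test data are illustrative and not taken from the paper. The check on the last component of the singular vector mirrors the remark that a TLS approximation need not exist.

    import numpy as np

    def tls(A, b):
        """Solve Ax ≈ b in the total-least-squares sense via the SVD of the augmented matrix [A | b]."""
        n = A.shape[1]
        C = np.column_stack([A, b])      # augmented matrix [A | b]
        _, _, Vt = np.linalg.svd(C)      # rows of Vt are right singular vectors
        v = Vt[-1]                       # right singular vector for the smallest singular value
        if np.isclose(v[n], 0.0):
            # In this degenerate case the TLS solution does not exist.
            raise ValueError("TLS solution does not exist")
        return -v[:n] / v[n]             # x such that [x; -1] spans the null direction of the corrected system

    # Illustrative usage: errors in both the matrix and the right-hand side.
    rng = np.random.default_rng(0)
    A0 = rng.normal(size=(20, 2))                     # "true" data matrix
    x_true = np.array([1.5, -2.0])
    A = A0 + 0.05 * rng.normal(size=A0.shape)         # observed matrix, contaminated by errors
    b = A0 @ x_true + 0.05 * rng.normal(size=20)      # observed right-hand side, contaminated by errors
    print(tls(A, b))                                  # TLS estimate of x_true
    print(np.linalg.lstsq(A, b, rcond=None)[0])       # ordinary least squares estimate, for comparison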

Description

[math/9805076] An Introduction to Total Least Squares
