@misc{degroen1998introduction,
abstract = {The method of ``Total Least Squares'' is proposed as a more natural way (than
ordinary least squares) to approximate the data if both the matrix and the
right-hand side are contaminated by ``errors''. In this tutorial note, we give
an elementary unified view of ordinary and total least squares problems and
their solution. As the geometry underlying the problem setting greatly
contributes to the understanding of the solution, we introduce least squares
problems and their generalization via interpretations in both column space and
(the dual) row space, and we use both approaches to clarify the solution.
After a study of the least squares approximation for simple regression, we
introduce the notion of approximation in the sense of ``Total Least Squares
(TLS)'' for this problem and deduce its solution in a natural way. Next we
consider ordinary and total least squares approximations for multiple
regression problems, and we study the solution of a general overdetermined
system of equations in the TLS sense. In a final section we consider
generalizations with multiple right-hand sides and with ``frozen'' columns. We
remark that a TLS approximation need not exist in general; however, the line
(or hyperplane) of best approximation in the TLS sense for a regression problem
always exists.},
added-at = {2018-07-13T23:01:51.000+0200},
author = {de Groen, P. P. N.},
biburl = {https://www.bibsonomy.org/bibtex/2baacd0e844a0e5189fe24efded7e45a9/analyst},
description = {[math/9805076] An Introduction to Total Least Squares},
interhash = {415f69e0d9e3a36bb2316b12464166f3},
intrahash = {baacd0e844a0e5189fe24efded7e45a9},
keywords = {1998 arxiv optimization tutorial},
note = {cite arxiv:math/9805076},
timestamp = {2018-07-13T23:01:51.000+0200},
title = {An Introduction to Total Least Squares},
url = {http://arxiv.org/abs/math/9805076},
year = 1998
}