Types: N/A
Examples: N/A
Constructions: 1.4 Two Norms in 2D
Generalizations: N/A

Properties: N/A
Sufficiencies: N/A
Questions: N/A

The Full-Rank Least-Squares Problem

Let $m, n \in \mathbb{N}$ with $m \geq n$. Let $A \in \mathbb{R}^{m \times n}$ be a given full-rank matrix and $b \in \mathbb{R}^m$ be a given vector. Then, the full-rank least-squares problem is to find all $x \in \mathbb{R}^n$ that minimize the 2-norm of the residual vector:

$$\| A x - b \|_2$$
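As a quick numerical illustration, the following NumPy sketch solves a small instance of this problem; the matrix, right-hand side, and random seed are made up for illustration and are not part of this note.

```python
import numpy as np

# Made-up over-determined problem: m = 4 equations, n = 2 unknowns.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))   # full rank with probability 1
b = rng.standard_normal(4)

# np.linalg.lstsq minimizes ||Ax - b||_2 over all x in R^n.
x, _, rank, _ = np.linalg.lstsq(A, b, rcond=None)

print("rank of A:", rank)                              # expect n = 2
print("minimal residual:", np.linalg.norm(A @ x - b))  # > 0 in general
```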

Just as with 1 The Matrix-Vector Multiplication Problem and 2A The Nonsingular Linear-Systems Problem, we can describe the full-rank least-squares problem using the function $f : \mathbb{R}^n \to \mathbb{R}^m$ with

$$f(x) = Ax = \sum_{k=1}^{n} x_k \, A(:,k)$$
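To see the column-combination view in code, a small NumPy check (with a made-up $A$ and $x$) confirms that $Ax$ matches $\sum_{k=1}^{n} x_k A(:,k)$:

```python
import numpy as np

# Made-up 3x2 example: f(x) = Ax as a combination of the columns A(:, k).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

direct = A @ x                                                 # f(x) = Ax
column_combo = sum(x[k] * A[:, k] for k in range(A.shape[1]))  # sum_k x_k A(:,k)

print(np.allclose(direct, column_combo))                       # True
```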

Based on this definition, we have:

The full-rank least-squares problem is a backward problem because we start with the function description (defined by the matrix $A$) and we are given one specific output value $b$ in the codomain of $f$. We are trying to find all possible input values $x$ in the domain that produce an output value $f(x)$ "as close as possible" to the vector $b$.


Using the definition of the least-squares problem, we can classify every instance of 2A The Nonsingular Linear-Systems Problem as a least-squares problem. For linear-systems problems, since $b$ lies in the range of $f$, we know that

$$\min_{x \in \mathbb{R}^n} \| b - A x \|_2 = 0$$

with the minimum attained at the unique solution of $Ax = b$. However, not all least-squares problems are linear-systems problems. If $b \in \mathbb{R}^m$ with $b \notin \operatorname{Rng}(f)$, then $f(x) \neq b$ for all $x \in \mathbb{R}^n$. This case is the focus of the least-squares problem. Thus, the least-squares problem is a generalization of the linear-systems problem that allows us to create "best-fit" approximations to noisy data.
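Both cases can be checked numerically. In the sketch below (the matrix and vectors are invented for illustration), a right-hand side in $\operatorname{Rng}(f)$ yields a zero minimal residual, while one outside $\operatorname{Rng}(f)$ yields a strictly positive one:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])              # full rank, m = 3 > n = 2

# Case 1: b in Rng(f) -- the minimal residual is zero, as for a
# consistent linear system.
b_in = A @ np.array([2.0, 3.0])
x1, *_ = np.linalg.lstsq(A, b_in, rcond=None)
print(np.linalg.norm(b_in - A @ x1))    # ~0

# Case 2: b not in Rng(f) -- no x satisfies f(x) = b, so the best-fit
# residual is strictly positive.
b_out = b_in + np.array([0.0, 0.0, 1.0])
x2, *_ = np.linalg.lstsq(A, b_out, rcond=None)
print(np.linalg.norm(b_out - A @ x2))   # > 0
```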