Singular Value Decomposition

An Example

Calculating a singular value decomposition

Consider the following matrix:
(Lay 7.4, Example 3)

In[1]:=

Clear[a, aTa, evals, evecs, Λ, V, v1, v2, v3, Σ, σ1, σ2, σ3, U, u1, u2] 

a = ({{4, 11, 14}, {8, 7, -2}}) ;

Find the eigendata for A^TA.

In[3]:=

aTa = Transpose[a] . a ; 

evals = Eigenvalues[aTa] ;

Λ = DiagonalMatrix[evals] ;

%//MatrixForm

Out[6]//MatrixForm=

( {{360, 0, 0}, {0, 90, 0}, {0, 0, 0}} )

In[7]:=

evecs = Eigenvectors[aTa] ;

%//MatrixForm

Out[8]//MatrixForm=

( {{1, 2, 2}, {-2, -1, 2}, {2, -2, 1}} )

Construct V.

The columns of the matrix V are the orthonormal eigenvectors of A^TA, taken in the proper order.

In[9]:=

v1 = evecs[[1]]/Norm[evecs[[1]]] ;

v2 = evecs[[2]]/Norm[evecs[[2]]] ;

v3 = evecs[[3]]/Norm[evecs[[3]]] ; 

V = Transpose[{v1, v2, v3}] ;

%//MatrixForm

Out[13]//MatrixForm=

( {{1/3, -2/3, 2/3}, {2/3, -1/3, -2/3}, {2/3, 2/3, 1/3}} )

Check that the columns of V are orthonormal.

In[14]:=

Transpose[V] . V ;

%//MatrixForm

Out[15]//MatrixForm=

( {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}} )

Check the diagonalization.

In[16]:=

aTaV . Λ . Inverse[V]

Out[16]=

True

Construct Σ.

In[17]:=

<<LinearAlgebra`MatrixManipulation`

In[18]:=

singularValues = {σ1, σ2, σ3} = evals^(1/2) 

Σ = ZeroMatrix[2, 3] ;

Σ[[1, 1]] = σ1 ;

Σ[[2, 2]] = σ2 ;

Σ//MatrixForm

Out[18]=

{6 10^(1/2), 3 10^(1/2), 0}

Out[22]//MatrixForm=

( {{6 10^(1/2), 0, 0}, {0, 3 10^(1/2), 0}} )

Construct U.

A has rank 2. The two columns of U are the normalized vectors A.v1 and A.v2.

In[23]:=

u1 = (1/σ1) a . v1 ;

u2 = (1/σ2) a . v2 ; 

U = Transpose[{u1, u2}] ;

%//MatrixForm

Out[26]//MatrixForm=

( {{3/10^(1/2), 1/10^(1/2)}, {1/10^(1/2), -3/10^(1/2)}} )

Check the singular value decomposition, A == U.Σ.V^T.

In[27]:=

Print[a//MatrixForm, " == ", U//MatrixForm, ".", Σ//MatrixForm, ".", Transpose[V]//MatrixForm] ; 

a == U . Σ . Transpose[V]

( {{4, 11, 14}, {8, 7, -2}} )  ( {{3/10^(1/2 ... )  .  ( {{1/3, 2/3, 2/3}, {-2/3, -1/3, 2/3}, {2/3, -2/3, 1/3}} )

Out[28]=

True

Compare with Mathematica's singular value decomposition.

Mathematica's numerical result.

In[29]:=

a = ({{4., 11, 14}, {8, 7, -2}}) ;

{u, σ, v} = SingularValueDecomposition[a] ; 

Print[a//MatrixForm, " == ", u//MatrixForm, ".", σ//MatrixForm, ".", Transpose[v]//MatrixForm] ;

a == u . σ . Transpose[v]

( {{4., 11, 14}, {8, 7, -2}} )  ( {{-0.94868 ... 66667, -0.666667}, {-0.666667, -0.333333, 0.666667}, {-0.666667, 0.666667, -0.333333}} )

Out[32]=

True

Our numerical result.
These two results agree up to a sign change on some rows and columns.

In[33]:=

Print[a//MatrixForm, " == ", U//MatrixForm//N, ".", Σ//MatrixForm//N, ".", Transpose[V]//MatrixForm//N] ; 

a == U . Σ . Transpose[V]

( {{4., 11, 14}, {8, 7, -2}} )  ( {{0.948683 ... .666667, 0.666667}, {-0.666667, -0.333333, 0.666667}, {0.666667, -0.666667, 0.333333}} )

Out[34]=

True
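The agreement up to signs can also be verified in Python (a sketch): the hand-built factors reconstruct A exactly, and the built-in SVD returns the same singular values, with each pair (u_i, v_i) determined only up to a simultaneous sign flip.

```python
import numpy as np

A = np.array([[4., 11., 14.],
              [8., 7., -2.]])

# Hand-built factors from the computation above.
U = np.array([[3., 1.],
              [1., -3.]]) / np.sqrt(10)
Sigma = np.hstack([np.diag([6 * np.sqrt(10), 3 * np.sqrt(10)]),
                   np.zeros((2, 1))])          # 2x3
V = np.array([[1., -2., 2.],
              [2., -1., -2.],
              [2., 2., 1.]]) / 3.

# The hand computation reconstructs A exactly.
assert np.allclose(A, U @ Sigma @ V.T)

# Built-in SVD: identical singular values; the vectors may differ in sign.
Ub, s, VTb = np.linalg.svd(A)
assert np.allclose(s, [6 * np.sqrt(10), 3 * np.sqrt(10)])
```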

The Four Fundamental Subspaces

Consider the matrix of the previous example, ...

In[35]:=

a = ({{4, 11, 14}, {8, 7, -2}}) ;

... and its singular value decomposition, A == U.Σ.V^T.

In[36]:=

Print[a//MatrixForm, " == ", U//MatrixForm, ".", Σ//MatrixForm, ".", Transpose[V]//MatrixForm] ; 

a == U . Σ . Transpose[V]

( {{4, 11, 14}, {8, 7, -2}} )  ( {{3/10^(1/2 ... )  .  ( {{1/3, 2/3, 2/3}, {-2/3, -1/3, 2/3}, {2/3, -2/3, 1/3}} )

Out[37]=

True

a is an m×n matrix with rank r.

In[38]:=

m = 2 ;

n = 3 ;

r = 2 ;

The singular value decomposition of a displays orthogonal bases for the four fundamental subspaces of a as follows:
(See Lay 7.4, p479)

In[41]:=

(* α : {v1, v2} = {v_1, ..., v_r}       basis for Row a *)
(* β : {v3}     = {v_(r+1), ..., v_n}   basis for Nul a *)
(* γ : {u1, u2} = {u_1, ..., u_r}       basis for Col a *)
(* δ : {}       = {u_(r+1), ..., u_m}   basis for Nul a^T  (empty, since r == m) *)
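In Python, the same four bases can be read off from a full SVD (a sketch; NumPy's signs may differ from the hand computation, but the subspaces spanned are the same):

```python
import numpy as np

A = np.array([[4., 11., 14.],
              [8., 7., -2.]])
m, n, r = 2, 3, 2

U, s, VT = np.linalg.svd(A)   # full SVD: U is m x m, VT is n x n
V = VT.T

row_basis   = V[:, :r]        # v_1..v_r      span Row A
null_basis  = V[:, r:]        # v_(r+1)..v_n  span Nul A
col_basis   = U[:, :r]        # u_1..u_r      span Col A
lnull_basis = U[:, r:]        # u_(r+1)..u_m  span Nul A^T (empty here: r == m)

# Sanity checks: A annihilates Nul A; Nul A^T is trivial.
assert np.allclose(A @ null_basis, 0)
assert lnull_basis.shape == (m, 0)
```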

The Moore-Penrose Inverse and Least-Squares Solutions

Consider the matrix of the previous example, ...

In[42]:=

a = ({{4, 11, 14}, {8, 7, -2}}) ;

... and its singular value decomposition, A == U.Σ.V^T.

In[43]:=

Print[a//MatrixForm, " == ", U//MatrixForm, ".", Σ//MatrixForm, ".", Transpose[V]//MatrixForm] ; 

a == U . Σ . Transpose[V]

( {{4, 11, 14}, {8, 7, -2}} )  ( {{3/10^(1/2 ... )  .  ( {{1/3, 2/3, 2/3}, {-2/3, -1/3, 2/3}, {2/3, -2/3, 1/3}} )

Out[44]=

True

a is an m×n matrix with rank r.

In[45]:=

m = 2 ;

n = 3 ;

r = 2 ;

The Moore-Penrose Inverse, A+, or pseudoinverse of A is computed from its singular value decomposition, A == U.Σ.V^T, as follows:
(See Lay 7.4, p479)

In[48]:=

Ur = U[[All, {1, 2}]] ;

Dr = DiagonalMatrix[{σ1, σ2}] ;

Vr = V[[All, {1, 2}]] ; 

aPlus = Vr . Inverse[Dr] . Transpose[Ur] ; 

Print[aPlus//MatrixForm, "  ", Vr//MatrixForm, ".", Inverse[Dr]//MatrixForm, ".", Transpose[Ur]//MatrixForm] ;

( {{-1/180, 13/180}, {1/45, 2/45}, {1/18, -1/18}} )   ᡖ ... ;)  .  ( {{3/10^(1/2), 1/10^(1/2)}, {1/10^(1/2), -3/10^(1/2)}} )

Suppose we wish to solve the least-squares problem A.x == b.

In[53]:=

(* a . x == b *)

Let

In[54]:=

(* xHat = aPlus . b *)

Then,

In[55]:=

(* a . xHat == (Ur . Dr . Vr^T) . (Vr . Dr^(-1) . Ur^T . b) == Ur . Dr . Dr^(-1) . Ur^T . b == Ur . Ur^T . b == bHat *)

where bHat is the orthogonal projection of b onto Col A == Col Ur.
Therefore, xHat is a least-squares solution of A.x == b.

It can be shown that this xHat, obtained via the Moore-Penrose inverse of A,
is the shortest least-squares solution of A.x == b.
(See Lay 7.4, p480)
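A Python sketch of the same reduced-SVD pseudoinverse, checked against `numpy.linalg.pinv` (variable names are mine, not from the notebook):

```python
import numpy as np

A = np.array([[4., 11., 14.],
              [8., 7., -2.]])
r = 2

U, s, VT = np.linalg.svd(A)
Ur, Dr, Vr = U[:, :r], np.diag(s[:r]), VT.T[:, :r]

# A+ = Vr . Dr^(-1) . Ur^T  (pseudoinverse from the reduced SVD).
A_plus = Vr @ np.linalg.inv(Dr) @ Ur.T
assert np.allclose(A_plus, np.linalg.pinv(A))

# xHat = A+ . b is a least-squares solution: A.xHat is the
# projection of b onto Col A (here b itself, since A has full row rank).
b = np.array([1., 2.])
x_hat = A_plus @ b
assert np.allclose(A @ x_hat, Ur @ Ur.T @ b)
```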

Another Example

Calculating another singular value decomposition

Consider the following matrix:
(Lay 7.4, Example 4)

In[56]:=

Clear[a, aTa, evals, evecs, Λ, V, v1, v2, Σ, σ1, σ2, U, u1, u2, u3] 

a = ({{1, -1}, {-2, 2}, {2, -2}}) ;

Find the eigendata for A^TA.

In[58]:=

aTa = Transpose[a] . a ; 

evals = Eigenvalues[aTa] ;

Λ = DiagonalMatrix[evals] ;

%//MatrixForm

Out[61]//MatrixForm=

( {{18, 0}, {0, 0}} )

In[62]:=

evecs = Eigenvectors[aTa] ;

%//MatrixForm

Out[63]//MatrixForm=

( {{-1, 1}, {1, 1}} )

Construct V.

The columns of the matrix V are the orthonormal eigenvectors of A^TA, taken in the proper order.

In[64]:=

v1 = evecs[[1]]/Norm[evecs[[1]]] ;

v2 = evecs[[2]]/Norm[evecs[[2]]] ; 

V = Transpose[{v1, v2}] ;

%//MatrixForm

Out[67]//MatrixForm=

( {{-1/2^(1/2), 1/2^(1/2)}, {1/2^(1/2), 1/2^(1/2)}} )

Check that the columns of V are orthonormal.

In[68]:=

Transpose[V] . V ;

%//MatrixForm

Out[69]//MatrixForm=

( {{1, 0}, {0, 1}} )

Check the diagonalization.

In[70]:=

aTaV . Λ . Inverse[V]

Out[70]=

True

Construct Σ.

In[71]:=

<<LinearAlgebra`MatrixManipulation`

In[72]:=

singularValues = {σ1, σ2} = evals^(1/2) 

Σ = ZeroMatrix[3, 2] ;

Σ[[1, 1]] = σ1 ;

Σ//MatrixForm

Out[72]=

{3 2^(1/2), 0}

Out[75]//MatrixForm=

( {{3 2^(1/2), 0}, {0, 0}, {0, 0}} )

Construct U.

Calculate u1

In[76]:=

u1 = (1/σ1) a . v1

Out[76]=

{-1/3, 2/3, -2/3}

Extend {u1} to an orthonormal basis of R^3.

In[77]:=

u1 = (1/σ1) a . v1 ; 

e1 = {1, 0, 0} ;

f1 = e1 - (e1 . u1) u1 ;

u2 = f1/Norm[f1] ; 

e2 = {0, 1, 0} ;

f2 = e2 - (e2 . u1) u1 - (e2 . u2) u2 ;

u3 = f2/Norm[f2] ; 

U = Transpose[{u1, u2, u3}] ;

%//MatrixForm

Out[85]//MatrixForm=

( {{-1/3, (2 2^(1/2))/3, 0}, {2/3, 1/(3 2^(1/2)), 1/2^(1/2)}, {-2/3, -1/(3 2^(1/2)), 1/2^(1/2)}} )

Check that the columns of U are orthonormal.

In[86]:=

Transpose[U] . U ;

%//MatrixForm

Out[87]//MatrixForm=

( {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}} )

Check the singular value decomposition, A == U.Σ.V^T.

In[88]:=

Print[a//MatrixForm, " == ", U//MatrixForm, ".", Σ//MatrixForm, ".", Transpose[V]//MatrixForm] ;

a == U . Σ . Transpose[V]

( {{1, -1}, {-2, 2}, {2, -2}} )  ( {{-1/3, ( ... 2370;)  .  ( {{-1/2^(1/2), 1/2^(1/2)}, {1/2^(1/2), 1/2^(1/2)}} )

Out[89]=

True

Compare with Mathematica's singular value decomposition.

Mathematica's numerical result.

In[90]:=

a = ({{1., -1}, {-2, 2}, {2, -2}}) ; 

{u, σ, v} = SingularValueDecomposition[a] ; 

Print[a//MatrixForm, " == ", u//MatrixForm, ".", σ//MatrixForm, ".", Transpose[v]//MatrixForm] ;

a == u . σ . Transpose[v]

( {{1., -1}, {-2, 2}, {2, -2}} )  ( {{-0.333 ...  )  .  ( {{-0.707107, 0.707107}, {0.707107, 0.707107}} )

Out[93]=

True

Our numerical result.
These agree up to the precision of the computer arithmetic that is involved.

In[94]:=

Print[a//MatrixForm, " == ", U//MatrixForm//N, ".", Σ//MatrixForm//N, ".", Transpose[V]//MatrixForm//N] ;

a == U . Σ . Transpose[V]

( {{1., -1}, {-2, 2}, {2, -2}} )  ( {{-0.333 ...  )  .  ( {{-0.707107, 0.707107}, {0.707107, 0.707107}} )

Out[95]=

True

The Four Fundamental Subspaces

Consider the matrix of the previous example, ...

In[96]:=

a = ({{1, -1}, {-2, 2}, {2, -2}}) ;

... and its singular value decomposition, A == U.Σ.V^T.

In[97]:=

Print[a//MatrixForm, " == ", U//MatrixForm, ".", Σ//MatrixForm, ".", Transpose[V]//MatrixForm] ; 

a == U . Σ . Transpose[V]

( {{1, -1}, {-2, 2}, {2, -2}} )  ( {{-1/3, ( ... 2370;)  .  ( {{-1/2^(1/2), 1/2^(1/2)}, {1/2^(1/2), 1/2^(1/2)}} )

Out[98]=

True

a is an m×n matrix with rank r.

In[99]:=

m = 3 ;

n = 2 ;

r = 1 ;

The singular value decomposition of a displays orthogonal bases for the four fundamental subspaces of a as follows:
(See Lay 7.4, p479)

In[102]:=

(* α : {v1}     = {v_1, ..., v_r}       basis for Row a *)
(* β : {v2}     = {v_(r+1), ..., v_n}   basis for Nul a *)
(* γ : {u1}     = {u_1, ..., u_r}       basis for Col a *)
(* δ : {u2, u3} = {u_(r+1), ..., u_m}   basis for Nul a^T *)

The Moore-Penrose Inverse and Least-Squares Solutions

Consider the matrix of the previous example, ...

In[103]:=

a = ({{1, -1}, {-2, 2}, {2, -2}}) ;

... and its singular value decomposition, A == U.Σ.V^T.

In[104]:=

Print[a//MatrixForm, " == ", U//MatrixForm, ".", Σ//MatrixForm, ".", Transpose[V]//MatrixForm] ; 

a == U . Σ . Transpose[V]

( {{1, -1}, {-2, 2}, {2, -2}} )  ( {{-1/3, ( ... 2370;)  .  ( {{-1/2^(1/2), 1/2^(1/2)}, {1/2^(1/2), 1/2^(1/2)}} )

Out[105]=

True

a is an m×n matrix with rank r.

In[106]:=

m = 3 ;

n = 2 ;

r = 1 ;

The Moore-Penrose Inverse, A+, or pseudoinverse of A is computed from its singular value decomposition, A == U.Σ.V^T, as follows:
(See Lay 7.4, p479)

In[109]:=

Ur = Map[List, U[[All, 1]]] ;

Dr = {{σ1}} ;

Vr = Map[List, V[[All, 1]]] ; 

aPlus = Vr . Inverse[Dr] . Transpose[Ur] ; 

Print[aPlus//MatrixForm, "  ", Vr//MatrixForm, ".", Inverse[Dr]//MatrixForm, ".", Transpose[Ur]//MatrixForm] ;

( {{1/18, -1/9, 1/9}, {-1/18, 1/9, -1/9}} )    (> ...  {{1/(3 2^(1/2))}} )  .  ( {{-1/3, 2/3, -2/3}} )

Suppose we wish to solve the least-squares problem A.x == b.

In[114]:=

(* a . x == b *)

Let

In[115]:=

(* xHat = aPlus . b *)

Then,

In[116]:=

(* a . xHat == (Ur . Dr . Vr^T) . (Vr . Dr^(-1) . Ur^T . b) == Ur . Dr . Dr^(-1) . Ur^T . b == Ur . Ur^T . b == bHat *)

where bHat is the orthogonal projection of b onto Col A == Col Ur.
Therefore, xHat is a least-squares solution of A.x == b.

It can be shown that this xHat, obtained via the Moore-Penrose inverse of A,
is the shortest least-squares solution of A.x == b.
(See Lay 7.4, p480)
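For this rank-1 matrix the reduced factors are single columns. The same pseudoinverse in Python (a sketch, checked against the exact values computed above):

```python
import numpy as np

A = np.array([[1., -1.],
              [-2., 2.],
              [2., -2.]])
r = 1

U, s, VT = np.linalg.svd(A)
# Reduced factors: one column of U and V, one singular value.
Ur, Dr, Vr = U[:, :r], np.diag(s[:r]), VT.T[:, :r]

A_plus = Vr @ np.linalg.inv(Dr) @ Ur.T
assert np.allclose(A_plus, np.linalg.pinv(A))

# Matches the exact result aPlus = {{1/18, -1/9, 1/9}, {-1/18, 1/9, -1/9}}.
assert np.allclose(A_plus, np.array([[1., -2., 2.],
                                     [-1., 2., -2.]]) / 18.)
```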


Created by Mathematica (April 5, 2005)