Singular Value Decomposition (SVD)

@(LinearAlgebra)

SVD

Given a rectangular matrix $A \in \mathbb{R}^{m \times n}$, its singular value decomposition is written as

$$A = U \Sigma V^T,$$

where

  • $U \in \mathbb{R}^{m \times m}$, $V \in \mathbb{R}^{n \times n}$: matrices with orthonormal columns, providing an orthonormal basis of Col $A$ and Row $A$, respectively.
  • $\Sigma \in \mathbb{R}^{m \times n}$: a diagonal matrix whose entries are in decreasing order, i.e., $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_{\min(m,n)} \ge 0$.

Basic Form of SVD

Given a rectangular matrix $A \in \mathbb{R}^{m \times n}$ where $m \ge n$, SVD gives

$$A = U \Sigma V^T, \quad U \in \mathbb{R}^{m \times n}, \ \Sigma \in \mathbb{R}^{n \times n}, \ V \in \mathbb{R}^{n \times n}.$$

SVD as Sum of Rank-1 Outer Products

$A$ can also be represented as a sum of rank-1 outer products:

$$A = \sum_{i=1}^{n} \sigma_i u_i v_i^T.$$
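The rank-1 expansion can be checked directly (a sketch on a random matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)  # U: 5x3, Vt: 3x3

# A as a sum of rank-1 outer products sigma_i * u_i v_i^T
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(A, A_sum)
```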

Reduced Form of SVD

Keeping only the $r = \operatorname{rank} A$ nonzero singular values, $A$ can also be represented as the sum of $r$ outer products:

$$A = \sum_{i=1}^{r} \sigma_i u_i v_i^T = U_r \Sigma_r V_r^T,$$

where $U_r \in \mathbb{R}^{m \times r}$, $\Sigma_r \in \mathbb{R}^{r \times r}$, and $V_r \in \mathbb{R}^{n \times r}$.

Another Perspective of SVD

  • We can easily find two orthonormal basis sets, $\{u_1, \dots, u_r\}$ for Col $A$ and $\{v_1, \dots, v_r\}$ for Row $A$, by using, say, Gram–Schmidt orthogonalization.
  • Are these unique orthonormal basis sets?
  • No. Then, can we jointly find them such that

$$A v_i = \sigma_i u_i, \quad i = 1, \dots, r?$$

  • Let us denote $U = [u_1 \ \cdots \ u_r]$, $V = [v_1 \ \cdots \ v_r]$, and $\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_r)$.

  • Consider $AV = U\Sigma$, which gives $A = U \Sigma V^{-1}$.
  • $V^{-1} = V^T$ since $V$ has orthonormal columns.
  • Thus, $A = U \Sigma V^T$.
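The jointly chosen bases satisfying $Av_i = \sigma_i u_i$ can be verified numerically (a sketch; the matrix is random):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

# The singular vectors satisfy A v_i = sigma_i u_i, i.e. AV = U Sigma
assert np.allclose(A @ V, U @ np.diag(s))
```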

Computing SVD

  • First, we form $A^T A$ and $A A^T$ and compute the eigendecomposition of each: $A^T A = V \Lambda_1 V^T$ and $A A^T = U \Lambda_2 U^T$.
  • Can we find the following?
    1. Orthogonal eigenvector matrices $U$ and $V$
    2. Eigenvalues in $\Lambda_1$ and $\Lambda_2$ that are all nonnegative
    3. Nonzero eigenvalues that are shared by $\Lambda_1$ and $\Lambda_2$
  • Yes, since $A^T A$ and $A A^T$ are symmetric positive (semi-)definite.
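The claims above can be sketched numerically: the eigenvalues of $A^TA$ and $AA^T$ are nonnegative, their nonzero eigenvalues coincide, and they equal the squared singular values (`np.linalg.eigh` is used since both Gram matrices are symmetric):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))

# Eigendecompositions of the two Gram matrices (eigh returns ascending eigenvalues)
evals_v, V = np.linalg.eigh(A.T @ A)   # 3x3
evals_u, U = np.linalg.eigh(A @ A.T)   # 5x5

# Singular values of A, descending
s = np.linalg.svd(A, compute_uv=False)

assert np.all(evals_v >= -1e-10)                         # nonnegative
assert np.allclose(np.sort(evals_v)[::-1], s**2)         # Lambda_1 = Sigma^2
assert np.allclose(np.sort(evals_u)[::-1][:3], s**2)     # nonzero part of Lambda_2 shared
```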

Symmetric Positive Definite Matrices and Spectral Decomposition

• If $S$ is symmetric and positive-definite, then the spectral decomposition $S = Q \Lambda Q^T$ will have all positive eigenvalues:

$$\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n), \quad \lambda_i > 0,$$

where $Q$ is an orthogonal matrix ($Q^T Q = Q Q^T = I$).
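A sketch of the spectral decomposition of a symmetric positive-definite matrix (the construction $BB^T + 4I$ is an illustrative way to guarantee positive-definiteness):

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((4, 4))
S = B @ B.T + 4 * np.eye(4)      # symmetric positive-definite by construction

evals, Q = np.linalg.eigh(S)     # Q orthogonal, evals real

assert np.all(evals > 0)                          # all eigenvalues positive
assert np.allclose(Q @ np.diag(evals) @ Q.T, S)   # S = Q Lambda Q^T
assert np.allclose(Q.T @ Q, np.eye(4))            # Q^T Q = I
```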

$A^T A$ and $A A^T$ are symmetric positive (semi-)definite!

  • Symmetric: $(A^T A)^T = A^T A$ and $(A A^T)^T = A A^T$.

  • Positive (semi-)definite: $x^T (A^T A) x = \|Ax\|^2 \ge 0$ for any $x$.

  • Thus, we can find

    1. Orthogonal eigenvector matrices $U$ and $V$.
    2. Eigenvalues in $\Lambda_1$ and $\Lambda_2$ that are all nonnegative.

Things to Note

  • Given any rectangular matrix $A \in \mathbb{R}^{m \times n}$, its SVD always exists.
  • Given a square matrix $A$, its eigendecomposition does not always exist, but its SVD always exists.
  • Given a square, symmetric positive (semi-)definite matrix $A$, its eigendecomposition always exists, and it is actually the same as its SVD.
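The last point can be sketched numerically: for a symmetric positive semi-definite matrix, the eigenvalues coincide with the singular values (the construction $BB^T$ is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(6)
B = rng.standard_normal((4, 4))
S = B @ B.T                       # symmetric positive semi-definite

# Eigendecomposition, reordered descending to match SVD's convention
evals, Q = np.linalg.eigh(S)
evals, Q = evals[::-1], Q[:, ::-1]

U, s, Vt = np.linalg.svd(S)

# For an SPSD matrix, eigenvalues equal singular values
# (and the eigenvectors match the singular vectors up to sign)
assert np.allclose(evals, s)
```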
