Complete orthogonal decomposition

In linear algebra, the complete orthogonal decomposition is a matrix decomposition.[1][2] It is similar to the singular value decomposition, but typically somewhat[3] cheaper to compute and in particular much cheaper and easier to update when the original matrix is slightly altered.[1]

Specifically, the complete orthogonal decomposition factorizes an arbitrary complex matrix $A$ into a product of three matrices, $A = U T V^*$, where $U$ and $V^*$ are unitary matrices and $T$ is a triangular matrix. For a matrix $A$ of rank $r$, the triangular matrix $T$ can be chosen such that only its top-left $r \times r$ block is nonzero, making the decomposition rank-revealing.

For a matrix $A$ of size $m \times n$, assuming $m \geq n$, the complete orthogonal decomposition requires $O(mn^2)$ floating point operations and $O(mn)$ auxiliary memory to compute, similar to other rank-revealing decompositions.[1] Crucially, however, if a row or column is added or removed, its decomposition can be updated in $O(mn)$ operations.[1]

Because of its form, $A = U T V^*$, the decomposition is also known as the UTV decomposition.[4] Depending on whether a left (lower) triangular or right (upper) triangular matrix is used in place of $T$, it is also referred to as the ULV decomposition[3] or the URV decomposition,[5] respectively.

Construction

The UTV decomposition is usually[6][7] computed by means of a pair of QR decompositions: one QR decomposition is applied to the matrix from the left, which yields the unitary factor $U$, and another is applied from the right, which yields $V^*$; together they "sandwich" the triangular matrix $T$ in the middle.

Let $A$ be an $m \times n$ matrix of rank $r$. One first performs a QR decomposition with column pivoting:

$A \Pi = U \begin{bmatrix} R_{11} & R_{12} \\ 0 & 0 \end{bmatrix}$,

where $\Pi$ is an $n \times n$ permutation matrix, $U$ is an $m \times m$ unitary matrix, $R_{11}$ is an $r \times r$ upper triangular matrix and $R_{12}$ is an $r \times (n-r)$ matrix. One then performs another QR decomposition on the adjoint of the nonzero block $\begin{bmatrix} R_{11} & R_{12} \end{bmatrix}$:

$\begin{bmatrix} R_{11} & R_{12} \end{bmatrix}^* = \tilde{V} \begin{bmatrix} T_{11}^* \\ 0 \end{bmatrix}$,

where $\tilde{V}$ is an $n \times n$ unitary matrix and $T_{11}$ is an $r \times r$ lower (left) triangular matrix. Setting $V = \Pi \tilde{V}$ yields the complete orthogonal (UTV) decomposition:[1]

$A = U \begin{bmatrix} T_{11} & 0 \\ 0 & 0 \end{bmatrix} V^*$.
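
The construction above translates directly into a short numerical routine. The following NumPy/SciPy sketch uses SciPy's pivoted QR and a simple tolerance-based rank estimate; the function name and tolerance are illustrative choices for this example, not part of either library:

```python
import numpy as np
from scipy.linalg import qr

def complete_orthogonal_decomposition(A, tol=1e-12):
    """Return U, T, V with A ≈ U @ T @ V.conj().T, where T is nonzero
    only in its leading r-by-r (lower triangular) block."""
    m, n = A.shape
    # Step 1: QR with column pivoting, A @ Pi = U @ [[R11, R12], [0, 0]].
    U, R, piv = qr(A, pivoting=True)
    Pi = np.eye(n)[:, piv]                                       # permutation matrix
    r = int(np.sum(np.abs(np.diag(R)) > tol * np.abs(R[0, 0])))  # numerical rank estimate
    # Step 2: QR of the adjoint of the top block [R11 R12], which is r x n.
    Vtilde, S = qr(R[:r, :].conj().T)                            # S is n x r, upper triangular
    T = np.zeros((m, n), dtype=R.dtype)
    T[:r, :r] = S[:r, :].conj().T                                # lower triangular T11
    V = Pi @ Vtilde                                              # absorb the permutation into V
    return U, T, V

# Quick check on a random rank-deficient matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 4))    # 6 x 4, rank 3
U, T, V = complete_orthogonal_decomposition(A)
print(np.allclose(A, U @ T @ V.conj().T))                        # True
print(np.allclose(T[3:, :], 0) and np.allclose(T[:, 3:], 0))     # only the 3 x 3 block is nonzero
```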

Since any diagonal matrix is by construction triangular, the singular value decomposition $A = U \Sigma V^*$, where $\Sigma$ is a diagonal matrix, is a special case of the UTV decomposition. Computing the SVD is slightly more expensive than the UTV decomposition,[3] but it has a stronger rank-revealing property.
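
As a minimal illustration of this relationship, the following NumPy snippet embeds the singular values into a rectangular middle factor and verifies that the SVD already satisfies the UTV form (the shapes and the random test matrix are arbitrary choices for this example):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))  # 5 x 3, rank 2
U, s, Vh = np.linalg.svd(A)                  # full SVD: U is 5 x 5, Vh is 3 x 3
Sigma = np.zeros((5, 3))
Sigma[:len(s), :len(s)] = np.diag(s)         # diagonal (hence triangular) middle factor
print(np.allclose(A, U @ Sigma @ Vh))        # True: A = U Sigma V*, already in UTV form
```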


References

  1. ^ a b c d e Golub, Gene H.; Van Loan, Charles F. (15 October 1996). Matrix Computations (Third ed.). Johns Hopkins University Press. ISBN 0-8018-5414-8.
  2. ^ Björck, Åke (December 1996). Numerical Methods for Least Squares Problems. SIAM. ISBN 0-89871-360-9.
  3. ^ a b c Chandrasekaran, S.; Gu, M.; Pals, T. (January 2006). "A Fast ULV Decomposition Solver for Hierarchically Semiseparable Representations". SIAM Journal on Matrix Analysis and Applications. 28 (3): 603–622. doi:10.1137/S0895479803436652.
  4. ^ Fierro, Ricardo D.; Hansen, Per Christian; Hansen, Peter Søren Kirk (1999). "UTV Tools: Matlab templates for rank-revealing UTV decompositions" (PDF). Numerical Algorithms. 20 (2/3): 165–194. doi:10.1023/A:1019112103049. S2CID 19861950.
  5. ^ Adams, G.; Griffin, M.F.; Stewart, G.W. (1991). "Direction-of-arrival estimation using the rank-revealing URV decomposition". [Proceedings] ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing. Proc. Of IEEE Internat. Conf. On Acoustics, Speech, and Signal Processing. pp. 1385-1388 vol.2. doi:10.1109/icassp.1991.150681. hdl:1903/555. ISBN 0-7803-0003-3. S2CID 9201732.
  6. ^ "LAPACK – Complete Orthogonal Factorization". netlib.org.
  7. ^ "Eigen::CompleteOrthogonalDecomposition". Eigen 3.3 reference documentation. Retrieved 2023-08-07.