# Matrix¶

## Eigenvectors in ellipsoids¶

- The principal axes of the ellipsoid \{\x : \x^T\A\x = 1\} (for symmetric positive definite \A) lie along the eigenvectors of \A
- The semi-axis length along the eigenvector with eigenvalue \lambda_j is 1/\sqrt{\lambda_j}
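A minimal sketch of this correspondence, assuming the ellipsoid \{\x : \x^T\A\x = 1\} for a symmetric positive definite \A (the matrix below is a made-up example): the eigenvectors give the axis directions and 1/\sqrt{\lambda_j} gives the semi-axis lengths.

```python
import numpy as np

# Hypothetical example: ellipse x^T A x = 1 with symmetric positive definite A.
A = np.array([[5.0, 4.0],
              [4.0, 5.0]])

# Eigendecomposition: columns of V are the principal-axis directions.
lam, V = np.linalg.eigh(A)

# Semi-axis length along eigenvector V[:, j] is 1 / sqrt(lam[j]).
radii = 1.0 / np.sqrt(lam)

# Check: the boundary point radii[j] * V[:, j] satisfies x^T A x = 1.
for j in range(2):
    x = radii[j] * V[:, j]
    assert np.isclose(x @ A @ x, 1.0)
```

Note that the longest semi-axis corresponds to the smallest eigenvalue, which is why near-singular \A gives a very elongated ellipsoid.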

## Symmetric positive semidefinite¶

## Can matrix exponentials be negative?¶

Refer to Can matrix exponentials ever be negative? If so, under what conditions?

One argument: if the matrix has non-negative off-diagonal entries (a Metzler matrix), then every entry of its matrix exponential is non-negative.
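A quick numerical check of this claim, using a made-up Metzler matrix (non-negative off-diagonal entries; the diagonal may be negative):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical Metzler matrix: off-diagonal entries are non-negative.
A = np.array([[-2.0, 1.0, 0.5],
              [ 0.3, -1.0, 2.0],
              [ 0.0, 0.7, -3.0]])

E = expm(A)

# Every entry of expm(A) is non-negative: write expm(A) = exp(-s) * expm(A + s*I)
# with s large enough that A + s*I is entrywise non-negative.
assert np.all(E >= 0)
```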

## Spectral Theorem¶

Refer to Brillinger, D. R. (1981). *Time Series: Data Analysis and Theory* (Vol. 36). SIAM.

### Theorem 3.7.1¶

If \H is a J\times J Hermitian matrix, then

\H = \sum_{j=1}^{J} \mu_j\U_j\bar\U_j^T

where \mu_j is the j-th latent value of \H and \U_j is the corresponding latent vector.

### Corollary 3.7.1¶

If \H is J\times J Hermitian, then it may be written \U\M\bar \U^T where \M = \diag\{\mu_j;j=1,\ldots,J\} and \U=[\U_1,\ldots,\U_J] is unitary. Also if \H is non-negative definite, then \mu_j\ge 0,j=1,\ldots,J.
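A numerical illustration of the corollary with a made-up 3×3 Hermitian matrix, using `numpy.linalg.eigh` (here \bar\U^T denotes the conjugate transpose):

```python
import numpy as np

# Hypothetical Hermitian matrix H (J = 3).
H = np.array([[2.0, 1.0 - 1.0j, 0.0],
              [1.0 + 1.0j, 3.0, 0.5j],
              [0.0, -0.5j, 1.0]])
assert np.allclose(H, H.conj().T)  # Hermitian

# eigh returns real latent values mu_j and a unitary U whose columns are U_j.
mu, U = np.linalg.eigh(H)

# Reconstruct H = U M \bar U^T with M = diag(mu).
M = np.diag(mu)
assert np.allclose(U @ M @ U.conj().T, H)
assert np.allclose(U.conj().T @ U, np.eye(3))  # U is unitary
```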

### Theorem 3.7.2¶

If \Z is J\times K, then

\Z = \sum_{j=1}^{\min(J,K)} \mu_j\U_j\bar\V_j^T

where \mu_j^2 is the j-th latent value of \Z\bar\Z^T (or \bar\Z^T\Z), \U_j is the j-th latent vector of \Z\bar\Z^T, \V_j is the j-th latent vector of \bar\Z^T\Z, and it is understood that \mu_j\ge 0.

### Corollary 3.7.2¶

If \Z is J\times K, then it may be written \U\M\bar\V^T where the J\times K \M=\diag\{\mu_j:j=1,\ldots,J\}, the J\times J \U is unitary and K\times K \V is also unitary.
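The decomposition \Z = \U\M\bar\V^T is the singular value decomposition; a sketch with a made-up 3×2 matrix, checking both the reconstruction and the link between \mu_j^2 and the latent values of \bar\Z^T\Z:

```python
import numpy as np

# Hypothetical J x K matrix (J = 3, K = 2).
Z = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, -1.0]])

# Full SVD: U is J x J unitary, Vh = \bar V^T with V a K x K unitary,
# and s holds the singular values mu_j >= 0.
U, s, Vh = np.linalg.svd(Z)

# Build the J x K "diagonal" matrix M of singular values.
M = np.zeros(Z.shape)
M[:len(s), :len(s)] = np.diag(s)

assert np.allclose(U @ M @ Vh, Z)    # Z = U M \bar V^T
# mu_j^2 are the latent values of \bar Z^T Z.
assert np.allclose(np.sort(s**2), np.sort(np.linalg.eigvalsh(Z.T @ Z)))
```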

### Theorem 3.7.4¶

Let \Z be J\times K. Among J\times K matrices \A of rank L\le J,K,

\sum_{j=1}^J\sum_{k=1}^K |z_{jk}-a_{jk}|^2

is minimized by

\A = \sum_{j=1}^L \mu_j\U_j\bar\V_j^T.

The minimum achieved is \sum_j \mu_{j+L}^2.

### Corollary 3.7.4¶

The above choice of \A also minimizes the largest latent value of

(\Z-\A)\bar{(\Z-\A)}^T

for \A of rank L\le J,K. The minimum achieved is \mu_{L+1}^2.
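This is the Eckart–Young low-rank approximation result. A sketch with a made-up random matrix, checking that the truncated SVD attains the stated minima in both the sum-of-squares (Frobenius) and largest-latent-value (spectral) senses:

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((5, 4))      # hypothetical J x K matrix
L = 2                                # target rank

U, s, Vh = np.linalg.svd(Z)

# Rank-L truncation A = sum_{j=1}^{L} mu_j U_j \bar V_j^T.
A = U[:, :L] @ np.diag(s[:L]) @ Vh[:L, :]

# Squared Frobenius error equals the sum of the discarded mu_j^2.
err2 = np.sum(np.abs(Z - A)**2)
assert np.isclose(err2, np.sum(s[L:]**2))

# Spectral-norm error equals mu_{L+1} (0-indexed s[L]).
assert np.isclose(np.linalg.norm(Z - A, 2), s[L])
```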

## Orthogonal matrix¶

If a matrix is orthogonal, then both its columns and its rows form orthonormal sets.

Refer to Column Vectors orthogonal implies Row Vectors also orthogonal?

## Decomposition¶

### Cholesky¶

A symmetric, positive definite square matrix A has a Cholesky decomposition into a product of a lower triangular matrix L and its transpose L^T,

A = LL^T.

This decomposition can be used to convert the linear system Ax=b into a pair of triangular systems, Ly=b,L^Tx=y, which can be solved by forward and back-substitution.
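The two-triangular-system solve can be sketched as follows (the 2×2 system is a made-up example), using `scipy.linalg.cholesky` and `solve_triangular`:

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

# Hypothetical symmetric positive definite system A x = b.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])

Lo = cholesky(A, lower=True)              # A = L L^T

# Forward-substitute L y = b, then back-substitute L^T x = y.
y = solve_triangular(Lo, b, lower=True)
x = solve_triangular(Lo.T, y, lower=False)

assert np.allclose(A @ x, b)
```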

If the matrix A is near singular, it is sometimes possible to reduce the condition number and recover a more accurate solution vector x by scaling as

(SAS)(S^{-1}x) = Sb

where S is a diagonal matrix whose elements are given by S_{ii}=1/\sqrt{A_{ii}}. This scaling is also known as **Jacobi preconditioning**.
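A sketch of the Jacobi scaling on a made-up, badly scaled SPD matrix, showing the change of variables and the drop in condition number:

```python
import numpy as np

# Hypothetical badly scaled SPD matrix.
A = np.array([[1.0e6, 1.0e3],
              [1.0e3, 2.0]])
b = np.array([1.0, 1.0])

# Jacobi scaling S = diag(1 / sqrt(A_ii)); solve (S A S)(S^{-1} x) = S b.
S = np.diag(1.0 / np.sqrt(np.diag(A)))
As = S @ A @ S                        # unit diagonal, much better conditioned
xs = np.linalg.solve(As, S @ b)
x = S @ xs                            # undo the change of variables

assert np.allclose(A @ x, b)
assert np.linalg.cond(As) < np.linalg.cond(A)
```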

### QR¶

A general rectangular M-by-N matrix A has a QR decomposition into the product of an orthogonal M-by-M square matrix Q, where Q^TQ=I, and an M-by-N right-triangular matrix R,

A = QR.

This decomposition can be used to convert the linear system Ax=b into the triangular system Rx=Q^Tb, which can be solved by back-substitution.

Another use of the QR decomposition is to compute an orthonormal basis for a set of vectors. The first N columns of Q form an orthonormal basis for the range of A, ran(A), when A has full column rank.
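Both uses can be sketched with `numpy.linalg.qr` on a made-up square system (for square full-rank A the reduced and full factorizations coincide):

```python
import numpy as np

# Hypothetical square system A x = b solved via QR.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])

Q, R = np.linalg.qr(A)               # A = Q R with Q^T Q = I, R upper triangular

# R x = Q^T b, solved here with a generic solver; R being triangular,
# back-substitution would also do.
x = np.linalg.solve(R, Q.T @ b)

assert np.allclose(A @ x, b)
# The columns of Q form an orthonormal basis for ran(A).
assert np.allclose(Q.T @ Q, np.eye(2))
```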

## \rk(AB) \leq \rk(A)¶

The range of the matrix M is

\calR(M) = \{\y : \y = M\x \text{ for some } \x\}.

Recall that the rank of a matrix M is the dimension of the range \calR(M) of the matrix M. So we have

\rk(M) = \dim\calR(M).

In general, if a vector space V is a subset of a vector space W, then we have

\dim(V) \le \dim(W).

Thus, it suffices to show that the vector space \calR(AB) is a subset of the vector space \calR(A).

Consider any vector \y\in\calR(AB). Then there exists a vector \x\in\R^l such that \y=(AB)\x by the definition of the range.

Let \z=B\x\in\R^n.

Then we have

\y = (AB)\x = A(B\x) = A\z

and thus the vector \y is in \calR(A). Thus \calR(AB) is a subset of \calR(A) and we have

\rk(AB) = \dim\calR(AB) \le \dim\calR(A) = \rk(A).

Refer to Rank of the Product of Matrices AB is Less than or Equal to the Rank of A
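A quick numerical check of the inequality with made-up random matrices (B is constructed rank-deficient so the inequality is strict against \rk(A)):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: A is 4 x 3 (full rank 3), B is 3 x 5 with rank 2.
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 2)) @ rng.standard_normal((2, 5))

rk = np.linalg.matrix_rank

# rk(AB) <= rk(A); the symmetric argument on rows gives rk(AB) <= rk(B) too.
assert rk(A @ B) <= rk(A)
assert rk(A @ B) <= rk(B)
```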

## Only real eigenvalues in symmetric matrices¶

A real symmetric (more generally, Hermitian) matrix has only real eigenvalues. Refer to The Case of Complex Eigenvalues
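A numerical illustration with a made-up random matrix, symmetrized so the claim applies; even the general-purpose eigenvalue solver returns (numerically) zero imaginary parts:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
S = (M + M.T) / 2                    # symmetrize a random matrix

# Eigenvalues of a real symmetric matrix are real.
w = np.linalg.eigvals(S)
assert np.allclose(np.imag(w), 0.0)
```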