Matrix Equations




In all the equations below, x, y, z, X, Y and Z are the unknown vectors or matrices.


Discrete-time Lyapunov Equation

The discrete-time Lyapunov equation is AXA^H - X + Q = 0 where Q is Hermitian. This is a special case of the Stein equation.

The equivalent equation for continuous-time systems is the Lyapunov equation.
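
A minimal numerical sketch (assuming SciPy is available; the data below are illustrative). SciPy's solve_discrete_lyapunov uses exactly the convention above, AXA^H - X + Q = 0:

import numpy as np
from scipy.linalg import solve_discrete_lyapunov

rng = np.random.default_rng(0)
n = 4
A = 0.3 * rng.standard_normal((n, n))    # eigenvalues well inside the unit circle
Q = np.eye(n)                            # Hermitian

X = solve_discrete_lyapunov(A, Q)
print(np.max(np.abs(A @ X @ A.conj().T - X + Q)))   # residual of AXA^H - X + Q, ~0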


Discrete Riccati Equation

The discrete Riccati equation is the quadratic equation [A, X: n#n; B: n#m; C: m#n; R, Q: Hermitian] X = A^HXA - (C + B^HXA)^H (R + B^HXB)^{-1} (C + B^HXA) + Q
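
A hedged numerical sketch (illustrative data, assuming SciPy is available and that the data meet its solvability conditions). SciPy's solve_discrete_are solves A^HXA - X - (A^HXB + S)(R + B^HXB)^{-1}(B^HXA + S^H) + Q = 0, which is assumed here to match the form above with S = C^H; the residual check verifies that mapping:

import numpy as np
from scipy.linalg import solve_discrete_are

rng = np.random.default_rng(1)
n, m = 4, 2
A = 0.3 * rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
C = 0.1 * rng.standard_normal((m, n))
Q = 10 * np.eye(n)                       # Hermitian
R = np.eye(m)                            # Hermitian

X = solve_discrete_are(A, B, Q, R, s=C.conj().T)
F = C + B.conj().T @ X @ A
residual = A.conj().T @ X @ A - F.conj().T @ np.linalg.solve(R + B.conj().T @ X @ B, F) + Q - X
print(np.max(np.abs(residual)))          # ~0 if the convention mapping is right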


Quadratic Form Optimization

Suppose H[n#n] = UDU^H is Hermitian, where U is unitary and D = diag(d) = diag(eig(H)) contains the eigenvalues in decreasing order. Then the corresponding quadratic form is the real-valued expression x^H H x.

We can generalize the Rayleigh-Ritz theorem to multiple dimensions in either of two ways which, surprisingly, turn out to be equivalent. If W is +ve definite Hermitian and B is Hermitian then, over matrices X[n#k] of rank k,

max tr((X^HWX)^{-1} X^HBX) = d_1 + d_2 + ... + d_k   and   max det((X^HWX)^{-1} X^HBX) = d_1 d_2 ... d_k

where d are the eigenvalues of W^{-1}B sorted into decreasing order and these bounds are attained by taking the columns of X to be the corresponding eigenvectors.
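
A small numerical check of the trace bound (illustrative data; scipy.linalg.eigh(B, W) solves Bv = dWv, i.e. it returns the eigenvalues and eigenvectors of W^{-1}B in ascending order):

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n, k = 6, 2
M = rng.standard_normal((n, n)); W = M @ M.T + n * np.eye(n)   # +ve definite Hermitian
N = rng.standard_normal((n, n)); B = N @ N.T                   # Hermitian

d, V = eigh(B, W)                  # generalized eigenvalues of W^{-1}B, ascending
X = V[:, -k:]                      # eigenvectors of the k largest eigenvalues
measure = np.trace(np.linalg.solve(X.T @ W @ X, X.T @ B @ X))
print(measure, d[-k:].sum())       # the two values agree at the maximizing X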

Linear Discriminant Analysis (LDA): If vectors x are randomly generated from a number of classes with B the covariance of the class means and W the average covariance within each class, then tr((X^HWX)^{-1} X^HBX) and det((X^HWX)^{-1} X^HBX) are two alternative measures of class separability. We can find a dimension-reducing transformation that maximizes separability by taking y = A^Tx where the columns of A[n#k] are the eigenvectors of W^{-1}B corresponding to the k largest eigenvalues. This choice maximizes both separability measures for any given k.
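
A hedged sketch of the LDA projection just described (real-valued, randomly generated class data; the class counts, sample sizes and variable names are illustrative only):

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)
n, k = 5, 2                                    # original and reduced dimensions
means = [3 * rng.standard_normal(n) for _ in range(4)]
data = [m + rng.standard_normal((50, n)) for m in means]       # samples per class

mu = np.mean([x.mean(axis=0) for x in data], axis=0)
B = np.mean([np.outer(x.mean(axis=0) - mu, x.mean(axis=0) - mu) for x in data], axis=0)
W = np.mean([np.cov(x, rowvar=False) for x in data], axis=0)

d, V = eigh(B, W)                  # eigenvalues of W^{-1}B, ascending
A = V[:, -k:]                      # columns are eigenvectors of the k largest eigenvalues
y = data[0] @ A                    # y = A^T x applied to each sample (rows of data[0])
print(y.shape)                     # (50, k): reduced-dimension features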


Linear Equation

A linear equation has the form Ax - b = 0.

Exact Solution

The equation Ax - b = 0 has an exact solution iff b lies in the range of A, i.e. iff AA^+b = b where A^+ is the pseudoinverse of A; in that case the general solution is x = A^+b + (I - A^+A)w for arbitrary w.

Least Squares Solutions

If there is no exact solution, we can find the x that minimizes d = ||Ax - b||^2 = (Ax - b)^H(Ax - b).
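
For example (a minimal sketch using NumPy's least squares solver; the data are illustrative):

import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))          # more equations than unknowns
b = rng.standard_normal(8)

x, res, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(A @ x - b))         # the minimized residual ||Ax - b||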

Recursive Least Squares

We can express the least squares solution to the augmented equation [A; U]y - [b; v] = 0 in terms of the least squares solution to Ax - b = 0.

[rank(A[m#n]) = n] The least squares solution to the augmented equation is y = x + K(v - Ux) where x is the least squares solution to Ax - b = 0 and K = (A^HA)^{-1}U^H(I + U(A^HA)^{-1}U^H)^{-1}. The inverse of the augmented grammian is given by ([A; U]^H[A; U])^{-1} = (A^HA)^{-1} - KU(A^HA)^{-1}. Thus finding the least squares solution of the augmented equation requires the inversion of a matrix, (I + U(A^HA)^{-1}U^H), whose dimension equals the number of rows of U instead of the number of rows of [A; U]. The process is particularly simple if U has only one row. The computation may be reduced at the expense of numerical stability by calculating (A^HA)^{-1}U^H as (U(A^HA)^{-1})^H.
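
A hedged sketch of this update (variable names are illustrative; P holds (A^HA)^{-1} and x the current least squares solution, while U, v are the newly appended rows):

import numpy as np

def rls_update(x, P, U, v):
    # K = (A^HA)^{-1} U^H (I + U (A^HA)^{-1} U^H)^{-1}
    S = np.eye(U.shape[0]) + U @ P @ U.conj().T   # only this small matrix is inverted
    K = P @ U.conj().T @ np.linalg.inv(S)
    y = x + K @ (v - U @ x)                       # updated solution
    P_new = P - K @ U @ P                         # updated inverse grammian
    return y, P_new

# check against a direct batch solution of the augmented equation
rng = np.random.default_rng(5)
A = rng.standard_normal((10, 3)); b = rng.standard_normal(10)
U = rng.standard_normal((2, 3));  v = rng.standard_normal(2)

P = np.linalg.inv(A.T @ A)
x = P @ A.T @ b
y, P2 = rls_update(x, P, U, v)
y_batch = np.linalg.lstsq(np.vstack([A, U]), np.concatenate([b, v]), rcond=None)[0]
print(np.allclose(y, y_batch))             # True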


Lyapunov Equation

The (continuous) Lyapunov equation is AX + XA^H + Q = 0 where Q is Hermitian. This is a special case of the Sylvester equation.

The equivalent equation for discrete-time systems is the Stein equation.
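
A minimal numerical sketch (illustrative data; SciPy's convention is AX + XA^H = Q, so the Q of the equation above is passed negated, an assumption the residual check confirms):

import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(6)
n = 4
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))   # a stable A
Q = np.eye(n)                                        # Hermitian

X = solve_continuous_lyapunov(A, -Q)
print(np.max(np.abs(A @ X + X @ A.conj().T + Q)))    # residual of AX + XA^H + Q, ~0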


Riccati Equation

The (continuous) Riccati equation is the quadratic equation [A, X, C, D: n#n; C, D: Hermitian] XDX + XA + A^HX - C = 0
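
A hedged numerical sketch (illustrative data, assuming SciPy is available). SciPy's solve_continuous_are solves A^HX + XA - XBR^{-1}B^HX + Q = 0, which is assumed here to match the form above with D = -BR^{-1}B^H and C = -Q; the residual check verifies that mapping:

import numpy as np
from scipy.linalg import solve_continuous_are

rng = np.random.default_rng(7)
n, m = 4, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
Q = np.eye(n)                            # Hermitian
R = np.eye(m)                            # Hermitian, +ve definite

X = solve_continuous_are(A, B, Q, R)
D = -B @ np.linalg.inv(R) @ B.conj().T
C = -Q
print(np.max(np.abs(X @ D @ X + X @ A + A.conj().T @ X - C)))   # ~0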


Stein Equation

A Stein equation has the form AXB - X + Q = 0.
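
A minimal sketch that solves the Stein equation directly by vectorization, using the identity vec(AXB) = (B^T kron A) vec(X); this is only practical for small matrices, and the data are illustrative:

import numpy as np

rng = np.random.default_rng(8)
n = 4
A = 0.4 * rng.standard_normal((n, n))
B = 0.4 * rng.standard_normal((n, n))
Q = rng.standard_normal((n, n))

# AXB - X + Q = 0  =>  (B^T kron A - I) vec(X) = -vec(Q)   (column-stacking vec)
M = np.kron(B.T, A) - np.eye(n * n)
X = np.linalg.solve(M, -Q.flatten(order='F')).reshape((n, n), order='F')
print(np.max(np.abs(A @ X @ B - X + Q)))   # ~0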


Sylvester Equation

The Sylvester equation is AX + XB + Q = 0.
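
A minimal numerical sketch (illustrative data; SciPy's solve_sylvester uses the convention AX + XB = Q, so the Q of the equation above is passed negated):

import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(9)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
Q = rng.standard_normal((4, 3))

X = solve_sylvester(A, B, -Q)
print(np.max(np.abs(A @ X + X @ B + Q)))   # residual of AX + XB + Q, ~0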


This page is part of The Matrix Reference Manual. Copyright © 1998-2022 Mike Brookes, Imperial College, London, UK. See the file gfl.html for copying instructions. Please send any comments or suggestions to "mike.brookes" at "imperial.ac.uk".
Updated: $Id: equation.html 11291 2021-01-05 18:26:10Z dmb $