
5.1 Eigenanalysis of LTI systems (finite-length signals)

In the study of discrete-time signals and systems, concepts from linear algebra often provide additional insight. When it comes to LTI systems, a certain area of linear algebra is particularly helpful: eigenanalysis.

Eigenvectors and eigenvalues

Given a square matrix (one that has the same number of rows as columns) $A$, a vector $v$ is an eigenvector with corresponding scalar eigenvalue $\lambda$ if:

$Av=\lambda v$

There is a geometric interpretation to this eigenanalysis of the matrix $A$. Multiplying a matrix by one of its eigenvectors produces simply a scaled version of that same eigenvector (scaled by a factor of $\lambda$), so multiplying an eigenvector by its matrix does not change the vector's orientation, only its strength.

Consider this example in two dimensions. Take the square matrix $A$:

$A=\begin{bmatrix}3&1 \\ 1&3\end{bmatrix}$

and the vector $v$:

$v=\begin{bmatrix}1 \\ -1\end{bmatrix}$

Note what happens when we multiply the matrix $A$ by the vector $v$:

$\begin{align*} Av&=\begin{bmatrix}3&1 \\ 1&3 \end{bmatrix}\begin{bmatrix}1 \\ -1\end{bmatrix}\\&=\begin{bmatrix}(3-1) \\ (1-3)\end{bmatrix}\\&=\begin{bmatrix}2 \\ -2\end{bmatrix}\\&=2\begin{bmatrix}1 \\ -1\end{bmatrix}\\&=2v \end{align*}$

So running the vector $v$ through the matrix $A$ simply scales the vector by 2; that is, $v$ is an eigenvector of $A$ with eigenvalue $\lambda = 2$.
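As a quick numerical check, here is a minimal sketch of this computation in NumPy (the variable names are just illustrative):

```python
import numpy as np

# The example matrix and vector from above.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
v = np.array([1.0, -1.0])

Av = A @ v
print(Av)       # [ 2. -2.]
print(2 * v)    # [ 2. -2.]  -- Av equals 2v, so v is an eigenvector with eigenvalue 2
```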

Eigendecomposition: handling multiple eigenvectors and eigenvalues

So we have seen what eigenvectors and eigenvalues are for square matrices. Now, an $N\times N$ matrix will have $N$ eigenvectors (not necessarily distinct), each with its own eigenvalue. We can collect all of these vectors and values into matrices of their own. Suppose the eigenvectors of the matrix are $\{v_m\}_{m=0}^{N-1}$ and the corresponding eigenvalues are $\{\lambda_m\}_{m=0}^{N-1}$. Then we can organize them like this:

$V= \begin{bmatrix}v_0 \,|\, v_1 \,|\, \cdots \,|\, v_{N-1} \end{bmatrix}$

$\Lambda = \begin{bmatrix}\lambda_0 & & & \\ & \lambda_1 & & \\ & & \ddots & \\ & & & \lambda_{N-1} \end{bmatrix}$

That is, $V$ has the eigenvectors as its columns, and $\Lambda$ is a diagonal matrix with the eigenvalues along its diagonal. With the vectors and values collected like that, we can express the eigenvector/eigenvalue property $Av_m=\lambda_m v_m$ for all of the eigenvectors and eigenvalues of the matrix $A$ at once:

$AV=V\Lambda$
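As a sketch of how this looks numerically, NumPy's standard np.linalg.eig routine returns exactly these two objects (the variable names here are ours):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are the eigenvectors.
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)   # the diagonal matrix Lambda

# Check the collected eigen-relationship AV = V Lambda.
print(np.allclose(A @ V, V @ Lam))   # True
```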

Diagonalization

Now, if the square matrix $V$ is invertible (which will be the case if the columns of $V$, the eigenvectors, are linearly independent, which of course will happen if the eigenvectors form a basis), then we can do some special things with it.

Recall the eigendecomposition relationship:

$AV=V\Lambda$

If $V$ is invertible, then we can multiply each side of the equation on the left by $V^{-1}$:

$V^{-1}AV=V^{-1}V\Lambda=I\Lambda=\Lambda$

So $V^{-1}AV=\Lambda$. Because multiplying $A$ by $V$ on one side and $V^{-1}$ on the other produces a diagonal matrix, we say that the matrix $V$ diagonalizes $A$. If we instead multiply each side of $AV=V\Lambda$ on the right by $V^{-1}$, we have $A=V\Lambda V^{-1}$. One of the reasons (more of which we'll see later) why we consider this diagonalization of $A$ is that it is easier to multiply with diagonal matrices than with full ones.
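Continuing the same numerical sketch (reusing the names from the previous snippets), we can verify both identities:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)
V_inv = np.linalg.inv(V)

# V diagonalizes A: V^{-1} A V is the diagonal matrix Lambda.
print(np.allclose(V_inv @ A @ V, Lam))   # True

# Multiplying the other way reconstructs A: A = V Lambda V^{-1}.
print(np.allclose(V @ Lam @ V_inv, A))   # True
```

This is one place the "easier with diagonal matrices" point pays off: for instance, powers of $A$ reduce to powers of the diagonal entries, since $A^k = V\Lambda^k V^{-1}$.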

LTI systems and eigenanalysis

Perhaps you are wondering how all of this linear algebra relates to discrete-time systems. For LTI systems operating on finite-length discrete-time signals, the input-output relationship is:

$y=Hx,$

where $H$ is a circulant matrix (each row being a circularly shifted version of the system impulse response $h$).
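As a concrete sketch (the impulse response and input below are made-up examples), we can build the circulant matrix $H$ from $h$ and check that $y = Hx$ matches circular convolution:

```python
import numpy as np

# A made-up length-4 impulse response and input signal.
h = np.array([1.0, 2.0, 0.0, -1.0])
x = np.array([3.0, 1.0, 4.0, 1.0])
N = len(h)

# Build the circulant matrix H: entry (m, n) is h[(m - n) mod N], so each
# column is a circular shift of h (and each row a circular shift of the row above).
H = np.array([[h[(m - n) % N] for n in range(N)] for m in range(N)])

y = H @ x

# y = Hx is the circular convolution of h and x; check via the DFT,
# since circular convolution in time is multiplication in frequency.
y_fft = np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(x)))
print(np.allclose(y, y_fft))   # True
```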
