In the study of discrete-time signals and systems, concepts from linear algebra often provide additional insight. When it comes to LTI systems, a certain area of linear algebra is particularly helpful: eigenanalysis.
Eigenvectors and eigenvalues
Given a square matrix (one that has the same number of rows as columns) $A$, a vector $v$ is an eigenvector with corresponding scalar eigenvalue $\lambda$ if:

$Av=\lambda v$

There is a geometric interpretation to this eigenanalysis of the matrix $A$. Multiplying one of its eigenvectors by the matrix produces simply a scaled version of that same eigenvector (scaled by a factor of $\lambda$), so multiplying an eigenvector by the matrix does not change its direction, only its magnitude.

Consider this example in two dimensions, a square matrix $A$:

$A=\begin{bmatrix}3&1 \\ 1&3\end{bmatrix}$

and the vector $v$:

$v=\begin{bmatrix}1 \\ -1\end{bmatrix}$

Note what happens when we multiply the matrix $A$ by the vector $v$:

$\begin{align*} Av&=\begin{bmatrix}3&1 \\ 1&3 \end{bmatrix}\begin{bmatrix}1 \\ -1\end{bmatrix}\\&=\begin{bmatrix}(3-1) \\ (1-3)\end{bmatrix}\\&=\begin{bmatrix}2 \\ -2\end{bmatrix}\\&=2\begin{bmatrix}1 \\ -1\end{bmatrix}\\&=2v \end{align*}$

So running the vector $v$ through the matrix $A$ simply scales the vector by 2: $v$ is an eigenvector of $A$ with eigenvalue $\lambda=2$.
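For readers who want to experiment, here is a minimal numerical check in Python with NumPy (NumPy is our choice here, not something the text prescribes; any linear-algebra package would do):

```python
import numpy as np

# The matrix A and vector v from the example above
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
v = np.array([1.0, -1.0])

# Multiplying by A scales v by 2, so v is an eigenvector with eigenvalue 2
print(A @ v)                       # [ 2. -2.]
print(np.allclose(A @ v, 2 * v))   # True

# NumPy can also find all eigenvalues/eigenvectors directly
lams, vecs = np.linalg.eig(A)
print(lams)                        # e.g. [4. 2.] -- A's other eigenvalue is 4
```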
Eigendecomposition: handling multiple eigenvectors and eigenvalues
So we have seen what eigenvectors and eigenvalues are for square matrices. Now, an $N\times N$ matrix will have $N$ eigenvalues (not necessarily distinct), each with its own eigenvector. We can put all of these vectors and values into their own matrices. Suppose the eigenvectors of the matrix are $\{v_m\}_{m=0}^{N-1}$ and the corresponding eigenvalues are $\{\lambda_m\}_{m=0}^{N-1}$. Then we can organize all of them like this:

$V= \begin{bmatrix}v_0 | v_1 | \cdots | v_{N-1} \end{bmatrix}$

$\Lambda = \begin{bmatrix}\lambda_0 \\ &\lambda_1 \\ &&\ddots \\ &&&\lambda_{N-1} \end{bmatrix}$

With those vectors and values collected like that, we can express the eigenvector/eigenvalue property $Av_m=\lambda_m v_m$ for all of the eigenvectors and eigenvalues of the matrix $A$ at once:

$AV=V\Lambda$

(Placing $\Lambda$ on the right is what makes each column $v_m$ of $V$ get scaled by its own $\lambda_m$.)
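Here is a short sketch of that bookkeeping in NumPy, continuing with the $2\times 2$ matrix $A$ from the earlier example. Conveniently, `np.linalg.eig` already returns the eigenvectors stacked as columns of a matrix, which is exactly the $V$ defined above:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# eig returns the eigenvalues as a vector and the eigenvectors as the
# *columns* of a matrix, so this matrix plays the role of V above
lams, V = np.linalg.eig(A)
Lam = np.diag(lams)   # Lambda: eigenvalues on the diagonal, zeros elsewhere

# Check the collected relationship A V = V Lambda
print(np.allclose(A @ V, V @ Lam))  # True
```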
Diagonalization
Now, if the square matrix $V$ is invertible (which will be the case if the columns of $V$, the eigenvectors of $A$, are linearly independent, which of course will happen if those vectors form a basis), then we can do some special things with it.

Recall the eigendecomposition relationship:

$AV=V\Lambda$

If $V$ is invertible, then we can multiply each side of the equation on the left by $V^{-1}$:

$V^{-1}AV=V^{-1}V\Lambda=I\Lambda=\Lambda$

So $V^{-1}AV=\Lambda$. Because multiplying $A$ by $V$ on one side and $V^{-1}$ on the other produces a diagonal matrix, we say that the matrix $V$ diagonalizes $A$. If we instead multiply the eigendecomposition relationship on the right by $V^{-1}$, we get $A=V\Lambda V^{-1}$. One of the reasons (more of which we'll see later) why we consider this diagonalization of $A$ is that it is easier to multiply with diagonal matrices than with full ones.
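As one concrete payoff (an illustration of the "easier to multiply" point, not something derived in the text above): $A^k=(V\Lambda V^{-1})^k=V\Lambda^k V^{-1}$, and computing $\Lambda^k$ only requires raising each diagonal entry to the $k$-th power. A minimal NumPy sketch of both identities and of that shortcut, still using the same matrix $A$:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
lams, V = np.linalg.eig(A)
Lam = np.diag(lams)
V_inv = np.linalg.inv(V)

# V diagonalizes A: V^{-1} A V = Lambda
print(np.allclose(V_inv @ A @ V, Lam))   # True

# ...and the other way around: A = V Lambda V^{-1}
print(np.allclose(A, V @ Lam @ V_inv))   # True

# The shortcut: A^k = V Lambda^k V^{-1}, where Lambda^k is just an
# element-wise power of the diagonal entries
k = 5
A_pow = V @ np.diag(lams ** k) @ V_inv
print(np.allclose(A_pow, np.linalg.matrix_power(A, k)))  # True
```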