
Orthogonal bases

Recall from before (LINK) how we can understand discrete-time signals to be vectors in a vector space. It is often very useful to express a signal in a vector space in terms of other signals in that space, for instance to analyze it or to build it back up from simpler pieces. To better understand how all of this works, and to give us some mathematical foundations for it, we will consider the concept of bases.

The basis of a vector space

Suppose we have some vector space $V$, such as $\mathbb{R}^N$ or $\mathbb{C}^N$, i.e., real- or complex-valued finite-length (of length $N$) discrete-time signals. We define a basis for $V$ as a set of vectors $\{b_k\}, b_k\in V$, which span $V$ and are linearly independent.

By spanning $V$, we mean that any vector $x$ in $V$ can be expressed as a linear combination of the vectors in $\{ b_k \}_{k=0}^{N-1}$:

$x ~=~ \sum_{k=0}^{N-1} \alpha_k \, b_k ~=~ \alpha_0\,b_0 + \alpha_1\,b_1 + \cdots + \alpha_{N-1}\,b_{N-1}, \quad \forall \, x\in V, ~\alpha_k\in \mathbb{C}$

By the vectors in the set $\{b_k\}$ being linearly independent, we mean that no vector in that set can be expressed as a linear combination of the others. The number of these spanning and linearly independent vectors is the dimension of the space. Our example spaces $\mathbb{R}^N$ and $\mathbb{C}^N$ are of dimension--you guessed it--$N$.
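As a quick illustration, here is a minimal NumPy sketch (the two vectors $b_0$ and $b_1$ are arbitrary choices for illustration, not taken from the text) that checks linear independence via the rank of the stacked vectors, and expresses an arbitrary $x \in \mathbb{R}^2$ as a linear combination of them:

```python
import numpy as np

# Two illustrative vectors in R^2: neither is a multiple of the other.
b0 = np.array([1.0, 1.0])
b1 = np.array([1.0, -2.0])

# Linear independence check: the matrix with b0, b1 as columns has full rank.
B = np.column_stack([b0, b1])
print(np.linalg.matrix_rank(B))  # 2 -> linearly independent

# Spanning: any x in R^2 is a linear combination alpha_0*b0 + alpha_1*b1.
x = np.array([3.0, 0.0])
alpha = np.linalg.solve(B, x)                          # find the weights
print(np.allclose(alpha[0] * b0 + alpha[1] * b1, x))   # True
```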

The basis matrix

For the sake of cleaner and simpler mathematical expression--as well as the ability to connect the concept of a basis with other linear algebra tools--we can create a matrix with the basis vectors as its columns. If the dimension of the basis is $N$, then this collection of basis vectors will be an $N\times N$ matrix, which we'll call $\textbf{B}$:

$\textbf{B} ~=~ \begin{bmatrix} b_0 | b_1 | \cdots | b_{N-1} \end{bmatrix}$

Recall that we can express any vector in a vector space as a linear combination of the basis vectors. We can put these weights $\{\alpha_k\}$ into an $N\times 1$ column vector:

$a = \begin{bmatrix}\alpha_0 \\ \alpha_1 \\ \vdots \\ \alpha_{N-1} \end{bmatrix}$

With the basis matrix $\textbf{B}$ and the weights vector $a$, we can express the linear combination using a simple matrix multiplication:

$x=\textbf{B}a$

So we see that we can use the basis matrix and weights vector to refer to any vector in the vector space, through the linear combination of the basis vectors (which we can express with a matrix multiplication). Now, it is natural to ask: given some vector $x$ in the vector space, how do we find the weights $a$ that will produce the expression $x=\textbf{B}a$? If we would like to express $x$ as a linear combination of basis vectors, it is important that we know how to find those weights!

Thankfully, it is very straightforward. Simply multiply each side of the equation by the inverse of the basis matrix:

$\begin{align*}x&=\textbf{B}a\\ \textbf{B}^{-1}x&=\textbf{B}^{-1}\textbf{B}a\\ \textbf{B}^{-1}x&=\textbf{I}a\\ \textbf{B}^{-1}x&=a \end{align*}$

The weights vector is simply the inverse of the basis matrix times the vector $x$ (the one we want to express in terms of the basis matrix).
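To make this concrete, here is a hedged NumPy sketch of the basis-matrix machinery (the three basis vectors are invented for illustration). Note that in numerical practice we solve the linear system $\textbf{B}a = x$ rather than explicitly forming $\textbf{B}^{-1}$, which is cheaper and more stable but mathematically equivalent:

```python
import numpy as np

# Three illustrative, linearly independent vectors in R^3,
# stacked as the columns of the basis matrix B.
b0 = np.array([1.0, 0.0, 1.0])
b1 = np.array([1.0, 1.0, 0.0])
b2 = np.array([0.0, 1.0, 1.0])
B = np.column_stack([b0, b1, b2])

x = np.array([2.0, 3.0, 4.0])

# Analysis: a = B^{-1} x, computed by solving B a = x.
a = np.linalg.solve(B, x)

# Synthesis: x = B a reproduces the original vector.
print(np.allclose(B @ a, x))  # True
```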

Orthogonal and orthonormal: special bases

We've already seen that a basis is a special collection of vectors from some vector space: it is a collection that spans the space and is linearly independent. If we add another requirement or two, we end up with two important sub-classes of bases. Suppose the vectors in some basis are not only spanning and linearly independent (which of course they must be, by definition, to form a basis), but are also mutually orthogonal, meaning that the inner product of any basis vector with any other basis vector is 0:

$\langle b_k, b_l \rangle ~=~ 0, \quad \forall\, k\neq l$

If such is the case, then this basis is said to be an orthogonal basis.

So an orthogonal basis is a particular kind of basis, one whose vectors are mutually orthogonal. Among orthogonal bases, there are some whose vectors additionally have unit 2-norms:

$\langle b_k, b_l \rangle ~=~ 0, \quad k\neq l$

$\| b_k \|_2 ~=~ 1, \quad \forall\, k$

Bases with this additional property are known as orthonormal bases.
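As a small check of these definitions, here is a hedged NumPy sketch (the vectors are invented for illustration). It verifies mutual orthogonality with an inner product, and shows that dividing each vector by its 2-norm turns an orthogonal basis into an orthonormal one:

```python
import numpy as np

# An illustrative orthogonal basis for R^2: the inner product of the
# two distinct vectors is 0, but their norms are not 1.
b0 = np.array([1.0, 1.0])
b1 = np.array([1.0, -1.0])
print(np.dot(b0, b1))       # 0.0     -> mutually orthogonal
print(np.linalg.norm(b0))   # 1.414…  -> not unit norm

# Normalizing each vector by its 2-norm yields an orthonormal basis.
u0 = b0 / np.linalg.norm(b0)
u1 = b1 / np.linalg.norm(b1)
print(np.dot(u0, u1), np.linalg.norm(u0))  # 0.0 and 1.0
```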

Basis matrix of an orthonormal basis

Like any other basis, the vectors of an orthonormal basis can be put together to form a basis matrix. Recall how we find the weights $a$ to express some vector $x$ in terms of the basis matrix $\textbf{B}$: $a=\textbf{B}^{-1}x$.

The reason that orthonormal bases are so special is that, in contrast to other bases, their matrix inverses are extremely easy to find. For orthonormal basis matrices,

$\textbf{B}^{-1}=\textbf{B}^H$

The inverse of an orthonormal basis matrix is simply its Hermitian (conjugate) transpose! So for these bases, $a=\textbf{B}^{H}x$.

In linear algebra, a matrix that has the property that its inverse is simply its Hermitian transpose is called a unitary matrix. If the matrix is real-valued, it is called an orthogonal matrix (a bit of unfortunate nomenclature, considering its columns are actually mutually ORTHONORMAL).
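To see this in action, here is a hedged NumPy sketch using the normalized DFT vectors $b_k[n] = e^{-j2\pi kn/N}/\sqrt{N}$, a standard orthonormal basis for $\mathbb{C}^N$ (the choice of this particular basis is ours, for illustration). It confirms that $\textbf{B}^H\textbf{B}=\textbf{I}$, so the weights follow from $a=\textbf{B}^H x$ with no matrix inversion:

```python
import numpy as np

N = 4
n = np.arange(N)

# Columns of B are the normalized DFT vectors exp(-2j*pi*k*n/N)/sqrt(N),
# a standard orthonormal basis for C^N.
B = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Unitary check: the Hermitian transpose really is the inverse, B^H B = I.
print(np.allclose(B.conj().T @ B, np.eye(N)))  # True

# Analysis without inversion: a = B^H x; synthesis x = B a recovers x.
rng = np.random.default_rng(0)
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
a = B.conj().T @ x
print(np.allclose(B @ a, x))  # True
```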

Orthonormal basis signal representation

Putting this all together, we can see the two key aspects of signal representation with orthonormal bases. There is the synthesis side: we can build up any vector $x$ in the vector space through a linear combination of basis vectors. And there is the analysis side: we can find the proper weights of this linear combination by multiplying $x$ by the Hermitian transpose of the basis matrix.

Synthesis: $x ~=~ \textbf{B}a ~=~ \sum_{k=0}^{N-1} \alpha_k \, b_k$

Analysis: $a=\textbf{B}^H x~, \textrm{ or, } ~ \alpha_k ~=~ \langle x, b_k \rangle$
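The following NumPy sketch rounds out the picture (the orthonormal basis here is generated via a QR factorization of a random matrix, an illustrative construction not taken from the text). Analysis computes each weight as an inner product with a basis vector, and synthesis recovers the signal exactly:

```python
import numpy as np

N = 8
rng = np.random.default_rng(0)

# QR factorization of a random matrix: the columns of Q are orthonormal,
# so they serve as a generic orthonormal basis for R^N.
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))

x = rng.standard_normal(N)

# Analysis: a = B^H x, or entrywise alpha_k = <x, b_k> = b_k^H x.
a = Q.conj().T @ x
print(np.allclose(a, [np.vdot(Q[:, k], x) for k in range(N)]))  # True

# Synthesis: x = B a = sum_k alpha_k b_k recovers the signal exactly.
x_hat = Q @ a
print(np.allclose(x_hat, x))  # True
```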
