From our understanding of eigenvalues and eigenvectors we have
discovered several things about our operator matrix,
$A$. We know that if the
eigenvectors of $A$ span
${\mathbb{C}}^{n}$
**and** we know how to express any vector
$x$ in terms of
$\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$, then we have the operator
$A$ all figured out: $A$ acting on $x$ is the same as
$A$ acting on that combination of eigenvectors, and acting on
eigenvectors just scales each one by its eigenvalue — which we know
proves to be fairly easy!

We are still left with two questions that need to be addressed:

- When do the eigenvectors $\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$ of $A$ span ${\mathbb{C}}^{n}$ (i.e., when are $\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$ linearly independent)?
- How do we express a given vector $x$ in terms of $\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$?

Question #1:

When do the eigenvectors
$\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$ of $A$ span
${\mathbb{C}}^{n}$?

If $A$ has $n$ distinct eigenvalues, then the corresponding
eigenvectors are linearly independent — and $n$ linearly independent
vectors in ${\mathbb{C}}^{n}$ must span ${\mathbb{C}}^{n}$.

Aside:

The proof of this statement is not very hard, but is not
really interesting enough to include here. If you wish to
research this idea further, read Strang, G., "Linear Algebra
and Its Applications" for the proof.
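As a quick numerical check of the idea above, we can sketch this in NumPy: if a matrix has distinct eigenvalues, the matrix $V$ whose columns are its eigenvectors has full rank, so the eigenvectors span ${\mathbb{C}}^{n}$. The matrix below is a hypothetical example chosen so its eigenvalues (2, 3, 5) are distinct.

```python
import numpy as np

# Hypothetical 3x3 operator; upper triangular, so its eigenvalues
# are the diagonal entries 2, 3, 5 -- all distinct.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigvals, V = np.linalg.eig(A)  # columns of V are the eigenvectors

# The eigenvectors span C^n exactly when V has full rank.
spans = np.linalg.matrix_rank(V) == A.shape[0]
print(spans)  # True: distinct eigenvalues give independent eigenvectors
```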

Question #2:

How do we express a given vector $x$ in terms of
$\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$?
$$x={\alpha}_{1}{v}_{1}+{\alpha}_{2}{v}_{2}+\dots +{\alpha}_{n}{v}_{n}$$


Let us recall our knowledge of bases and examine the role of $V$:
$$x=V\alpha $$
$$\left(\begin{array}{c}{x}_{1}\\ \vdots \\ {x}_{n}\end{array}\right)=V\left(\begin{array}{c}{\alpha}_{1}\\ \vdots \\ {\alpha}_{n}\end{array}\right)$$
where $\alpha $ is just $x$ expressed in a different basis:
$$x={x}_{1}\left(\begin{array}{c}1\\ 0\\ \vdots \\ 0\end{array}\right)+{x}_{2}\left(\begin{array}{c}0\\ 1\\ \vdots \\ 0\end{array}\right)+\dots +{x}_{n}\left(\begin{array}{c}0\\ 0\\ \vdots \\ 1\end{array}\right)$$
$$x={\alpha}_{1}\left(\begin{array}{c}\vdots \\ {v}_{1}\\ \vdots \end{array}\right)+{\alpha}_{2}\left(\begin{array}{c}\vdots \\ {v}_{2}\\ \vdots \end{array}\right)+\dots +{\alpha}_{n}\left(\begin{array}{c}\vdots \\ {v}_{n}\\ \vdots \end{array}\right)$$
$V$ transforms the coordinates $\alpha $, expressed in the basis $\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$, back into the standard basis; equivalently, ${V}^{-1}$ transforms $x$ from the standard basis into the basis $\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$.
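A short NumPy sketch of this change of basis, using a hypothetical $2\times 2$ matrix: we compute $\alpha ={V}^{-1}x$ and then rebuild $x$ as $V\alpha $, i.e. as a combination of the eigenvectors.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues (5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
_, V = np.linalg.eig(A)        # columns of V are v_1, v_2

x = np.array([4.0, 5.0])
alpha = np.linalg.solve(V, x)  # alpha = V^{-1} x (solve beats forming inv(V))

# Reconstruct x from the eigenvector basis: x = alpha_1 v_1 + alpha_2 v_2
x_rebuilt = V @ alpha
print(np.allclose(x_rebuilt, x))  # True
```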

We can also use the vectors $\left\{{v}_{1},{v}_{2},\dots ,{v}_{n}\right\}$ to represent the output, $b$, of a system:
$$b=Ax=A({\alpha}_{1}{v}_{1}+{\alpha}_{2}{v}_{2}+\dots +{\alpha}_{n}{v}_{n})$$
$$Ax={\alpha}_{1}{\lambda}_{1}{v}_{1}+{\alpha}_{2}{\lambda}_{2}{v}_{2}+\dots +{\alpha}_{n}{\lambda}_{n}{v}_{n}=b$$
$$Ax=\left(\begin{array}{cccc}\vdots & \vdots & & \vdots \\ {v}_{1}& {v}_{2}& \dots & {v}_{n}\\ \vdots & \vdots & & \vdots \end{array}\right)\left(\begin{array}{c}{\lambda}_{1}{\alpha}_{1}\\ \vdots \\ {\lambda}_{n}{\alpha}_{n}\end{array}\right)$$
$$Ax=V\Lambda \alpha $$
$$Ax=V\Lambda {V}^{-1}x$$
where $\Lambda $ is the matrix with the eigenvalues down the diagonal:
$$\Lambda =\left(\begin{array}{cccc}{\lambda}_{1}& 0& \dots & 0\\ 0& {\lambda}_{2}& \dots & 0\\ \vdots & \vdots & \ddots & \vdots \\ 0& 0& \dots & {\lambda}_{n}\end{array}\right)$$
Finally, since this equation holds for every $x$, we can drop the $x$ and are left with a final equation for $A$:
$$A=V\Lambda {V}^{-1}$$
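We can verify the final equation numerically: take a hypothetical matrix, compute its eigendecomposition with NumPy, and confirm that $V\Lambda {V}^{-1}$ reassembles $A$.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues (5 and 2),
# so it is diagonalizable and V is invertible.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)  # Lambda: eigenvalues down the diagonal

# A = V Lambda V^{-1}
A_rebuilt = V @ Lam @ np.linalg.inv(V)
print(np.allclose(A_rebuilt, A))  # True
```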

For our interpretation, recall our key formulas:
$$\alpha ={V}^{-1}x$$
$$b=\sum _{i=1}^{n}{\alpha}_{i}{\lambda}_{i}{v}_{i}$$
We can interpret operating on $x$ with $A$ as:
$$\left(\begin{array}{c}{x}_{1}\\ \vdots \\ {x}_{n}\end{array}\right)\to \left(\begin{array}{c}{\alpha}_{1}\\ \vdots \\ {\alpha}_{n}\end{array}\right)\to \left(\begin{array}{c}{\lambda}_{1}{\alpha}_{1}\\ \vdots \\ {\lambda}_{n}{\alpha}_{n}\end{array}\right)\to \left(\begin{array}{c}{b}_{1}\\ \vdots \\ {b}_{n}\end{array}\right)$$
where the three steps (arrows) in the above illustration represent the following three operations:

- Transform $x$ using ${V}^{-1}$, which yields $\alpha $
- Multiply by $\Lambda $, scaling each ${\alpha}_{i}$ by ${\lambda}_{i}$
- Transform back using $V$, which gives us $b$
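The three steps above can be sketched directly in NumPy, using the same hypothetical matrix as before, and compared against applying $A$ directly:

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, V = np.linalg.eig(A)

x = np.array([1.0, 1.0])

alpha = np.linalg.solve(V, x)  # step 1: transform x with V^{-1} to get alpha
scaled = eigvals * alpha       # step 2: multiply by Lambda (scale alpha_i by lambda_i)
b = V @ scaled                 # step 3: transform back with V to get b

print(np.allclose(b, A @ x))   # True: same result as applying A directly
```

This is exactly why diagonalization is useful: once $\alpha $ is known, applying $A$ is just $n$ scalar multiplications.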