Eigenvalues of a matrix are a fundamental concept in linear algebra that provide valuable information about the behavior of matrices. They play a crucial role in various fields, including physics, engineering, data analysis, and computer science. In this article, we will explore the concept of eigenvalues, understand their significance, and delve into methods for finding the eigenvalues of a matrix.

We assume that you are familiar with basic matrix operations.

## What are the eigenvalues of a matrix?

Eigenvalues of a matrix are scalar values associated with square matrices that reveal important properties of the matrices. For an $n \times n$ matrix $A$, an eigenvalue $\lambda$ is a value for which there exists a non-zero vector $x$, known as an eigenvector, such that $Ax = \lambda x$. Eigenvalues and eigenvectors capture how matrices stretch, rotate, or scale vectors in different directions.
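To make the definition concrete, here is a minimal sketch in plain Python checking that $Ax = \lambda x$ holds for a given matrix and vector (the matrix $A$, the vector $x$, and the value of $\lambda$ below are illustrative choices):

```python
# Illustrative 2x2 example: x = (1, 1) is an eigenvector of A with eigenvalue 3,
# since A x = (3, 3) = 3 * x.
A = [[2, 1], [1, 2]]
x = [1, 1]
lam = 3

# Compute the matrix-vector product A x.
Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

print(Ax == [lam * xi for xi in x])  # True: A x = lambda x
```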

## The characteristic equation of a matrix

One way to find eigenvalues is by solving the characteristic equation. The characteristic equation of a matrix $A$ is given by $$P_A(\lambda):=\det(\lambda I-A):=|\lambda I-A| = 0,$$ where $I$ is the $n \times n$ identity matrix. Solving this equation yields the eigenvalues of the matrix $A$.

Note that $P_A(\cdot)$ is a polynomial, called the characteristic polynomial of the matrix $A$.

The characteristic equation allows us to uncover the eigenvalues by finding the values of $\lambda$ that make the determinant of $\lambda I-A$ equal to zero.

The set of eigenvalues of a matrix $A$ is called the spectrum of $A$ and will be denoted by $\sigma(A)$. Sometimes, computing the eigenvalues of a matrix reduces to solving a simple quadratic equation.
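For instance, for a $2\times 2$ matrix $\begin{pmatrix} a&b\\ c&d\end{pmatrix}$ the characteristic polynomial is $\lambda^2-(a+d)\lambda+(ad-bc)$, so the eigenvalues are the roots of a quadratic. A minimal sketch in plain Python (the helper name `eigenvalues_2x2` and the sample matrix are our own choices):

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] as roots of the quadratic
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    trace = a + d        # coefficient of -lambda
    det = a * d - b * c  # constant term
    disc = cmath.sqrt(trace * trace - 4 * det)  # cmath handles complex roots
    return (trace + disc) / 2, (trace - disc) / 2

# Example: [[2, 1], [1, 2]] has eigenvalues 3 and 1.
l1, l2 = eigenvalues_2x2(2, 1, 1, 2)
print(l1, l2)
```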

## The eigenspace associated with an eigenvalue

By definition, the eigenspace associated with an eigenvalue $\lambda$ of a matrix $A$ is the kernel of the matrix $\lambda I-A$, that is, $\ker(\lambda I-A)$. In the following example, we show how to determine the set of eigenvalues and the associated eigenspace.

### Example

Determine the eigenvalues and their associated eigenspaces of the following matrix $$A=\begin{pmatrix} 2&-1&1\\ -1&2&-1\\ -1&1&0\end{pmatrix}.$$

For any $\lambda\in\mathbb{C}$ we have \begin{align*} \det( \lambda I_3-A)= \begin{vmatrix} \lambda-2&1&-1\\1&\lambda-2&1\\1&-1&\lambda\end{vmatrix}. \end{align*} Recall that the determinant is unchanged if we add to any row a linear combination of the other rows, and likewise for columns. We denote by $R_i$, for $i=1,2,3$, the rows of a matrix of order $3$. Replacing the row $R_1$ by $R_1+R_3$, we obtain \begin{align*} \det( \lambda I_3-A)&= \begin{vmatrix} \lambda-1&0&\lambda-1\\1&\lambda-2&1\\1&-1&\lambda\end{vmatrix}\cr &= (\lambda-1)\begin{vmatrix} 1&0&1\\1&\lambda-2&1\\1&-1&\lambda\end{vmatrix}. \end{align*} In the last determinant, replacing the row $R_2$ by $R_2-R_3$, we obtain \begin{align*} \det( \lambda I_3-A)&= (\lambda-1)\begin{vmatrix} 1&0&1\\0&\lambda-1&1-\lambda\\1&-1&\lambda\end{vmatrix}\cr &=(\lambda-1)^2\begin{vmatrix} 1&0&1\\0&1&-1\\1&-1&\lambda\end{vmatrix}. \end{align*} Finally, replacing the row $R_3$ by $R_3+R_2-R_1$, we obtain \begin{align*} \det( \lambda I_3-A) &=(\lambda-1)^2\begin{vmatrix} 1&0&1\\0&1&-1\\0&0&\lambda-2\end{vmatrix}\cr &= (\lambda-1)^2(\lambda-2). \end{align*} Hence the matrix $A$ possesses two eigenvalues: $\lambda_1=1$ is a double eigenvalue and $\lambda_2=2$ is a simple eigenvalue.
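As a sanity check on the factorization $(\lambda-1)^2(\lambda-2)$, a short plain-Python sketch (the helper names `det3` and `char_poly` are our own) can evaluate $\det(\lambda I_3 - A)$ at a few points:

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

A = [[2, -1, 1], [-1, 2, -1], [-1, 1, 0]]

def char_poly(lam):
    """Evaluate det(lam*I - A) for the matrix A above."""
    return det3([[lam * (i == j) - A[i][j] for j in range(3)] for i in range(3)])

# The factorization (lam-1)^2 (lam-2) vanishes exactly at lam = 1 and lam = 2,
# and equals (3-1)^2 * (3-2) = 4 at lam = 3.
print(char_poly(1), char_poly(2), char_poly(3))  # 0 0 4
```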

Let us denote by $E_1$ the eigenspace associated with the eigenvalue $\lambda_1=1$ and by $E_2$ the eigenspace associated with the eigenvalue $\lambda_2=2$. By definition we have \begin{align*} E_1=\ker(I_3-A),\quad E_2=\ker(2I_3-A). \end{align*} Then \begin{align*} X=\left(\begin{smallmatrix}x\\y\\z\end{smallmatrix}\right)\in E_1&\;\Longleftrightarrow\; A X=X\cr &\;\Longleftrightarrow\;\begin{cases} x-y+z=0\\ -x+y-z=0\\ -x+y-z=0\end{cases} \cr &\;\Longleftrightarrow\; x-y+z=0 \cr &\;\Longleftrightarrow\; X=\left(\begin{smallmatrix}y-z\\y\\z\end{smallmatrix}\right) \cr &\;\Longleftrightarrow\; X=\left(\begin{smallmatrix}y\\y\\0\end{smallmatrix}\right)+\left(\begin{smallmatrix}-z\\0\\z\end{smallmatrix}\right)\cr &\;\Longleftrightarrow\; X=y\left(\begin{smallmatrix}1\\1\\0\end{smallmatrix}\right)+z\left(\begin{smallmatrix}-1\\0\\1\end{smallmatrix}\right)\cr &\;\Longleftrightarrow\; X\in {\rm span}\left\{\left(\begin{smallmatrix}1\\1\\0\end{smallmatrix}\right),\left(\begin{smallmatrix}-1\\0\\1\end{smallmatrix}\right)\right\}. \end{align*} This shows that \begin{align*} E_1={\rm span}\left\{\left(\begin{smallmatrix}1\\1\\0\end{smallmatrix}\right),\left(\begin{smallmatrix}-1\\0\\1\end{smallmatrix}\right)\right\}. \end{align*} Now we calculate $E_2$. Let $X=\left(\begin{smallmatrix}x\\y\\z\end{smallmatrix}\right)$. Then \begin{align*} X\in E_2&\;\Longleftrightarrow\; AX=2X\cr &\;\Longleftrightarrow\;\begin{cases} -y+z=0\\ -x-z=0\\-x+y-2z=0\end{cases} \cr &\;\Longleftrightarrow\; y=z,\quad x=-z \cr &\;\Longleftrightarrow\; X=\left(\begin{smallmatrix}-z\\z\\z\end{smallmatrix}\right)\cr &\;\Longleftrightarrow\; X=z\left(\begin{smallmatrix}-1\\1\\1\end{smallmatrix}\right)\cr &\;\Longleftrightarrow\; X\in {\rm span}\left\{\left(\begin{smallmatrix}-1\\1\\1\end{smallmatrix}\right)\right\}. \end{align*} Hence \begin{align*} E_2={\rm span}\left\{\left(\begin{smallmatrix}-1\\1\\1\end{smallmatrix}\right)\right\}. \end{align*}
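We can confirm the spanning vectors found above by checking $AX=\lambda X$ directly. A minimal plain-Python sketch (the helper name `matvec` is our own):

```python
A = [[2, -1, 1], [-1, 2, -1], [-1, 1, 0]]

def matvec(M, v):
    """Matrix-vector product for a 3x3 matrix."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# Basis vectors of E_1 (eigenvalue 1) and E_2 (eigenvalue 2) found above.
for lam, v in [(1, [1, 1, 0]), (1, [-1, 0, 1]), (2, [-1, 1, 1])]:
    print(matvec(A, v) == [lam * x for x in v])  # True for each pair
```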

## Algebraic Multiplicity and Geometric Multiplicity

Eigenvalues have two notions of multiplicity. The algebraic multiplicity of an eigenvalue is the number of times it appears as a root of the characteristic equation, while the geometric multiplicity is the dimension of the eigenspace associated with that eigenvalue. The geometric multiplicity of an eigenvalue never exceeds its algebraic multiplicity, and the two agree for every eigenvalue exactly when the matrix is diagonalizable.

In the previous example, we have seen that the matrix $A$ has two eigenvalues, $\lambda_1=1$ and $\lambda_2=2$. Since the characteristic polynomial is $(\lambda-1)^2(\lambda-2)$, the algebraic multiplicity of $\lambda_1=1$ is $2$, while the algebraic multiplicity of $\lambda_2=2$ is $1$. We have also proved that the eigenspace $\ker(I-A)$ is spanned by two linearly independent vectors, so $\dim(\ker(I-A))=2$ and the geometric multiplicity of the eigenvalue $\lambda_1=1$ is equal to $2$. On the other hand, the eigenspace $\ker(2I-A)$ is spanned by a single non-zero vector, so $\dim(\ker(2I-A))=1$ and the geometric multiplicity of the eigenvalue $\lambda_2=2$ is equal to $1$.
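The geometric multiplicities can also be checked computationally, since $\dim\ker(\lambda I-A)=3-\operatorname{rank}(\lambda I-A)$. A sketch using exact rational arithmetic from the standard library (the helper names `rank` and `geometric_multiplicity` are our own):

```python
from fractions import Fraction

def rank(M):
    """Rank of a matrix over the rationals via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]  # move pivot row into place
        for i in range(r + 1, rows):     # eliminate entries below the pivot
            factor = M[i][c] / M[r][c]
            M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[2, -1, 1], [-1, 2, -1], [-1, 1, 0]]

def geometric_multiplicity(lam):
    """dim ker(lam*I - A) = 3 - rank(lam*I - A) for the 3x3 matrix A."""
    B = [[lam * (i == j) - A[i][j] for j in range(3)] for i in range(3)]
    return 3 - rank(B)

print(geometric_multiplicity(1), geometric_multiplicity(2))  # 2 1
```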

## Computing Eigenvalues

Several methods can be employed to find eigenvalues efficiently. These include power iteration, which iteratively approximates the dominant eigenvalue, and the QR algorithm, which provides a systematic way to approximate all eigenvalues. Additionally, specialized algorithms such as the Jacobi method and the Arnoldi iteration are used for large or sparse matrices.
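As an illustration, here is a minimal sketch of power iteration in plain Python (the sample matrix is illustrative, and this is not a production implementation):

```python
import math

def power_iteration(A, iters=100):
    """Approximate the dominant eigenvalue of a square matrix by
    repeatedly applying A and normalizing (power iteration)."""
    n = len(A)
    # Starting vector; it must not be orthogonal to the dominant eigenvector.
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # The Rayleigh quotient v^T A v / v^T v estimates the eigenvalue.
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(a * b for a, b in zip(Av, v)) / sum(b * b for b in v)

print(power_iteration([[2, 1], [1, 2]]))  # ~3.0, the dominant eigenvalue
```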

## Applications of Eigenvalues of a Matrix

Eigenvalues have broad applications in different fields. In physics, they help understand the behavior of physical systems, such as quantum mechanics and oscillatory systems. In data analysis and machine learning, eigenvalues are crucial for dimensionality reduction techniques like principal component analysis (PCA). Eigenvalues are also essential in solving differential equations and studying dynamic systems.

## Conclusion

Eigenvalues provide valuable insights into the behavior of matrices and have extensive applications across various fields. By understanding the concept of eigenvalues and employing methods for their computation, we gain a deeper understanding of the underlying structure and properties of matrices. The ability to find and analyze eigenvalues empowers us to tackle complex problems, make accurate predictions, and unlock the potential of linear algebra in diverse domains.
