In linear algebra, a scalar λ is called an **eigenvalue** (in some older texts, a **characteristic value**) of a linear mapping *A* if there exists a nonzero vector *x* such that *Ax* = λ*x*. The vector *x* is called an eigenvector.

In matrix theory, an element λ in the underlying ring *R* of a square matrix *A* is called a **right eigenvalue** if there exists a nonzero column vector *x* such that *Ax* = λ*x*, or a **left eigenvalue** if there exists a nonzero row vector *y* such that *yA* = *y*λ. If *R* is commutative, the left eigenvalues of *A* are exactly the right eigenvalues of *A* and are just called **eigenvalues**. If *R* is not commutative, e.g. the quaternions, they may be different.
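In the commutative case this coincidence can be seen concretely: *yA* = λ*y* is equivalent to *Aᵀyᵀ* = λ*yᵀ*, and *A* and *Aᵀ* share the same characteristic polynomial, hence the same eigenvalues. A minimal numerical sketch with NumPy, using an arbitrary example matrix:

```python
import numpy as np

# Over a commutative ring (here the reals), left and right eigenvalues agree:
# y A = lam y  is equivalent to  A^T y^T = lam y^T, and A and A^T have the
# same characteristic polynomial, hence the same eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

right = np.sort(np.linalg.eigvals(A))    # right eigenvalues of A
left = np.sort(np.linalg.eigvals(A.T))   # left eigenvalues of A

print(np.allclose(right, left))  # True
```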


## Multiplicity

Suppose *A* is a square matrix over a commutative ring. The **algebraic multiplicity** (or simply **multiplicity**) of an eigenvalue λ of *A* is the multiplicity of the factor *t* − λ in the characteristic polynomial of *A*. The **geometric multiplicity** of λ is the nullity of (λI − *A*), that is, the dimension of the space of eigenvectors for λ.

An eigenvalue of algebraic multiplicity 1 is called a *simple eigenvalue*.
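The two multiplicities can differ. A minimal sketch with NumPy, using a hypothetical defective matrix as the example: the algebraic multiplicity is counted from the eigenvalue list, and the geometric multiplicity is computed as the nullity of (λI − *A*).

```python
import numpy as np

# A defective matrix: eigenvalue 2 has algebraic multiplicity 2
# but geometric multiplicity 1 (only one independent eigenvector).
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

lam = 2.0
n = A.shape[0]

# Algebraic multiplicity: how often lam appears among the eigenvalues.
eigenvalues = np.linalg.eigvals(A)
alg_mult = int(np.sum(np.isclose(eigenvalues, lam)))

# Geometric multiplicity: nullity of (lam*I - A) = n - rank(lam*I - A).
geo_mult = n - np.linalg.matrix_rank(lam * np.eye(n) - A)

print(alg_mult)  # 2
print(geo_mult)  # 1
```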

## Spectrum

In functional analysis, the spectrum of a linear operator *A* is the set of scalars ν such that νI − *A* is not invertible. If the underlying Hilbert space is finite-dimensional, then the spectrum of *A* is exactly the set of eigenvalues of *A*.
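In the finite-dimensional case, non-invertibility of νI − *A* means its determinant vanishes, which is exactly the eigenvalue condition. A small numerical check of this, with an arbitrary example matrix:

```python
import numpy as np

# In finite dimensions the spectrum is the set of eigenvalues:
# for each eigenvalue nu, the matrix nu*I - A is singular.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

for nu in np.linalg.eigvals(A):
    M = nu * np.eye(2) - A
    # Singular matrix => determinant is (numerically) zero.
    print(np.isclose(np.linalg.det(M), 0.0))  # True
```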

## Multiset of eigenvalues

Occasionally, in an article on matrix theory, one may read a statement like:

> The eigenvalues of a matrix *A* are 4, 4, 3, 3, 3, 2, 2, 1.

That is, the eigenvalues are listed as a multiset, with each eigenvalue repeated according to its algebraic multiplicity.

This style is used because algebraic multiplicity is the key to many mathematical proofs in matrix theory.

## Trace and Determinant

Suppose the eigenvalues of a matrix *A* are λ₁, λ₂, ..., λₙ, counted with algebraic multiplicity. Then the trace of *A* is λ₁ + λ₂ + ... + λₙ and the determinant of *A* is λ₁λ₂...λₙ. These two are very important concepts in matrix theory.