In linear algebra, an orthogonal matrix is a real square matrix G whose transpose equals its inverse; that is,
- G G^T = G^T G = I_n,
where I_n is the n-by-n identity matrix.
A real square matrix is orthogonal if and only if its columns form an orthonormal basis of R^n with the ordinary Euclidean dot product, which is the case if and only if its rows form an orthonormal basis of R^n.
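As a quick numerical illustration (a sketch using numpy, which is our choice of tool here, not part of the original text), a 2-by-2 rotation matrix satisfies the defining property, and its columns are orthonormal:

```python
import numpy as np

# A 2x2 rotation matrix -- a standard example of an orthogonal matrix.
theta = 0.3
G = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# G^T G = G G^T = I: equivalently, both the columns and the rows
# of G form orthonormal bases of R^2.
assert np.allclose(G.T @ G, np.eye(2))
assert np.allclose(G @ G.T, np.eye(2))
```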
Geometrically, orthogonal matrices describe linear transformations of R^n that preserve angles and lengths, such as rotations and reflections. They are compatible with the Euclidean inner product in the following sense: if G is orthogonal and x and y are vectors in R^n, then
- <Gx, Gy> = <x, y>.
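This invariance can be checked numerically (a small numpy sketch; the random-seed setup is ours, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# QR factorization of a random matrix yields an orthogonal factor G.
G, _ = np.linalg.qr(rng.standard_normal((3, 3)))

x = rng.standard_normal(3)
y = rng.standard_normal(3)

# <Gx, Gy> = <x, y>: the inner product is preserved,
# hence so are lengths and angles.
assert np.isclose((G @ x) @ (G @ y), x @ y)
assert np.isclose(np.linalg.norm(G @ x), np.linalg.norm(x))
```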
The inverse of every orthogonal matrix is again orthogonal, as is the product of two orthogonal matrices. This shows that the set of all n-by-n orthogonal matrices forms a group. It is a Lie group of dimension n(n-1)/2, called the orthogonal group and denoted O(n).
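The closure properties can be verified directly (a numpy sketch; the helper `random_orthogonal` is our own illustrative name):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_orthogonal(n, rng):
    # The QR factorization of a Gaussian matrix gives an orthogonal factor.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return Q

A = random_orthogonal(4, rng)
B = random_orthogonal(4, rng)

# The product of two orthogonal matrices is orthogonal...
assert np.allclose((A @ B).T @ (A @ B), np.eye(4))
# ...and so is the inverse, which simply equals the transpose.
assert np.allclose(np.linalg.inv(A), A.T)
```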
The determinant of any orthogonal matrix is 1 or -1. That can be shown as follows:
- 1 = det(I) = det(GG^T) = det(G) det(G^T) = (det(G))^2.
All eigenvalues of an orthogonal matrix, even the complex ones, have absolute value 1. Eigenvectors for different eigenvalues are orthogonal (with respect to the standard Hermitian inner product on C^n).
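Both spectral facts are easy to observe numerically (again a numpy sketch with a random orthogonal matrix of our choosing):

```python
import numpy as np

rng = np.random.default_rng(2)
G, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# det(G) is +1 or -1.
assert np.isclose(abs(np.linalg.det(G)), 1.0)

# Every eigenvalue, possibly complex, lies on the unit circle.
eigvals = np.linalg.eigvals(G)
assert np.allclose(np.abs(eigvals), 1.0)
```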
If G is orthogonal, then one can always find an orthogonal matrix P such that P^T G P is block diagonal, where each block is either a 2-by-2 rotation matrix or a 1-by-1 block equal to 1 or -1.
If A is an arbitrary m-by-n matrix of rank n, we can always write A = QR, where Q is an m-by-n matrix with orthonormal columns and R is an invertible n-by-n upper triangular matrix; this is the QR decomposition.
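The QR decomposition is available directly in numpy (a sketch; the specific random matrix is our own example, and a Gaussian matrix has full column rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))  # m-by-n with m=5, n=3; rank 3 almost surely

# A = QR with Q's columns orthonormal and R upper triangular.
Q, R = np.linalg.qr(A)
assert Q.shape == (5, 3) and R.shape == (3, 3)
assert np.allclose(Q.T @ Q, np.eye(3))   # orthonormal columns
assert np.allclose(Q @ R, A)             # reconstructs A (up to rounding)
assert np.allclose(R, np.triu(R))        # R is upper triangular
```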
The complex analogue of an orthogonal matrix is a unitary matrix: a complex square matrix whose conjugate transpose is its inverse.
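For instance (a minimal numpy sketch; the particular matrix is our own example):

```python
import numpy as np

# A unitary matrix: its conjugate transpose is its inverse.
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)
assert np.allclose(U.conj().T @ U, np.eye(2))
```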