If the eigenvectors of a matrix A are independent, that is a happy property, because then A can be diagonalized by a matrix S whose columns are the eigenvectors of A:

S^{-1}AS = Λ, that is, A = SΛS^{-1}

where Λ is the diagonal matrix of eigenvalues.
Why is this a happy property of A? Because I can find powers of A easily. A^{10} is not a big deal, because Λ is a diagonal matrix and the power of a diagonal matrix is quite simple:

A^{10} = SΛ^{10}S^{-1}

Then why would I want to compute a power of A? For the same reason I want eigenvectors in the first place: the eigenvectors form a basis, and on each of these basis directions the matrix acts as a single scalar, its eigenvalue. I repeat this again: this is the happy point, a matrix becomes a scalar. What can be simpler than a scalar value?
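Here is a minimal numpy sketch of this idea (the matrix A below is my own made-up example, not one from the book): it diagonalizes A and checks that SΛ^{10}S^{-1} agrees with plain repeated multiplication.

```python
import numpy as np

# Made-up 2x2 matrix with two independent eigenvectors (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)    # columns of S are eigenvectors of A
Lam10 = np.diag(eigvals ** 10)   # Λ^10: just the 10th power of each diagonal entry

A10_via_eig = S @ Lam10 @ np.linalg.inv(S)
A10_direct = np.linalg.matrix_power(A, 10)

print(np.allclose(A10_via_eig, A10_direct))  # True
```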
But this is only possible when the columns of S are independent, because S^{-1} must exist.
Now I come back to my first question: is the multiplicity of a λ related to the number of its eigenvectors? This time I found that these notions have names:
- Geometric multiplicity (GM): the number of independent eigenvectors belonging to a λ
- Algebraic multiplicity (AM): how many times that λ is repeated as a root of the characteristic polynomial
There is no rigid relationship between them; there is only the inequality GM <= AM. For example, take a 4x4 matrix where one eigenvalue has AM = 3 (so there are only two distinct λ's). The GM of that eigenvalue does not have to be 3; it can be 1, 2, or 3, as the sketch below shows.
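Here is a numerical sketch with a made-up matrix (not one from the book): λ = 5 appears three times in the characteristic polynomial (AM = 3), but dim null(A - 5I) turns out to be 1 (GM = 1).

```python
import numpy as np

# Made-up 4x4 example: λ = 5 has AM = 3 but GM = 1; λ = 2 has AM = GM = 1.
A = np.array([[5.0, 1.0, 0.0, 0.0],
              [0.0, 5.0, 1.0, 0.0],
              [0.0, 0.0, 5.0, 0.0],
              [0.0, 0.0, 0.0, 2.0]])

lam = 5.0
n = A.shape[0]

# GM(λ) = dim null(A - λI) = n - rank(A - λI)
gm = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(gm)  # 1, even though AM(5) = 3
```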
By the way, this S is a special matrix called a Hadamard matrix. I wrote a blog entry about how to compute this matrix. It is so special: it is symmetric, its columns are mutually orthogonal, and it contains only 1's and -1's.
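For reference, here is one way I know to build the 4x4 Hadamard matrix, Sylvester's construction via a Kronecker product (scipy also provides scipy.linalg.hadamard):

```python
import numpy as np

H2 = np.array([[1, 1],
               [1, -1]])

# Sylvester's construction: H4 = H2 ⊗ H2 (Kronecker product).
H4 = np.kron(H2, H2)

print(H4)         # symmetric, entries are only 1 and -1
print(H4.T @ H4)  # 4 * I: the columns are mutually orthogonal
```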
The identity matrix is another example where a repeated eigenvalue still has a full set of independent eigenvectors. The eigenvalues of the 4x4 identity matrix are λ = 1, 1, 1, 1, and the standard basis vectors (1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1) are four independent eigenvectors. (In fact, every nonzero vector is an eigenvector of I.)
It took me a day to realize this, but Marc pointed it out immediately.
Still, I tend to think that one λ value corresponds to one eigenvector in the typical case. The number of independent eigenvectors for a λ is the dimension of the nullspace of A - λI, while the multiplicity of the eigenvalue comes from the characteristic polynomial det(A - λI). But I feel I need to study more to understand this relationship deeply; a small worked example follows.
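To see the two notions side by side in the smallest case (my own example, not one from the book): for J = [[5, 1], [0, 5]], the characteristic polynomial is det(J - λI) = (5 - λ)^2, so λ = 5 has AM = 2. But J - 5I = [[0, 1], [0, 0]] has rank 1, so the nullspace of J - 5I has dimension 2 - 1 = 1, which means GM = 1. The root multiplicity and the nullspace dimension are genuinely different quantities, though GM <= AM always holds.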
Anyway, the interesting thing to me is that one eigenvalue can have multiple corresponding independent eigenvectors.
References:
Gilbert Strang, Introduction to Linear Algebra, 4th Ed.