Next, let's think about vectors. I assume the reader knows a bit about matrix multiplication. When a matrix is applied to a vector, it generates a new vector. For example, a matrix can rotate a vector, or enlarge or shrink it. We say that we apply a matrix A to a vector x. The matrix A could be a rotation operation, or anything else. The result of this application is a new vector b.
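To make this concrete, here is a minimal sketch of ``apply a matrix A to a vector x, get a new vector b.'' The particular matrix is my own example, not from the text: a 90-degree counterclockwise rotation.

```python
# A small sketch of "A applied to x gives a new vector b".
# The matrix below is a hypothetical example: rotation by 90 degrees
# counterclockwise, which sends [1, 0] to [0, 1].

def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

A = [[0, -1], [1, 0]]  # 90-degree rotation matrix
x = [1, 0]
b = mat_vec(A, x)
print(b)  # → [0, 1]
```

The same `mat_vec` helper works for any matrix, whether it rotates, magnifies, or shrinks.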
Written this way, it looks like scalar multiplication. But usually A is quite complex, and it is hard to understand what it does. However, if there is a scalar λ and a vector x' such that
A x' = λ x'.
then we can replace the complex matrix A with the scalar value λ. I think this sentence holds the whole idea of this topic. Usually it is not possible to find such a λ for an arbitrary vector. But we have a chance to find a specific x' and its corresponding scalar value λ.
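We can check the relation A x' = λ x' numerically. The matrix and eigenpair below are a hypothetical example of mine: A = [[2, 1], [1, 2]] has the eigenvector x' = [1, 1] with eigenvalue λ = 3.

```python
# Verify A x' = lambda x' for one concrete (assumed) example:
# A = [[2, 1], [1, 2]], x' = [1, 1], lambda = 3.

def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

A = [[2, 1], [1, 2]]
x_prime = [1, 1]
lam = 3

lhs = mat_vec(A, x_prime)             # A x'
rhs = [lam * xj for xj in x_prime]    # lambda x'
print(lhs, rhs)  # → [3, 3] [3, 3] -- the same vector
```

Applying the whole matrix to x' is the same as multiplying x' by the single number 3, which is exactly the replacement the text describes.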
When I saw this, I said, ``Wow.'' This is a powerful idea. If I see a 100x100 matrix, I have no idea what this matrix does unless it has a very specific form (e.g., I can tell what an identity matrix does). But if we can find such a pair, then the matrix A simply multiplies the vector x' by λ.
Here, A does something to a vector: for instance, a rotation, some other transformation, or a magnification. The substance of the matrix is in λ and x'. These λ and x' are called the characteristic properties of the matrix: the eigenvalue λ and the eigenvector x'.
Now you see why mathematicians think about eigenvalues and eigenvectors. A matrix A is usually so complex that nobody can understand it at a glance. But if we can find this simpler form, scalar multiplication, we can see ``an aspect'' of A. This is called eigenanalysis.
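For a 2x2 matrix, eigenanalysis can even be done by hand: the eigenvalues are the roots of the characteristic polynomial λ² − trace(A)·λ + det(A) = 0. The sketch below assumes the eigenvalues are real (the discriminant is non-negative); the test matrix is the same hypothetical example as before.

```python
# A sketch of eigenanalysis for a 2x2 matrix via the characteristic
# polynomial: lambda^2 - trace(A) * lambda + det(A) = 0.
import math

def eigenvalues_2x2(A):
    """Return the two eigenvalues of a 2x2 matrix (assumes they are real)."""
    (a, b), (c, d) = A
    trace = a + d
    det = a * d - b * c
    disc = math.sqrt(trace * trace - 4 * det)  # assumes real eigenvalues
    return (trace + disc) / 2, (trace - disc) / 2

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # → (3.0, 1.0)
```

For larger matrices one would use a library routine instead of the closed-form quadratic, but the idea is the same: reduce the matrix to the scalars λ that describe it.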
By the way, an n by n matrix A (n > 1) usually has many eigenvalues and eigenvectors. If a huge matrix always had just one eigenvalue and one eigenvector, that would sound simple, but the world is not so easy. Please note that there can be many eigenvalues for one matrix; this is why I say ``an aspect'' of a matrix.
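The point above can be illustrated with the same small example matrix (my own choice): even a 2x2 matrix already has two distinct eigenvalue/eigenvector pairs, each revealing a different ``aspect'' of A.

```python
# One matrix, several eigenpairs: A = [[2, 1], [1, 2]] has
# eigenvalue 3 with eigenvector [1, 1], and eigenvalue 1 with [1, -1].

def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

A = [[2, 1], [1, 2]]
pairs = [(3, [1, 1]), (1, [1, -1])]  # (eigenvalue, eigenvector)

for lam, x in pairs:
    # A x equals lambda x for each pair.
    print(mat_vec(A, x), [lam * xj for xj in x])
```

Each pair describes one direction in which A acts like a simple scalar; together they describe the whole matrix.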