Why is A^{T}A invertible? (3) Linear Algebra

By the way, one of my friends told me my blog is totally incomprehensible. If you share my friend's opinion, and if there is something I could improve, please let me know. Thanks!

Explanation via linear combination and basis transformation

Let me explain this from another point of view. This explanation doesn't need the null space, but it lacks a rigorous proof. However, we already have a proof via the null space, so I think this is correct. For me, this is the more intuitive explanation.

From the independence of the columns of A, any product A x is a linear combination of those independent columns: A x = x_1 a_1 + x_2 a_2 + ... + x_n a_n, where a_1, ..., a_n are the columns of A. You can also see how the resulting columns of A^{T}A are computed: A^{T}A = A^{T} [a_1 a_2 ... a_n] = [A^{T}a_1 A^{T}a_2 ... A^{T}a_n].
Since A^{T} is a linear operator, A^{T}A x = x_1 A^{T}a_1 + x_2 A^{T}a_2 + ... + x_n A^{T}a_n.
This becomes 0 for a nonzero x only if the vectors A^{T}a_* degenerate (become linearly dependent). However, this has the same form as a basis transformation, where each column is a basis vector: the rows of A^{T} are independent (since the columns of A are independent, and transposing A exchanges columns with rows), and each independent row vector is projected onto the independent columns. The independence of the column vectors is preserved unless the A^{T}a_* degenerate.
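The column-wise computation above can be checked numerically. This is a minimal sketch with NumPy, using a small hypothetical matrix A whose columns are independent; it verifies that the i-th column of A^{T}A is A^{T}a_i, and that A^{T}A x is the corresponding linear combination of those columns.

```python
import numpy as np

# A 3x2 matrix with independent columns (hypothetical example values).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

AtA = A.T @ A

# Column-wise view: the i-th column of A^T A is A^T a_i.
cols = np.column_stack([A.T @ A[:, i] for i in range(A.shape[1])])
print(np.allclose(AtA, cols))  # True

# Linearity: A^T A x = x_1 A^T a_1 + x_2 A^T a_2.
x = np.array([3.0, -2.0])
lhs = AtA @ x
rhs = x[0] * (A.T @ A[:, 0]) + x[1] * (A.T @ A[:, 1])
print(np.allclose(lhs, rhs))  # True
```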

Unfortunately, I have no proof that this degeneration cannot happen. But what happens here is that an independent basis is transformed into another independent basis, and I don't see how this independence could break under this condition.
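While this is not a proof, the claim that independence is preserved can at least be observed numerically. The sketch below, assuming a random tall matrix (whose columns are independent with probability 1), checks that A^{T}A has the same rank as A and that its inverse exists.

```python
import numpy as np

rng = np.random.default_rng(0)
# Random tall matrix; its columns are independent with probability 1.
A = rng.standard_normal((5, 3))
AtA = A.T @ A

# Independence is preserved: A^T A has full rank, hence an inverse.
print(np.linalg.matrix_rank(A))    # 3
print(np.linalg.matrix_rank(AtA))  # 3
inv = np.linalg.inv(AtA)           # succeeds, no LinAlgError raised
print(np.allclose(AtA @ inv, np.eye(3)))  # True
```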

By the way, this matrix is called the metric tensor. My favorite explanation of it is in Chapter 1 of the book ``Mathematical sciences of graphics'' by Koukichi Sugihara. (I have the first edition, third printing. It has a typo in Equation 1.20 on page 19, but the equations leading up to it are correct; only the resulting equation has the typo.)

Next, I would like to add an appendix explaining the null space, the column space, and the existence of the inverse. It might not be strictly necessary, so the main story about A^{T}A ends here.
