Showing posts from December, 2010

Why is A^{T}A invertible? (4) Linear Algebra

Appendix A: null space and column space. We use the null space in the proof, so let me explain the null space a bit. If you already know about the null space, you can of course skip this entry. The null space of a matrix A is the set of vectors x that satisfy A x = 0. Let me show you an example of a square matrix A that has a nontrivial null space. When x \neq 0, the following x is a solution; therefore, this x is in the null space of A. When a is a scalar with a \neq 0, a x is also a solution, so these vectors are in the null space as well. In this example, the matrix is singular (and square). If the matrix is not singular, the only solution is x = 0: a non-singular square matrix has an inverse, so A x = 0 implies x = A^{-1} 0 = 0. In this case we say the null space is trivial. Let me show you another example, this time a rectangular matrix A with a nontrivial null space. The solution is the same as in the last example. By the way, this is all about the definition of the null space. I could fin…
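As a small numerical sketch of the definition above (the matrices here are my own illustrative examples, not taken from the post), we can check that a singular square matrix and a rectangular matrix each admit a nonzero x with A x = 0:

```python
import numpy as np

# A singular 2x2 matrix: the second row is twice the first.
A_square = np.array([[1.0, 2.0],
                     [2.0, 4.0]])
x = np.array([2.0, -1.0])      # a nonzero vector in the null space
print(A_square @ x)            # -> [0. 0.]

# Any nonzero scalar multiple a*x is also in the null space.
print(A_square @ (3.0 * x))    # -> [0. 0.]

# A 2x3 rectangular matrix also has a nontrivial null space
# (more columns than rows guarantees one exists).
A_rect = np.array([[1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])
y = np.array([1.0, -2.0, 1.0])
print(A_rect @ y)              # -> [0. 0.]
```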

Why is A^{T}A invertible? (3) Linear Algebra

By the way, one of my friends told me my blog is totally incomprehensible. If you share my friend's opinion, and there is something I could improve, please let me know. Thanks! Explanation from linear combination and basis transformation. Let me explain this from another point of view. This explanation doesn't need the null space, but it lacks a rigorous proof. However, we already have the proof via the null space, so I think this view is correct; for me it is the more intuitive explanation. From the independence of the columns of A, the independent vectors are organized as follows. You can also see how the resulting columns are computed. Since A^{T} is a linear operator, this becomes 0 when A^{T} a_* degenerates. However, this has the same form as a basis transformation, where each column is a basis vector. The rows of A^{T} are independent (since the columns of A are independent, and transposing A exchanges columns with rows). Each independent row vector is projected to the each ind…
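The linear-combination view above can be sketched numerically (with a matrix of my own choosing, not one from the post): each column of A^{T}A is A^{T} applied to the corresponding column of A, and when the columns of A are independent the resulting columns stay independent.

```python
import numpy as np

# A 3x2 matrix with independent columns (an illustrative assumption).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

AtA = A.T @ A

# Column j of A^T A is A^T applied to column j of A.
for j in range(A.shape[1]):
    assert np.allclose(AtA[:, j], A.T @ A[:, j])

# Independence of the columns of A carries over: A^T A has the same rank.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(AtA))  # -> 2 2
```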

Why is A^{T}A invertible? (2) Linear Algebra

Why A^{T}A has an inverse. Let me explain why A^{T}A has an inverse when the columns of A are independent. First, if a matrix is n by n and all its columns are independent, then it is a square full-rank matrix, and therefore it has an inverse. So the interesting case is when A is an m by n rectangular matrix. Strang's explanation is based on the null space. The null space and the column space are fundamentals of linear algebra, and this explanation is simple and clear. However, I do not recall the null space being explained in my linear algebra class when I was a university student. Maybe I was careless. I regret that... Explanation based on the null space. This explanation is based on Strang's book; the column space and the null space are the main characters. Let's start. Assume x is in the null space of A. The matrices (A^{T}A) and A share the same null space, as follows: if x is in the null space of A, then x is also in the null spa…
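The key identity behind Strang's null-space argument is x^{T}(A^{T}A)x = \|Ax\|^2: if A^{T}A x = 0 then \|Ax\|^2 = 0, so Ax = 0, which gives the non-trivial inclusion N(A^{T}A) \subseteq N(A). A quick numerical check of the identity (using a random matrix of my own, not an example from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
# A random tall matrix, purely for illustration.
A = rng.standard_normal((5, 3))
x = rng.standard_normal(3)

# The identity behind the proof: x^T (A^T A) x = ||A x||^2.
lhs = x @ (A.T @ A) @ x
rhs = np.linalg.norm(A @ x) ** 2
print(np.isclose(lhs, rhs))  # -> True

# The easy direction: if A x = 0 then A^T A x = A^T 0 = 0,
# so N(A) is always contained in N(A^T A).
```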

Why is A^{T}A invertible? (1) Linear Algebra

We see the matrix product A^{T}A frequently, for example in the least squares method. If the matrix A has independent columns, A^{T}A has the following great properties: it is square, symmetric, and invertible. I am interested in why A^{T}A has these properties and would like to explain that. The explanation is based on Gilbert Strang's Introduction to Linear Algebra, 4th edition. It uses the null space and the quadratic form, and it is an elegant explanation. But if I just followed Strang's explanation, I would only need to write: see Strang's book. So I would like to add a geometric explanation. I don't have a rigorous proof; it is just a bit more intuitive for me. Still, I hope you can enjoy it. Even if my explanation is bad, you can still read Strang's explanation. Properties of A^{T}A. We see the product A^{T}A frequently, for example in projection operations and least squares computation (actually, those two are the…
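The three properties listed above are easy to verify numerically. A sketch with a random tall matrix of my own (a Gaussian random matrix has independent columns with probability 1, so this is an assumption for illustration, not the post's example):

```python
import numpy as np

rng = np.random.default_rng(1)
# A 6x3 matrix; its columns are independent with probability 1.
A = rng.standard_normal((6, 3))
AtA = A.T @ A

print(AtA.shape)                    # square: (3, 3)
print(np.allclose(AtA, AtA.T))      # symmetric: True
print(np.linalg.matrix_rank(AtA))   # full rank 3, hence invertible
```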