The matrix product *A^{T}A* appears frequently, for example, in the least squares method. If matrix *A* has independent columns, this matrix has the following great properties: it is square, symmetric, and invertible. I am interested in why *A^{T}A* has these properties and would like to explain that. The explanation is based on Gilbert Strang's Introduction to Linear Algebra, 4th edition. It uses the null space and the quadratic form, and it is an elegant explanation. But if I just followed Strang's explanation, I would only need to write: see Strang's book. So I would like to add a geometric explanation. I don't have a rigorous proof; it is just a little intuitive for me. Still, I hope you can enjoy it. Even if my explanation is bad, you can still read Strang's explanation.

## Properties of *A^{T}A*

We can see the matrix product *A^{T}A* frequently. This pattern appears in projection operations and in least squares computations (actually, those two are the same).
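As a concrete illustration (my own sketch, not from the post; the matrix *A* and vector *b* here are just example values), the projection of a vector *b* onto the column space of *A* is built from *A^{T}A*:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])  # independent columns
b = np.array([6.0, 0.0, 0.0])

# Projection matrix onto the column space of A:
#   P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T
p = P @ b  # projection of b onto the column space

# The error b - p is orthogonal to every column of A.
print(np.allclose(A.T @ (b - p), 0))  # True
```

The orthogonality of the error *b − p* to the columns of *A* is exactly what makes this the least squares fit.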

In general, we can make the following assumptions.

- *A* is not a square matrix and we would like to find the least squares solution. In this case, the matrix is m by n with m > n. This means the number of equations is larger than the number of unknowns. When does this happen? For example, you observe some signals or take some statistics, like taking a photograph to analyze something. You sometimes take the same sample repeatedly, but the results differ because of observation error. In this case, you have more samples than parameters. In the m < n case, you can still take more samples. So, m by n with m > n is the general case.
- We can make the columns of *A* independent. If they are not, we can discard the dependent columns. If we don't know how many parameters the model has and remove too many columns, we might get a wrong answer. This is rather a technical issue; it is just possible. This only says we can do a kind of best effort.
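The overdetermined case above can be sketched numerically (a minimal example of my own, assuming independent columns): the normal equations *A^{T}A x̂ = A^{T}b* give the least squares solution.

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns (m = 4 > n = 2).
# Fit a line y = c0 + c1 * t through noisy observations.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(t), t])  # columns are independent

# Normal equations: (A^T A) x_hat = A^T b
AtA = A.T @ A  # 2 by 2: square and symmetric
x_hat = np.linalg.solve(AtA, A.T @ b)

# np.linalg.lstsq computes the same least squares solution.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ref))  # True
```

Because *A^{T}A* is square and invertible here, `np.linalg.solve` can be applied to it directly, which is the whole point of forming this product.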

If the second assumption, independent columns, holds, *A^{T}A* has the following nice properties:

- Square
- Symmetric
- Existence of inverse

Therefore, this matrix is useful.

I would like to stress that the column independence of *A* is necessary. Otherwise, there is no inverse of *A^{T}A*. You can easily see this: if the columns are dependent, then *Ax = 0* for some nonzero *x*, and therefore *A^{T}Ax = 0* as well, so *A^{T}A* has a nontrivial null space and cannot have an inverse.
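Here is a small numerical check of this necessity (my own example; the specific matrix is made up for illustration):

```python
import numpy as np

# The columns of A are dependent: column 2 is twice column 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

AtA = A.T @ A
# A^T A inherits the dependency, so it is singular.
print(np.linalg.det(AtA))  # 0 (up to floating point)

# A nonzero vector in the null space of A is also
# in the null space of A^T A, so no inverse can exist.
x = np.array([2.0, -1.0])
print(np.allclose(A @ x, 0), np.allclose(AtA @ x, 0))  # True True
```

Trying `np.linalg.inv(AtA)` on this matrix would raise a `LinAlgError`, which matches the claim that independence is necessary.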

It is rather easy to see why *A^{T}A* is a square matrix: by the rule of matrix multiplication, [n by m] [m by n] is [n by n].

The following shows the symmetric property: *(A^{T}A)^{T} = A^{T}(A^{T})^{T} = A^{T}A*, where *(A^{T})^{T} = A*, since the transpose of the transpose is the original matrix. The remaining question is the existence of the inverse.
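These two properties, and the invertibility that the rest of the argument is after, can be checked numerically (a sketch of my own; a random Gaussian matrix has independent columns with probability one):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # m = 5 > n = 3, independent columns

AtA = A.T @ A
print(AtA.shape)                # (3, 3): square
print(np.allclose(AtA, AtA.T))  # True: symmetric

# With independent columns, A^T A is positive definite:
# all eigenvalues are strictly positive, so the inverse exists.
eigvals = np.linalg.eigvalsh(AtA)
print(np.all(eigvals > 0))      # True
```

The positive eigenvalues are the quadratic form argument in numerical clothing: *x^{T}A^{T}Ax = ||Ax||^2 > 0* for every nonzero *x* when the columns of *A* are independent.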
