In: Advanced Math
Gram method for computing least squares approximate solution. Algorithm 12.1 in the textbook uses the QR factorization to compute the least squares approximate solution x̂ = A†b, where the m × n matrix A has linearly independent columns. It has a complexity of 2mn² flops. In this exercise we consider an alternative method: First, form the Gram matrix G = AᵀA and the vector h = Aᵀb; then compute x̂ = G⁻¹h (using Algorithm 11.2 in the textbook). What is the complexity of this method? Compare it to Algorithm 12.1. Remark. You might find that the Gram algorithm appears to be a bit faster than the QR method, but the factor is not large enough to have any practical significance. The idea is useful in situations where G is partially available and can be computed more efficiently than by multiplying A and its transpose.
The complexity of this method is approximately mn² flops. To see this, count the flops of each step: forming the Gram matrix G = AᵀA requires mn² flops (each of its n² entries is an inner product of m-vectors costing 2m flops, but G is symmetric, so only about n²/2 entries need to be computed); forming h = Aᵀb requires 2mn flops; and solving Gx̂ = h with Algorithm 11.2 requires 2n³ flops. The total is mn² + 2mn + 2n³ flops, which for m ≫ n is dominated by the mn² term. This is about half the 2mn² flops of the QR-based Algorithm 12.1, but as the remark notes, a factor of two has no practical significance.
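The two methods can be compared numerically. Below is a minimal sketch (the dimensions and random data are illustrative, not from the exercise) that computes x̂ both ways and checks that they agree:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 5                 # tall matrix: m >> n, as assumed above
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# QR-based method (Algorithm 12.1): A = QR, then x̂ = R^{-1} Q^T b
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# Gram method: form G = A^T A and h = A^T b, then solve G x̂ = h
G = A.T @ A
h = A.T @ b
x_gram = np.linalg.solve(G, h)

# Both should give the same least squares solution x̂ = A†b
print(np.allclose(x_qr, x_gram))
```

Both solutions satisfy the normal equations AᵀAx̂ = Aᵀb; the Gram method solves them directly, while the QR method avoids forming G, which is why the QR method is preferred in practice for ill-conditioned A.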