Let R and R' be 2 × 3 row-reduced echelon matrices such that the systems RX = 0 and R'X = 0 have the same solutions. The main idea is to prove that each row of each matrix is a linear combination of the rows of the other matrix. First, suppose that one of the matrices R, R' has a zero row. Then that matrix has rank 1, so the solutions of the corresponding system depend on 3 − 1 = 2 parameters. Hence the other matrix also has rank 1, so it also has a zero row, and the second rows of R, R' coincide (they are both zero). Consider the 2 × 3 matrix A whose first row is the first row of R and whose second row is the first row of R'. Every common solution of RX = 0 and R'X = 0 is annihilated by both rows of A, so the system AX = 0 has the same 2-parameter family of solutions as RX = 0 and R'X = 0. Thus a row-reduced echelon matrix equivalent to A must have a zero row, i.e. A has rank 1 and one of its rows is a scalar multiple of the other. Since both rows of A have leading coefficient 1, the scalar is 1, so the rows of A coincide; hence the first rows of R and R' coincide and R = R' in this case.
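For concreteness, here is a minimal worked instance of the zero-row case; the entries 2 and 3 are chosen purely for illustration, and the leading 1's are placed in the first column (other pivot positions are handled the same way).

$$
R=\begin{pmatrix}1&2&3\\0&0&0\end{pmatrix},\qquad
R'=\begin{pmatrix}1&a&b\\0&0&0\end{pmatrix},\qquad
A=\begin{pmatrix}1&2&3\\1&a&b\end{pmatrix}.
$$

Both rows of A annihilate the same 2-parameter solution space, so A has rank 1; its rows are proportional with equal leading entry 1, so subtracting the first row from the second gives $(0,\,a-2,\,b-3)=(0,0,0)$, i.e. $a=2$, $b=3$ and $R'=R$.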
Now suppose that all rows of R and R' are non-zero. Then each matrix has rank 2, so the solutions of the corresponding systems depend on 3 − 2 = 1 parameter. Form the 4 × 3 matrix B whose first two rows are the rows of R and whose last two rows are the rows of R'. Then B must be row equivalent to a matrix with two zero rows, i.e. B has rank 2 (otherwise the system BX = 0 would not have a 1-parameter family of solutions). Since the two rows of R are linearly independent, they span the row space of B; hence the rows of R' can be represented as linear combinations of the rows of R, and vice versa. This is possible only if the leading coefficients of the first rows of R and R' occur in the same place. Indeed, if, for instance, the first row R1' of R' starts with more zeros than the first row of R, then so does the second row R2' of R' (its leading coefficient lies further to the right), and hence so does every linear combination of R1' and R2'; but then the first row of R could not be such a combination. A similar argument shows that the leading coefficients of the second rows of R and R' occur in the same place. Finally, write a row of R as a linear combination of the rows of R'; comparing the entries in the two columns that contain the leading 1's (each such column has exactly one non-zero entry in both R and R', since the matrices are row-reduced) shows that the coefficients must be 1 and 0, so each row of R coincides with the corresponding row of R', and R = R'.
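To make the last step concrete, here is a sketch written for the illustrative case in which the leading 1's of both matrices fall in columns 1 and 2; the entries c, d, c', d' are arbitrary, and the other pivot patterns are treated in the same way.

$$
R=\begin{pmatrix}1&0&c\\0&1&d\end{pmatrix},\qquad
R'=\begin{pmatrix}1&0&c'\\0&1&d'\end{pmatrix},\qquad
B=\begin{pmatrix}1&0&c\\0&1&d\\1&0&c'\\0&1&d'\end{pmatrix}.
$$

Here B is 4 × 3 of rank 2, so the first row of R' is a combination $a\,(1,0,c)+b\,(0,1,d)$; comparing the first two entries gives $a=1$, $b=0$, hence $c'=c$. The same comparison for the second rows gives $d'=d$, so $R=R'$.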