The homework tag is there to indicate that I am a student, not someone with a working knowledge of math.
I know how to use elimination to solve systems of linear equations. I set up the matrix, perform row operations until I can get the resulting matrix into row echelon form or reduced row echelon form, and then I back substitute and get the values for each of my variables.
Just some random equations: $a + 3b + c = 5 \\ a + 0 + 5c = -2$
My question is, don't you always want to be using an augmented matrix to solve systems of linear equations? By augmenting the matrix, you're performing row operations to both the left and right side of those equations. By not augmenting the matrix, aren't you missing out on performing row operations on the right side of the equations (the $5$ and $-2$)?
The sort of problems I'm talking about are homework/test-level questions (as opposed to real-world problems with messier data and more complex solution methods), where they give you $Ax = b$ and ask you to solve it using matrix elimination.
Here is what I mean mathematically:
$[A] = \begin{bmatrix} 1 & 3 & 1 \\ 1 & 0 & 5 \end{bmatrix} $
$ [A|b] = \left[\begin{array}{ccc|c} 1 & 3 & 1 & 5\\ 1 & 0 & 5 & -2 \end{array}\right] $
So, to properly solve the system of equations above, you want to perform Elementary Row Operations on $[A|b]$, not on $[A]$ alone, right?
The answer is: yes, if you use this method of matrices to solve systems of linear equations, you must use the augmented form $[A|b]$ in order to perform EROs to both $A$ and $b$.
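As a sketch (using NumPy, which the question doesn't mention, so treat this as an assumption): performing the row operation "subtract row 1 from row 2" on the augmented matrix $[A|b]$ above automatically carries the operation over to the $b$ column as well.

```python
import numpy as np

# The augmented matrix [A|b] from the question: the last column is b.
Ab = np.array([[1.0, 3.0, 1.0,  5.0],
               [1.0, 0.0, 5.0, -2.0]])

# Elementary row operation: subtract row 1 from row 2.
# Because b lives in the last column, it is transformed along with A.
Ab[1] = Ab[1] - Ab[0]

print(Ab)
# Row 2 is now [0, -3, 4, -7]: both the A side and the b side changed.
```

Had you row-reduced $[A]$ alone, the $5$ and $-2$ would never have been updated, which is exactly the point of the question.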
2 Answers
Augmenting the matrix is just notational shorthand.
When you do row operations, what you're really doing is multiplying both sides of an equation by some matrix $P_1$:
$$\begin{align*} Ax &= b \\ P_1Ax &= P_1b \tag{$P_1$ represents some row operation} \\ P_2P_1Ax &= P_2 P_1 b \tag{$P_2$ represents some row operation} \\ &\vdots \\ Dx &= P_k \cdots P_2 P_1b \tag{$D$ is now a triangular matrix} \end{align*} $$
Doing row operations in situ obscures this fact, and augmenting the matrix obscures it even further.
Here is an example of solving a system using augmented matrices:
$$\begin{pmatrix} 1 & 1 & 1 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z\end{pmatrix} = \begin{pmatrix} 1 \\ 2 \\ 3\end{pmatrix}$$ becomes $$ \left(\begin{array}{ccc|c} 1 & 1 & 1 & 1 \\ 1 & 0 & 1 & 2 \\ 0 & 1 & 1 & 3 \end{array}\right)$$
Then you do the following operations:
$$ \left(\begin{array}{ccc|c} 1 & 1 & 1 & 1 \\ 0 & -1 & 0 & 1 \\ 0 & 1 & 1 & 3 \end{array}\right) \tag{subtract row 1 from row 2}$$
$$ \left(\begin{array}{ccc|c} 1 & 1 & 1 & 1 \\ 0 & -1 & 0 & 1 \\ 0 & 0 & 1 & 4 \end{array}\right) \tag{add row 2 to row 3}$$
And now you have an upper-triangular matrix that you can backsolve.
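A minimal back-substitution sketch (in NumPy, which is my own assumption rather than part of the answer) for the triangular system just obtained, $Ux = c$, where $U$ is the left block and $c = (1, 1, 4)^T$ is the augmented column:

```python
import numpy as np

# Upper-triangular system U x = c from the final augmented matrix above.
U = np.array([[1.0,  1.0, 1.0],
              [0.0, -1.0, 0.0],
              [0.0,  0.0, 1.0]])
c = np.array([1.0, 1.0, 4.0])

# Back substitution: solve from the last row upward.
x = np.zeros(3)
for i in range(2, -1, -1):
    x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]

print(x)  # [-2. -1.  4.], i.e. x = -2, y = -1, z = 4
```

Substituting back into the original system confirms the solution: $-2 - 1 + 4 = 1$, $-2 + 4 = 2$, and $-1 + 4 = 3$.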
What you're really doing, however, is this:
$$\begin{align*} \begin{pmatrix} 1 & 1 & 1 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z\end{pmatrix} &= \begin{pmatrix} 1 \\ 2 \\ 3\end{pmatrix} \\ \begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 1 & 1 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z\end{pmatrix} &= \begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 1 \\ 2 \\ 3\end{pmatrix} \\ \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 1 & 1 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z\end{pmatrix} &= \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 1 \\ 2 \\ 3\end{pmatrix} \end{align*}$$
Multiplying everything out, you get:
$$\begin{pmatrix} 1 & 1 & 1 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z\end{pmatrix} = \begin{pmatrix} 1 \\ 1 \\ 4\end{pmatrix}$$
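This is easy to verify numerically; here is a quick NumPy check (NumPy being my own choice of tool, not the answer's) that multiplying both $A$ and $b$ by the elementary matrices reproduces the augmented-matrix result:

```python
import numpy as np

A  = np.array([[1, 1, 1],
               [1, 0, 1],
               [0, 1, 1]])
b  = np.array([1, 2, 3])

P1 = np.array([[ 1, 0, 0],   # subtract row 1 from row 2
               [-1, 1, 0],
               [ 0, 0, 1]])
P2 = np.array([[1, 0, 0],    # add row 2 to row 3
               [0, 1, 0],
               [0, 1, 1]])

print(P2 @ P1 @ A)  # [[1 1 1], [0 -1 0], [0 0 1]] -- the triangular matrix
print(P2 @ P1 @ b)  # [1 1 4]  -- the transformed right-hand side
```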
which is exactly what appears in your augmented form.
I certainly wouldn't use an augmented matrix to solve the following:
$x = 3$
$y - x = 6$
When you solve a system of equations correctly, you are indeed performing "elementary row operations". For example, to solve for $y$ above, I would add $x$ to each side of the second equation to get $y = x + 6 = 3 + 6 = 9$.
Note: we do manipulate systems of equations with the same moves as "row operations" (in fact, row operations are modeled precisely on those operations which are legitimate to perform on systems of equations):
$$2x + 2y = 4 \iff x + y = 2\tag{multiply each side by 1/2}$$
$$\begin{gathered} x + y = 1 \\ x - y = 1 \end{gathered} \iff 2x = 2\tag{add equations 1 and 2}$$
$$x+ y + z = 1$$ $$3x + 2y + z = 2$$ $$-x - y + z = 3$$
We can certainly switch "equation 2 and equation 3" to make "adding equation 3 to equation 1" more obvious.
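To make that concrete, adding equation 3 to equation 1 in the system above eliminates $x$ and $y$ at once:

$$(x + y + z) + (-x - y + z) = 1 + 3 \implies 2z = 4 \implies z = 2$$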
I do believe that using an augmented coefficient matrix is very often worthwhile: to "get variables out of the way" temporarily, when dealing with many equations in many unknowns, and even for $3\times 3$ systems when the associated augmented coefficient matrix has mostly non-zero entries. It simply makes the process one can use on the corresponding system of equations more explicit (and easier to tackle).