MTH 215 Chapter 2

MTH 215 — Intro to Linear Algebra

Kyle Monette
Spring 2026

Section 2.1: Matrix Operations

First, some essential terminology for an $m\times n$ matrix $A$: \[ \mat{ccccc} a_{11} & \dots & a_{1j} & \dots & a_{1n} \\ \vdots & & \vdots & & \vdots \\ a_{i 1} & \dots & a_{ij} & \dots & a_{in} \\ \vdots & & \vdots & & \vdots \\ a_{m 1} & \dots & a_{mj} & \dots & a_{mn} \\ \rix .\]

  • $a_{ij}$—the entry in the $i$-th row and $j$-th column of $A$, i.e., position $(i,j)$.
  • Main Diagonal—the north-west to south-east diagonal of the matrix: $a_{11}, a_{22}, \dots$.
  • $\vec{a}_j$—the $j$-th column of $A$ (a vector in $\R^m $).
  • Size—the size of $A$ is $m\times n$.
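For instance, for the $2\times 3$ matrix $A = \mat{rrr} 4 & 0 & 5 \\ -1 & 3 & 2 \rix$ (which reappears below), $a_{21} = -1$, the main diagonal consists of $a_{11} = 4$ and $a_{22} = 3$, the second column is $\vec{a}_2 = \mat{c} 0 \\ 3 \rix \in \R^2$, and the size is $2\times 3$.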

Two very special matrices are: \[ \text{identity matrix } I_n \; = \; \mat{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & 1 \rix_{n\times n} , \qquad \text{zero matrix } 0 \; = \; \mat{cccc} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & 0 \rix_{m\times n} .\]

Let $A$ and $B$ be $m\times n$ matrices (i.e., of the same size).
  • $A$ and $B$ are equal if all corresponding entries are equal:
    $a_{ij} = b_{ij}$ for all $1\le i\le m$, $1\le j\le n$.
  • For any scalar $r$, $rA$ is a scalar multiple of $A$ with each entry multiplied by $r$:
    $(rA)_{ij} = r \cdot a_{ij}$
  • The sum of $A$ and $B$ is the $m\times n$ matrix $C = A + B$ consisting of the sum of corresponding entries in $A$ and $B$:
    $c_{ij} = a_{ij} + b_{ij}$ for all $1\le i\le m$, $1\le j\le n$.
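As a quick numeric check of these definitions (with matrices chosen just for illustration): \[ 2 \mat{rr} 1 & -2 \\ 0 & 3 \rix \eq \mat{rr} 2 & -4 \\ 0 & 6 \rix , \qquad \mat{rr} 1 & -2 \\ 0 & 3 \rix + \mat{rr} 4 & 1 \\ -1 & 2 \rix \eq \mat{rr} 5 & -1 \\ -1 & 5 \rix .\]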
Let $A = \mat{rrr} 4 & 0 & 5 \\ -1 & 3 & 2 \rix$, $B = \mat{rrr} 1 & 1 & 1 \\ 3 & 5 & 7 \rix$, $C = \mat{rr} 2 & -3 \\ 0 & 1 \rix$.
  • $A + B = $
  • $B + A = $
  • $A + C = $
  • $B + C = $
  • $2 B = $
  • $A - 2B = $
Let $A, B, C$ be matrices of the same size, and let $r, s$ be scalars. Then:
  1. $A + B = B + A$
  2. $(A+B)+C = A+(B+C)$
  3. $A + 0 = A$
  4. $r(A+B) = rA + rB$
  5. $(r+s)A = rA + sA$
  6. $r(sA) = (rs)A$
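All of these properties follow by comparing entries and using the corresponding property of real-number arithmetic. For instance, for property 1, \[ (A+B)_{ij} \eq a_{ij} + b_{ij} \eq b_{ij} + a_{ij} \eq (B+A)_{ij} \quad \text{for all } i, j ,\] since addition of real numbers is commutative.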

Now, we turn to multiplication. Let $B$ be an $n\times p$ matrix and $\vec{x} \in \R^p$. Suppose we compute $B \vec{x}$ and multiply this output by $A$ (an $m\times n$ matrix), obtaining $A(B \vec{x}) \in \R^m$.

Is there a single matrix which represents this final result? \begin{align*} B \vec{x} & \eq \\[5em] A (B \vec{x}) & \eq \hspace{25em} \\[5em] & \eq \\[5em] & \eq \\[5em] & \eq \end{align*}

If $A$ is an $m\times n$ matrix and $B$ is an $n\times p$ matrix, then the product $AB$ is the $m\times p$ matrix \[ AB \eq A \mat{cccc} \vec{b}_1 & \vec{b}_2 & \dots & \vec{b}_p \rix \eq \mat{cccc} A \vec{b}_1 & A \vec{b}_2 & \dots & A \vec{b}_p \rix .\]

Note:

  1. Matrix multiplication is not done entry-wise!
  2. The product $AB \, \vec{x}$ represents the composition $\vec{x} \mapsto B \vec{x} \mapsto A(B \vec{x})$.
  3. Each column of $AB$ is a linear combination of the columns of $A$, using weights from that column of $B$.
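Spelled out, note 3 says: if the first column of $B$ has entries $b_{11}, b_{21}, \dots , b_{n 1}$, then \[ A \vec{b}_1 \eq b_{11} \vec{a}_1 + b_{21} \vec{a}_2 + \dots + b_{n 1} \vec{a}_n ,\] and likewise for the remaining columns, with weights taken from the corresponding column of $B$.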
Compute $AB$, where $A = \mat{rr} 4 & -2 \\ 3 & -5 \\ 0 & 1 \rix$ and $B = \mat{rr} 2 & -3 \\ 6 & -7 \rix$. \begin{align*} A \vec{b}_1 & \eq \hspace{8em} & A\vec{b}_2 & \eq \hspace{8em} \\[5em] & \eq & \eq \\[6em] \end{align*} \[ AB \eq \hspace{10em} \]

It is critically important to pay attention to the sizes!

If possible, compute $AB$ and $BA$ where $A = \mat{rr} 2 & 3 \\ 1 & -5 \rix$ and $B = \mat{rrr} 4 & 3 & 6 \\ 1 & -2 & 3 \rix$. \begin{align*} A \vec{b}_1 & \eq \hspace{10em} & A \vec{b}_2 & \eq \hspace{10em} & A \vec{b}_3 & \eq \hspace{10em} \\[5em] & \eq & & \eq & & \eq \end{align*}
Row-Column Rule (an easier way to do hand calculations)
If $A$ is $m\times n$ and $B$ is $n\times p$, then the $(i,j)$ entry of $AB$ is \[ (AB)_{ij} \eq a_{i 1} b_{1j} + a_{i 2} b_{2j} + \dots + a_{in} b_{nj} .\] \[ \mat{cccc} \\ \\ a_{i 1} & a_{i 2} & \dots & a_{i n} \\ \\ \\ \rix \mat{ccccccc} & & & b_{1 j} & & & \\ & & & b_{2 j} & & & \\ & & & \vdots & & & \\ & & & b_{nj} & & & \rix \eq \mat{ccccccc} \\ \\ & & & (AB)_{ij} & & & \\ \\ \\ \rix \]
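For instance, with $A$ and $B$ from the previous example, the $(1,1)$ entry of $AB$ uses row 1 of $A$ and column 1 of $B$: \[ (AB)_{11} \eq a_{11} b_{11} + a_{12} b_{21} \eq 2\cdot 4 + 3 \cdot 1 \eq 11 .\]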

Continuing the example from above, \[ AB \;=\; \mat{rr} 2 & 3 \\ 1 & -5 \rix \mat{rrr} 4 & 3 & 6 \\ 1 & -2 & 3 \rix \;=\; \hspace{30em} .\]

Let $A$ be an $m\times n$ matrix, and let $B$ and $C$ have sizes that are conformable for the described operation.
  1. $A(BC) = (AB)C$ associative law of multiplication
  2. $A(B+C) = AB + AC$ left-distributive law
  3. $(B+C)A = BA + CA$ right-distributive law
  4. $r(AB) = (rA)B = A(rB)$ for any scalar $r$
  5. $I_m A = A = A I_n$ identity for matrix multiplication
Warning!
  1. In general, $AB \neq BA$. It is sometimes possible, though.
  2. If $AB = AC$, then $B$ may or may not equal $C$.
  3. If $AB = 0$, then $A$ and $B$ are not necessarily $0$.
  1. If $A = \mat{rr} 5 & 1 \\ 3 & -2 \rix$ and $B = \mat{cc} 2 & 0 \\ 4 & 3 \rix$, then \begin{align*} AB & \eq \mat{rr} 5 & 1 \\ 3 & -2 \rix \mat{cc} 2 & 0 \\ 4 & 3 \rix \eq \hspace{30em} \\[1em] BA & \eq \mat{cc} 2 & 0 \\ 4 & 3 \rix \mat{rr} 5 & 1 \\ 3 & -2 \rix \eq \end{align*}
  2. If $A = \mat{rr} 2 & -3 \\ -4 & 6 \rix$, $B = \mat{cc} 8 & 4 \\ 5 & 5 \rix$, $C = \mat{rr} 5 & -2 \\ 3 & 1 \rix$, then \[ AB \eq \mat{rr} 1 & -7 \\ -2 & 14 \rix \eq AC, \qquad \text{but } \quad B\neq C .\]
  3. If $A = \mat{rr} 3 & -6 \\ -1 & 2 \rix$ and $B = \mat{rr} 2 & 4 \\ 1 & 2 \rix$, then $AB = 0$ but $A\neq 0$ and $B\neq 0$.
If $A$ is an $n\times n$ matrix (i.e., is square) and $k$ is a positive integer, then we define \[ A^k \eq \underbrace{A\cdot A\cdot \dots \cdot A}_{k \text{ times} } , \qquad \text{where } A^0 = I_n .\]
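For instance, with a small upper triangular matrix, \[ \mat{cc} 1 & 1 \\ 0 & 1 \rix^2 \eq \mat{cc} 1 & 1 \\ 0 & 1 \rix \mat{cc} 1 & 1 \\ 0 & 1 \rix \eq \mat{cc} 1 & 2 \\ 0 & 1 \rix .\]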
Compute $A^3$, where $A = \mat{cc} 1 & 0 \\ 3 & 2 \rix$.
Let $A$ be an $m\times n$ matrix. The transpose of $A$ is the $n\times m$ matrix, denoted $A^T$, whose columns are the rows of $A$.
Compute the transpose of the following matrices. \begin{align*} A & \eq \mat{cc} 1 & 2 \\ 3 & 4 \rix & A^T & \eq \hspace{20em} \\[2em] B & \eq \mat{rr} -5 & 2 \\ 1 & -3 \\ 0 & 4 \rix & B^T & \eq \\[1em] C & \eq \mat{rrrr} 1 & 1 & 1 & 1 \\ -3 & 5 & -2 & 7 \rix & C^T & \eq \\ \vec{x} & \eq \mat{c} 3 \\ 6 \\ 7 \\ 10 \rix & \vec{x}^T & \eq \end{align*}
Let $A$ and $B$ be matrices which are conformable for the given operation.
  1. $(A^T)^T = A$ (transpose of $A^T$ is $A$)
  2. $(A+B)^T = A^T + B^T$
  3. For any scalar $r$, $(rA)^T = r A^T$
  4. $(AB)^T = B^T A^T$ (transpose of a product is product of transposes in reverse order)

Note: Property 4 does not say $(AB)^T = A^T B^T$ (this is only sometimes true).

These properties easily generalize to more matrices. For example, \[ (ABC)^T \eq \hspace{10em} \eq \hspace{10em} \eq \]

Are the following true or false?
  1. If $AB$ and $BA$ are defined, then $A$ and $B$ are square matrices of the same size.
  2. If $\vec{p}$ solves $A \vec{x} = \vec{b}$ and $\vec{u}$ solves $B \vec{x} = \vec{c}$, then $\vec{p}+\vec{u}$ is a solution for $(A+B) \vec{x} = \vec{b} + \vec{c}$.
  3. If $A^2 = I_n$, then $A = -I_n$ or $A = I_n$.
  4. If $A,B$ are $n\times n$ matrices, then $(A^T B + B^T A)^T = A^T B + B^T A$.

Section 2.2: The Inverse of a Matrix

Before discussing inverses of a matrix, recall that the (multiplicative) inverse of any nonzero real number $a$ is $a^{-1} = \frac{1}{a}$, since $a \cdot a^{-1} = 1$. For example, $7 \cdot 7^{-1} = 1 = 7^{-1}\cdot 7$.

For matrices, we would require that both $A \cdot A^{-1}$ and $A ^{-1} \cdot A$ are the multiplicative identity, since multiplication is not commutative! Further, $A$ must be square.

An $n\times n$ matrix $A$ is invertible (i.e., nonsingular, has an inverse) if there exists an $n\times n$ matrix $C$ such that \[ CA \eq AC \eq I_n .\] Here, $C$ is called the inverse of $A$, and denoted $A^{-1}$.
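For a first example of the definition: \[ \mat{cc} 2 & 0 \\ 0 & 3 \rix \mat{cc} 1/2 & 0 \\ 0 & 1/3 \rix \eq \mat{cc} 1 & 0 \\ 0 & 1 \rix \eq \mat{cc} 1/2 & 0 \\ 0 & 1/3 \rix \mat{cc} 2 & 0 \\ 0 & 3 \rix ,\] so $\mat{cc} 2 & 0 \\ 0 & 3 \rix^{-1} = \mat{cc} 1/2 & 0 \\ 0 & 1/3 \rix$.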

Unlike the real numbers, where every nonzero number is invertible, not every matrix has an inverse (even if $A \neq 0$). We'll spend time developing theorems for when $A$ is nonsingular.
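For instance, $A = \mat{cc} 1 & 0 \\ 0 & 0 \rix$ is nonzero yet singular: the second column of $A$ is $\vec{0}$, so the second column of $CA$ is $C \vec{0} = \vec{0} \neq \mat{c} 0 \\ 1 \rix$ for every $2\times 2$ matrix $C$, and hence $CA \neq I_2$.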

If $A$ is invertible, then the inverse of $A$ is unique.
Suppose $B$ and $C$ are two inverses of $A$. By definition, $AB = BA = I$ and $AC = CA = I$. Then \[ B \eq B \cdot I \eq B\cdot (AC) \eq (BA) \cdot C \eq I \cdot C \eq C .\] Therefore, $B = C$ and hence the inverse is unique.

By this proposition, the notation $A^{-1}$ is well-defined.

Some Equivalent Terminology
$A$ is invertible $\iff$ $A$ is nonsingular $\iff$ $A$ has an inverse
$A$ is not invertible $\iff$ $A$ is singular $\iff$ $A$ does not have an inverse

Let $A = \mat{cc} a & b \\ c & d \rix$ be a nonsingular matrix. What is $A^{-1}$?

\begin{align*} A\cdot A^{-1} & \eq \mat{cc} a & b \\ c & d \rix \mat{cc} w & x \\ y & z \rix \eq \mat{cc} a w + by & ax+bz \\ cw+dy & cx+dz \rix \eq \mat{cc} 1 & 0 \\ 0 & 1 \rix \end{align*} This yields four equations: \begin{align} aw + by & \eq 1 \label{2.2.1} \\ ax+bz & \eq 0 \label{2.2.2} \\ cw + dy & \eq 0 \label{2.2.3}\\ cx + dz & \eq 1 \label{2.2.4} \end{align}

Multiply \eqref{2.2.1} by $d$ and \eqref{2.2.3} by $b$, and subtract: \[ daw+dby - (bcw+bdy) = d \quad \implies \quad w(ad-bc) = d \quad\implies \quad w = \frac{d}{ad-bc} .\]

Multiply \eqref{2.2.4} by $b$ and \eqref{2.2.2} by $d$, and subtract: \[ bcx+bdz - (dax+dbz) = b \quad \implies \quad x(bc-ad) = b \quad \implies \quad x = \frac{-b}{ad-bc} .\]

Multiply \eqref{2.2.1} by $c$ and \eqref{2.2.3} by $a$, and subtract: \[ caw + cby - (acw+ady) = c \quad \implies \quad y(cb-ad) = c \quad\implies \quad y = \frac{-c}{ad-bc} .\]

Multiply \eqref{2.2.4} by $a$ and \eqref{2.2.2} by $c$, and subtract: \[ acx + adz - (cax+cbz) = a \quad \implies \quad z(ad-bc) = a \quad \implies \quad z = \frac{a}{ad-bc} .\] Therefore, $A^{-1}$ is $\dots$

If $A = \mat{cc} a & b \\ c & d \rix$ and $ad-bc\neq 0$, then $A$ is invertible and \[ A^{-1} \eq \frac{1}{ad-bc} \mat{rr} d & -b \\ -c & a \rix .\] On the other hand, if $ad-bc=0$ then $A$ is singular.
See the discussion on the last page, or just multiply $A$ and its proclaimed inverse to verify the product is $I_2$.
Warning! This proposition does not generalize to $3\times 3$ or larger matrices!
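As a quick illustration of the formula (with a matrix not from the exercises): if $A = \mat{cc} 1 & 2 \\ 3 & 4 \rix$, then $ad-bc = 1\cdot 4 - 2\cdot 3 = -2 \neq 0$, so $A$ is invertible and \[ A^{-1} \eq \frac{1}{-2} \mat{rr} 4 & -2 \\ -3 & 1 \rix \eq \mat{rr} -2 & 1 \\ 3/2 & -1/2 \rix .\]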
Is $A = \mat{cc} 3 & 4 \\ 5 & 6 \rix$ invertible? If so, find the inverse.

Suppose $A$ is $n\times n$ and invertible, and $\vec{b} \in \R^n $. What is the solution to $A \vec{x} = \vec{b}$? \[ A \vec{x} = \vec{b} \quad \implies \quad \hspace{25em} \]

If $A$ is an $n\times n$ invertible matrix, then for each $\vec{b} \in \R^n $, the equation $A \vec{x} = \vec{b}$ has the unique solution $\vec{x} = A^{-1} \vec{b}$.
For the system $A \vec{x} = \vec{b}$ where $A = \mat{cc} 3 & 4 \\ 5 & 6 \rix$ and $\vec{b} = \mat{c} 2 \\ 1 \rix$, then \[ \vec{x} \eq A^{-1} \vec{b} \eq \hspace{25em} .\]
Suppose $A$ and $B$ are $n\times n$ invertible matrices.
  1. $A^{-1}$ is invertible and $(A^{-1})^{-1} = A$.
  2. $AB$ is invertible and $(AB)^{-1} = B^{-1} A^{-1}$.
  3. $A^T$ is invertible and $(A^T)^{-1} = (A^{-1})^T$.
...

Note that property 2 generalizes to three or more matrices: \[ (ABC)^{-1} \eq C^{-1} B^{-1} A ^{-1} .\]
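To see why property 2 holds in the first place, check that $B^{-1} A^{-1}$ satisfies the definition of the inverse of $AB$: \[ (AB)(B^{-1} A^{-1}) \eq A (B B^{-1}) A^{-1} \eq A I_n A^{-1} \eq A A^{-1} \eq I_n ,\] and similarly $(B^{-1} A^{-1})(AB) = I_n$.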

How do we find the inverse of an $n\times n$ nonsingular matrix $A$? For example, \[ A \eq \mat{rrr} 1 & 2 & 0 \\ 0 & 5 & 0 \\ -1 & -2 & 1 \rix .\] The following row operations will bring $A$ to its $\RREF$: \[ \underbrace{R_3 \leftarrow R_3 + R_1}_{\text{OP } 1} , \qquad \underbrace{R_2 \leftarrow \frac{1}{5}R_2}_{\text{OP } 2} , \qquad \underbrace{R_1 \leftarrow R_1 - 2 R_2}_{\text{OP } 3} .\] In fact, the $\RREF$ of $A$ is simply $I_3$. That is, $A$ is row equivalent to $I_3$. We ought to be able to leverage something out of this fact.

Notice that the action of OP 1 is performed by computing \[ E_1 \cdot A \eq \hspace{10em} \cdot \mat{rrr} 1 & 2 & 0 \\ 0 & 5 & 0 \\ -1 & -2 & 1 \rix \eq \mat{rrr} 1 & 2 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 1 \rix .\] The action of OP 2 is performed by computing \[ E_2 \cdot (E_1 A) \eq \hspace{10em} \mat{rrr} 1 & 2 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 1 \rix \eq \mat{rrr} 1 & 2 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \rix .\] The action of OP 3 is performed by computing \[ E_3 \cdot (E_2 E_1 A) \eq \hspace{10em} \mat{rrr} 1 & 2 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \rix \eq \mat{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \rix .\] That is:

Therefore, the same row operations that transform $A$ to $I$ are used to create $A^{-1}$.

The matrices $E_1, E_2, E_3$ we used have a special name.

An $n\times n$ elementary matrix is one that is obtained from performing a single row operation to $I_n$.

For example, \[ \mat{cccc} 1 & 0 & 0 & 0 \\ 0 & 6 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \rix \qquad \mat{ccc} 0 & 1 & 0\\ 1 & 0 & 0 \\ 0 & 0 & 1 \rix \qquad \mat{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 5 & 0 & 1 \rix \] are elementary matrices associated with $\dots$

  1. If a row operation is applied to $A$, the resulting matrix is $E A$ where $E$ is the corresponding elementary matrix.
  2. Elementary matrices are invertible. Moreover, if $E$ is elementary, then the inverse of $E$ is the elementary matrix corresponding to the single row operation that transforms $E$ back to $I$.

For example: \[ \mat{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 5 & 0 & 1 \rix \quad \text{has inverse} \quad \hspace{10em} \]

\[ \mat{ccc} 0 & 1 & 0\\ 1 & 0 & 0 \\ 0 & 0 & 1 \rix \quad \text{has inverse} \quad \hspace{9em} \]
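For the remaining elementary matrix from the earlier list, scaling row 2 by $6$ is undone by scaling row 2 by $\frac{1}{6}$: \[ \mat{cccc} 1 & 0 & 0 & 0 \\ 0 & 6 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \rix \quad \text{has inverse} \quad \mat{cccc} 1 & 0 & 0 & 0 \\ 0 & 1/6 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \rix .\]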

An $n\times n$ matrix $A$ is invertible if and only if $A$ is row equivalent to $I_n$.
Furthermore, if $A$ is invertible, then any sequence of row operations that reduces $A$ to $I_n$ also transforms $I_n$ into $A^{-1}$.

We saw this result in action before: \[ A = \mat{rrr} 1 & 2 & 0 \\ 0 & 5 & 0 \\ -1 & -2 & 1 \rix \qquad\quad \begin{aligned} R_3 & \leftarrow R_3 + R_1 \\ R_2 & \leftarrow \frac{1}{5}R_2 \\ R_1 & \leftarrow R_1 - 2 R_2 \end{aligned} \qquad\quad A^{-1} = \mat{rrr} 1 & -2 /5 & 0 \\ 0 & 1 /5 & 0 \\ 1 & 0 & 1 \rix .\] To find $A^{-1}$ (if it exists), we can generate row operations that reduce $A$ to $I_n$. By the theorem, if we keep applying these operations to $I_n$, we will obtain $A^{-1}$.
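As a sanity check, multiplying confirms this: \[ A A^{-1} \eq \mat{rrr} 1 & 2 & 0 \\ 0 & 5 & 0 \\ -1 & -2 & 1 \rix \mat{rrr} 1 & -2/5 & 0 \\ 0 & 1/5 & 0 \\ 1 & 0 & 1 \rix \eq \mat{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \rix \eq I_3 .\]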

Algorithm to Compute $A^{-1}$
  • Place $A$ and $I_n$ side-by-side in an augmented matrix $\mat{c|c} A & I_n \rix$.
  • Perform row operations on this entire matrix, with the goal of reducing $A$ (i.e., the left half) to $I_n$.
  • Afterward, one obtains $\mat{c|c} I_n & A^{-1} \rix$. That is, $A^{-1}$ is the right half of this matrix.
If row reducing $A$ to $I_n$ is not possible, then $A$ is not invertible.
Find the inverse of $A = \mat{rrr} 0 & 1 & 2 \\ 1 & 0 & 3 \\ 4 & -3 & 8 \rix$, if it exists. \begin{align*} \mat{c|c} A & I \rix \eq \mat{rrr|ccc} 0 & 1 & 2 & 1 & 0 & 0\\ 1 & 0 & 3 & 0 & 1 & 0 \\ 4 & -3 & 8 & 0 & 0 & 1 \rix & \qquad R_2 \leftrightarrow R_1 & \quad & \mat{rrr|ccc} 1 & 0 & 3 & 0 & 1 & 0 \\ 0 & 1 & 2 & 1 & 0 & 0 \\ 4 & -3 & 8 & 0 & 0 & 1 \rix \\ & \qquad R_3 \leftarrow R_3 - 4 R_1 & \quad & \mat{rrr|rrr} 1 & 0 & 3 & 0 & 1 & 0 \\ 0 & 1 & 2 & 1 & 0 & 0 \\ 0 & -3 & -4 & 0 & -4 & 1 \rix \\ & \qquad R_3 \leftarrow R_3 + 3 R_2 & \quad & \mat{rrr|rrr} 1 & 0 & 3 & 0 & 1 & 0 \\ 0 & 1 & 2 & 1 & 0 & 0 \\ 0 & 0 & 2 & 3 & -4 & 1 \rix \end{align*}
Continue on your own!
Suppose $P$ is an invertible $n\times n$ matrix and $A$ and $B$ are $n\times n$ matrices such that $A = P B P^{-1}$.
  1. Solve for $B$ in terms of $A$ and $P$.

    \begin{align*} A & \eq P B P^{-1} \\[1em] & \eq \\[1em] & \eq \\[1em] & \eq \\[1em] & \eq \\[1em] & \eq B \end{align*}

  2. Find an expression for $A^{2026}$ in terms of $B$ and $P$.

    \begin{align*} A^2 & \eq \hspace{8em} & \eq \hspace{6em} \\[2em] A^3 & \eq \hspace{8em} & \eq \hspace{6em} \\[2em] A^4 & \eq \hspace{8em} & \eq \hspace{6em} \\[2em] & \quad \;\, \vdots \\ A^{2026} & \eq \end{align*}

Section 2.3: Characterizations of Invertible Matrices

In this short section we give a summary of equivalent conditions to a square matrix $A$ being invertible or not.

The Invertible Matrix Theorem (IMT): Let $A$ be an $n\times n$ matrix. Then the following are equivalent (i.e., either all statements are true or all are false).
  1. $A$ is an invertible matrix.
  2. $A$ is row equivalent to the $n\times n$ identity matrix $I_n$.
  3. $A$ has $n$ pivot positions.
  4. The equation $A \vec{x} = \vec{0}$ has only the trivial solution.
  5. The columns of $A$ form a linearly independent set.
  6. The equation $A \vec{x} = \vec{b}$ has a unique solution for all $\vec{b} \in \R^n $.
  7. The columns of $A$ span $\R^n $.
  8. There exists an $n\times n$ matrix $C$ such that $CA = I_n$.
  9. There exists an $n\times n$ matrix $D$ such that $AD = I_n$.
  10. $A^T$ is an invertible matrix.

While powerful, the IMT applies only to square matrices! In particular, do not apply statements (8) and (9) when $A$ is not square.

  1. An echelon form of $A$ is $\mat{rrr} 1 & 0 & -2 \\ 0 & 1 & 4 \\ 0 & 0 & 3 \rix$. Is $A$ invertible?
  2. Is $B = \mat{ccc} 2 & 3 & 4 \\ 2 & 3 & 4 \\ 2 & 3 & 4 \rix$ invertible?
  3. Suppose $C$ is a $5\times 5$ matrix, and there exists $\vec{v} \in \R ^5$ which is not a linear combination of the columns of $C$. How many solutions are there to $C \vec{x} = \vec{0}$?
  4. Let $A,B$ be $n\times n$ matrices such that $AB \vec{x} = \vec{0}$ has a nontrivial solution. Is $AB$ invertible?
  5. Let $A$ be a $3\times 3$ matrix whose columns do not span $\R^3$. Can $A$ be invertible?
  6. Suppose $A$ is $n\times n$ and invertible. Show that the columns of $A^{-1}$ are linearly independent.
Suppose $A$ and $B$ are $n\times n$ matrices, and the columns of $B$ are linearly dependent. Show that the columns of $AB$ are linearly dependent. Represent $A$ and $B$ as \[ A = \mat{ccc} \vec{a}_1 & \dots & \vec{a}_n \rix, \qquad B = \mat{ccc} \vec{b}_1 & \dots & \vec{b}_n \rix .\] Because $B$ has linearly dependent columns, there exist weights $c_1, \dots , c_n$ (not all of which are zero) such that \[ \hspace{10em} \eq \] Multiply by $A$ to obtain \[ \hspace{10em} \eq \] Now, notice that \[ AB \eq A \mat{ccc} \vec{b}_1 & \dots & \vec{b}_n \rix \eq \hspace{10em} \]
Let $A$ and $B$ be $n\times n$ matrices. If $AB$ is invertible, show that $A$ is invertible. Because $AB$ is invertible, there exists an $n\times n$ matrix $W$ such that \[ \hspace{10em} \] By the IMT statement (9): \[ \hspace{10em} \]