The applications:
What were the 10 applications?
What does \({\bf Ax}={\bf b}\) mean?
- First, it's a linear combination of columns of \({\bf A}\)
using elements of \({\bf x}\) as the coefficients.
$$
\mat{
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nn}
}
\mat{
x_1\\
x_2\\
\vdots\\
x_n
}
\eql
\mat{
\sum_{k=1}^n a_{1k} x_k\\
\sum_{k=1}^n a_{2k} x_k\\
\vdots\\
\sum_{k=1}^n a_{nk} x_k
}
\eql
{\bf b}
$$
- Second, it's a set of linear equations:
For example, the equations
$$\eqb{
2x_1 & \plss & x_2 & \plss & 4x_3 & \miss & x_4 & \eql 4\\
x_1 & & & \miss & 2x_3 & \plss & 4x_4 & \eql 1\\
3x_1 & \plss & 2x_2 & \plss & 10x_3 & \miss & 6x_4 & \eql 7\\
}$$
become the matrix-vector equation:
$$
\mat{
2 & 1 & 4 & -1 \\
1 & 0 & -2 & 4 \\
3 & 2 & 10 & -6\\
}
\mat{x_1 \\ x_2 \\ x_3\\ x_4}
\eql
\vecthree{4}{1}{7}
$$
- Third, it's a question about the column space:
\(\rhd\)
Is \({\bf b} \in \mbox{colspace}({\bf A})\)?
- Fourth, it's a transformation.
\(\rhd\)
The matrix "acts" on the vector \({\bf x}\) to produce the
vector \({\bf b}\).
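A small numpy sketch that makes these readings concrete (it reuses the matrix and right-hand side from the example above; the particular solution \({\bf x} = (1,2,0,0)\) is one of many, picked by hand for illustration):

  import numpy as np

  # The example system above: 3 equations, 4 unknowns.
  A = np.array([[2, 1,  4, -1],
                [1, 0, -2,  4],
                [3, 2, 10, -6]], dtype=float)
  b = np.array([4, 1, 7], dtype=float)

  # A hand-picked solution: set x3 = x4 = 0, which forces x1 = 1, x2 = 2.
  x = np.array([1, 2, 0, 0], dtype=float)

  # First reading: b as a linear combination of the columns of A.
  combo = sum(x[k] * A[:, k] for k in range(A.shape[1]))

  # Second reading: each b_i is the row sum  a_i1 x_1 + ... + a_in x_n.
  rowwise = np.array([A[i, :] @ x for i in range(A.shape[0])])

  print(np.allclose(combo, b), np.allclose(rowwise, b))    # True True

  # Third reading: some x solves Ax = b, so b is in colspace(A).
  # Fourth reading: A @ x is the matrix "acting" on x to produce b.
  print(A @ x)                                             # [4. 1. 7.]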
RREFs:
- Row operations reduce a matrix \({\bf A}\) to \(RREF({\bf A})\):
- The RREF tells us whether:
- There is a contradiction
\(\rhd\)
No solution
- \(RREF({\bf A}) = {\bf I}\)
\(\rhd\)
Unique solution
- There are free variables
\(\rhd\)
Multiple solutions (one for each setting of the free variables).
- The RREF also plays a key role in the proofs of many
results.
- The RREF is unique
\(\rhd\)
Every matrix has only one RREF.
- We have an algorithm that produces the RREF of any matrix.
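As a quick sanity check of these facts, a sympy sketch (sympy's Matrix.rref returns the RREF together with the indices of the pivot columns), applied to the example matrix from earlier:

  from sympy import Matrix

  A = Matrix([[2, 1,  4, -1],
              [1, 0, -2,  4],
              [3, 2, 10, -6]])

  R, pivots = A.rref()   # R = RREF(A), pivots = pivot column indices
  print(R)               # the last row reduces to all zeros
  print(pivots)          # (0, 1): pivots in the first two columns only
  # No contradictory row, and free variables remain => multiple solutions.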
Important definitions:
- Norm: \(|{\bf v}| = \sqrt{{\bf v}\cdot {\bf v}}\)
- Orthogonal vectors: \({\bf v} \cdot {\bf u} = 0\)
- Orthonormal vectors: \({\bf v} \cdot {\bf u} = 0\)
and \(|{\bf v}| = |{\bf u}| = 1\)
- Orthogonal matrix: the columns are mutually orthonormal vectors.
- Span:
$$
\mbox{span}({\bf v}_1, {\bf v}_2, \ldots, {\bf v}_n)
\eql
\{{\bf z}: \exists \alpha_1,\ldots,\alpha_n \mbox{ such that }
{\bf z}= \alpha_1{\bf v}_1 + \alpha_2 {\bf v}_2 +
\ldots + \alpha_n{\bf v}_n\}
$$
- Linear independence.
Vectors \({\bf v}_1, {\bf v}_2, \ldots, {\bf v}_n\)
are linearly independent if the only
solution to the equation
$$
\alpha_1 {\bf v}_1 + \alpha_2 {\bf v}_2 + \ldots + \alpha_n {\bf v}_n
\eql {\bf 0}
$$
is \(\alpha_1 = \alpha_2 = \ldots = \alpha_n = 0\).
- Rank. The number of vectors in a largest linearly independent subset
of a collection of vectors.
- Basis (of a subspace):
any minimal collection of vectors that spans the subspace.
- Space or subspace. A set of vectors
closed under linear combinations.
- Dimension. The minimum number of vectors needed to generate
the subspace via linear combinations.
- Rowspace (span of rows), colspace (span of columns)
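A few of these definitions in a short numpy sketch (the vectors are made up purely for illustration):

  import numpy as np

  u = np.array([3.0, 4.0])
  v = np.array([-4.0, 3.0])

  # Norm: |u| = sqrt(u . u)
  print(np.sqrt(u @ u), np.linalg.norm(u))        # 5.0  5.0

  # Orthogonal: u . v = 0.  Dividing by the norms makes them orthonormal.
  print(u @ v)                                    # 0.0
  q1, q2 = u / np.linalg.norm(u), v / np.linalg.norm(v)
  print(np.linalg.norm(q1), np.linalg.norm(q2))   # 1.0  1.0

  # Rank: size of a largest independent subset of a collection of vectors.
  M = np.column_stack([u, v, u + v])              # third vector is dependent
  print(np.linalg.matrix_rank(M))                 # 2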
Useful small results:
- Matrix-matrix multiplication properties:
- Associative, but not commutative.
- Distributes over addition.
- Angle and dot product:
$$
{\bf v} \cdot {\bf u}
\eql
|{\bf v}| |{\bf u}| \cos(\theta)
$$
Or equivalently,
$$
\cos(\theta)
\eql
\frac{{\bf v} \cdot {\bf u}}{
|{\bf v}| |{\bf u}|}
$$
- Orthogonal matrix and its inverse:
\({\bf Q}^{-1} = {\bf Q}^T\) if \({\bf Q}\) is orthogonal.
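Both facts are easy to check numerically; a sketch using a 2×2 rotation matrix (rotation matrices are orthogonal):

  import numpy as np

  theta = 0.3
  Q = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])

  # Orthogonal matrix: Q^T Q = I, so the inverse is the transpose.
  print(np.allclose(Q.T @ Q, np.eye(2)))          # True
  print(np.allclose(np.linalg.inv(Q), Q.T))       # True

  # Angle from the dot product: cos(theta) = (u . v) / (|u| |v|).
  u, v = np.array([1.0, 0.0]), np.array([1.0, 1.0])
  cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
  print(np.degrees(np.arccos(cos_t)))             # 45.0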
Important results:
- Proposition 5.1:
The RREF algorithm produces a matrix that satisfies
the RREF definition.
- Proposition 5.2:
When the RREF completes with a pivot in every row,
the same row operations applied to \({\bf I}\)
produce the inverse.
- Proposition 5.3: \({\bf x}={\bf A}^{-1}{\bf b}\) is the unique
solution when the inverse exists.
- Proposition 5.4: The inverse is unique.
- Proposition 5.5:
Consider the equation \({\bf A}^\prime {\bf x} = {\bf 0}\)
where \({\bf A}^\prime\) is an RREF matrix. Then:
- if \({\bf A}^\prime\) is full-rank, \({\bf x}={\bf 0}\) is the only
solution.
- if \({\bf A}^\prime\) is not full-rank, there is at least
one non-zero solution \({\bf x} \neq {\bf 0}\).
- Proposition 5.6:
When the RREF fails to be full-rank
because of zero rows, the inverse does not exist.
- Proposition 5.7:
The inverse \({\bf A}^{-1}\) exists if and only if \({\bf Ax}={\bf 0}\)
has the unique solution \({\bf x}={\bf 0}\).
- Proposition 5.8:
If \({\bf A}\) does have an inverse, the row-reduction procedure
(from Proposition 5.2) always finds it.
- Proposition 5.9:
If a unique solution to \({\bf Ax}={\bf b}\) exists, so does the
inverse \({\bf A}^{-1}\).
- Proposition 6.1: Treating each row of any RREF as
a vector, the pivot rows are independent. All other rows
are dependent on the pivot rows.
- Proposition 6.2: Treating each column of an RREF as
a vector, the pivot columns are independent. All other columns
are dependent on the pivot columns.
- Proposition 6.3:
If columns \(i_1,i_2, \ldots, i_k\) are the pivot
columns of \(RREF({\bf A})\),
then the corresponding columns \({\bf v}_{i_1}, {\bf v}_{i_2}, \ldots, {\bf v}_{i_k}\)
of \({\bf A}\) are linearly independent.
- Proposition 6.4: The rank of the collection
\({\bf v}_1, {\bf v}_2, \ldots, {\bf v}_n\)
is the number of pivot columns of \(RREF({\bf A})\).
- Proposition 6.5: If
\(\mbox{span}({\bf v}_1, {\bf v}_2, \ldots, {\bf v}_n)\) has
dimension \(n\), the vectors \({\bf v}_1, {\bf v}_2, \ldots, {\bf v}_n\)
are linearly independent.
- Theorem 6.6:
\(\mbox{dim}(\mbox{rowspace}({\bf A}))
= \mbox{dim}(\mbox{colspace}({\bf A}))
\)
- Proposition 6.7: The vectors in a basis are linearly
independent.
- Proposition 6.8: If \({\bf v}_1, {\bf v}_2, \ldots, {\bf
v}_n\) are orthogonal, they are linearly independent.
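As an illustration of Propositions 6.3 and 6.4, a sketch reusing the 3×4 example matrix from earlier (so the columns of \({\bf A}\) play the role of the vectors \({\bf v}_1, \ldots, {\bf v}_n\)):

  import numpy as np
  from sympy import Matrix

  rows = [[2, 1,  4, -1],
          [1, 0, -2,  4],
          [3, 2, 10, -6]]
  A = np.array(rows, dtype=float)

  _, pivots = Matrix(rows).rref()
  print(pivots)                                     # (0, 1)

  # Proposition 6.3: the original columns at the pivot positions are independent.
  print(np.linalg.matrix_rank(A[:, list(pivots)]))  # 2 = number of pivot columns

  # Proposition 6.4: the rank of the whole collection equals that count.
  print(np.linalg.matrix_rank(A))                   # 2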
Complex vectors: in what ways are they different?
- Elements are complex numbers:
$$
{\bf u} \eql (u_1, u_2,\ldots, u_n) \eql
(a_1+ib_1, a_2+ib_2, \ldots, a_n+ib_n)
$$
- The scalars are complex.
$$\eqb{
\alpha {\bf u}
& \eql &
\alpha (u_1,u_2,\ldots,u_n) \\
& \eql &
(\alpha u_1, \alpha u_2, \ldots, \alpha u_n)
}$$
- The dot product uses conjugates:
$$
{\bf u} \cdot {\bf v} \eql
u_1 \overline{v_1} + u_2 \overline{v_2} + \ldots +
u_n \overline{v_n}
$$
Also called the Hermitian dot product.
- Length is defined via the dot product:
$$
|{\bf u}|^2 \eql {\bf u} \cdot {\bf u}
$$
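A short numpy sketch of the Hermitian dot product (the vectors are made up; the conjugate is placed on the second vector to match the convention above, whereas numpy's built-in np.vdot conjugates the first argument):

  import numpy as np

  u = np.array([1 + 2j, 3 - 1j])
  v = np.array([2 - 1j, 1 + 1j])

  # Hermitian dot product with the convention above: conjugate v.
  print(np.sum(u * np.conj(v)))             # (2+1j)

  # Length via the dot product: |u|^2 = u . u is real and non-negative.
  print(np.sum(u * np.conj(u)).real)        # 15.0
  print(np.linalg.norm(u) ** 2)             # 15.0, the same value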
Other highlights:
- Transformation by an orthogonal matrix does not change a vector's length
(only its direction).
- Some vectors are only stretched by a matrix (their direction does not change).
- Repeated multiplication by the same matrix sometimes results
in convergence to a single vector.
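A tiny numpy sketch of the last two points (the matrix is made up for illustration): the direction (1, 1) is only stretched by this matrix, and repeated multiplication from a generic starting vector (renormalizing at each step to keep the numbers bounded) settles onto that same direction.

  import numpy as np

  A = np.array([[2.0, 1.0],
                [1.0, 2.0]])

  # The vector (1, 1) is only stretched: A (1,1) = (3,3), same direction.
  v = np.array([1.0, 1.0])
  print(A @ v)                          # [3. 3.]

  # Repeated multiplication converges (in direction) to that vector.
  x = np.array([1.0, 0.0])
  for _ in range(50):
      x = A @ x
      x = x / np.linalg.norm(x)         # renormalize each step
  print(x)                              # approximately [0.7071, 0.7071]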