
Summary of Linear Algebra Lecture Notes

These are summary notes taken while self-studying 18.06 Linear Algebra, the course taught by Professor W. Gilbert Strang at MIT.

Lecture 1

$n$ Linear equations, $n$ unknowns
Row Picture
Column Picture
matrix form

Row Picture

$2x - y = 0$
$-x + 2y = 3$

$ \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} =\begin{bmatrix} 0 \\ 3 \end{bmatrix} \Longrightarrow \boldsymbol A \boldsymbol x = \boldsymbol b\\ $

Matrix $\boldsymbol A$
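
As a quick numerical check of the matrix form above (a NumPy sketch of my own, not part of the lecture):

```python
import numpy as np

# The 2x2 system in matrix form: A x = b
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
b = np.array([0.0, 3.0])

x = np.linalg.solve(A, b)   # intersection point of the two lines (row picture)
print(x)                    # [1. 2.], i.e. x = 1, y = 2
```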

Column Picture

linear combination of columns

$ x\begin{bmatrix} 2 \\ -1 \end{bmatrix} + y\begin{bmatrix} -1 \\ 2 \end{bmatrix} =\begin{bmatrix} 0 \\ 3 \end{bmatrix}\\ $ $ 1\begin{bmatrix} 2 \\ -1 \end{bmatrix} + 2\begin{bmatrix} -1 \\ 2 \end{bmatrix} =\begin{bmatrix} 0 \\ 3 \end{bmatrix}\\ $

$\boldsymbol A$ times $\boldsymbol x$ is a combination of the columns of $\boldsymbol A$

$ x\begin{bmatrix} 2 \\ -1 \\ 0 \end{bmatrix} + y\begin{bmatrix} -1 \\ 2 \\ -3 \end{bmatrix} + z\begin{bmatrix} 0 \\ -1 \\ 4 \end{bmatrix} =\begin{bmatrix} 0 \\ -1 \\ 4 \end{bmatrix}\\ $

$x = 0, y = 0, z = 1$
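
A numerical check of this three-dimensional column picture (again a NumPy sketch of my own): $\boldsymbol A \boldsymbol x$ is exactly the combination of the columns with coefficients $x, y, z$.

```python
import numpy as np

A = np.array([[ 2, -1,  0],
              [-1,  2, -1],
              [ 0, -3,  4]], dtype=float)
b = np.array([0.0, -1.0, 4.0])

x = np.array([0.0, 0.0, 1.0])                        # x = 0, y = 0, z = 1
combo = x[0]*A[:, 0] + x[1]*A[:, 1] + x[2]*A[:, 2]   # combination of the columns

print(np.allclose(A @ x, combo))   # True: A x is a combination of the columns
print(np.allclose(A @ x, b))       # True: this combination equals b
```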

Can I solve $\boldsymbol A \boldsymbol x = \boldsymbol b$ for every right-hand side $\boldsymbol b$?
Do the linear combinations of the columns fill three dimensional space?

singular case: the matrix would not be invertible
non-singular case: invertible and beautiful

Lecture 2

Elimination Success or Failure
Back-Substitution
Elimination matrices
Matrix multiplication
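
A minimal sketch of elimination followed by back-substitution, assuming the "success" case where every pivot is nonzero (so no row exchanges are needed); the function name is my own:

```python
import numpy as np

def solve_by_elimination(A, b):
    """Forward elimination to an upper-triangular system, then back-substitution.
    Assumes every pivot is nonzero (no row exchanges needed)."""
    A = A.astype(float)
    b = b.astype(float)
    n = len(b)

    # Forward elimination: subtract multiples of the pivot row from rows below.
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]        # the multiplier l_ik
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]

    # Back-substitution on the triangular system U x = c.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

A = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -3.0, 4.0]])
b = np.array([0.0, -1.0, 4.0])
print(solve_by_elimination(A, b))   # [0. 0. 1.]
```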

Lecture 3

Matrix multiplication
Inverse of $\boldsymbol A$, $\boldsymbol A \boldsymbol B$, $\boldsymbol A^\mathrm{T}$
Gauss-Jordan / find $\boldsymbol A^{-1}$
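
A sketch of the Gauss-Jordan idea: row-reduce the augmented matrix $[\,\boldsymbol A \mid \boldsymbol I\,]$ until the left half becomes $\boldsymbol I$; the right half is then $\boldsymbol A^{-1}$ (assumes $\boldsymbol A$ is invertible with nonzero pivots; the helper name is my own):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Reduce [A | I] to [I | A^-1]. Assumes A is invertible with nonzero pivots."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # the augmented matrix [A | I]

    for k in range(n):
        M[k] /= M[k, k]                 # scale the pivot row so the pivot is 1
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]  # eliminate above AND below the pivot
    return M[:, n:]                     # the right half is now A^-1

A = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(gauss_jordan_inverse(A))                               # [[2/3, 1/3], [1/3, 2/3]]
print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))   # True
```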

Lecture 4

Inverse of $\boldsymbol A \boldsymbol B$, $\boldsymbol A^\mathrm{T}$ (What’s the inverse of a product?)
Product of elimination matrices
$\boldsymbol A = \boldsymbol L \boldsymbol U$ (no row exchanges)
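
A sketch of $\boldsymbol A = \boldsymbol L \boldsymbol U$ built by recording the elimination multipliers in $\boldsymbol L$ (no row exchanges assumed), plus a numerical check that $(\boldsymbol A \boldsymbol B)^{-1} = \boldsymbol B^{-1} \boldsymbol A^{-1}$; the example matrices are my own:

```python
import numpy as np

def lu_no_exchanges(A):
    """A = L U with L unit lower-triangular holding the elimination multipliers.
    Assumes no row exchanges are needed (all pivots nonzero)."""
    n = A.shape[0]
    U = A.astype(float)
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # multiplier l_ik
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -3.0, 4.0]])
L, U = lu_no_exchanges(A)
print(np.allclose(L @ U, A))    # True: A = L U

# Inverse of a product: (A B)^-1 = B^-1 A^-1
B = np.array([[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [0.0, 0.0, 1.0]])
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))    # True
```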

Lecture 5

$\boldsymbol P \boldsymbol A = \boldsymbol L \boldsymbol U$
Permutation / Transposes
Vector spaces and subspaces
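
Two quick checks of the facts listed above: a permutation matrix satisfies $\boldsymbol P^{-1} = \boldsymbol P^\mathrm{T}$, and SciPy's `lu` gives a $\boldsymbol P \boldsymbol A = \boldsymbol L \boldsymbol U$ factorization (SciPy's convention is $\boldsymbol A = \boldsymbol P \boldsymbol L \boldsymbol U$, so the permutation used here is its transpose). This is my own sketch, not from the lecture:

```python
import numpy as np
from scipy.linalg import lu

# A permutation matrix's inverse is its transpose: P^-1 = P^T.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
print(np.allclose(P.T @ P, np.eye(3)))       # True

# A matrix that needs a row exchange before elimination can start.
A = np.array([[0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0],
              [2.0, 0.0, 3.0]])
p, L, U = lu(A)                              # SciPy convention: A = p L U
print(np.allclose(p.T @ A, L @ U))           # True, i.e. P A = L U with P = p^T
```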

Lecture 6

Vector spaces and Subspaces
Column Space of $\boldsymbol A$ : solving $\boldsymbol A \boldsymbol x = \boldsymbol b$
Nullspace of $\boldsymbol A$
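
$\boldsymbol A \boldsymbol x = \boldsymbol b$ is solvable exactly when $\boldsymbol b$ lies in the column space $\text{C}(\boldsymbol A)$. One numerical way to check this (a sketch with a matrix of my own choosing): $\boldsymbol b$ is in $\text{C}(\boldsymbol A)$ iff appending $\boldsymbol b$ as an extra column does not increase the rank.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # rank 1: its column space is a line in R^3

b_in  = np.array([1.0, 2.0, 3.0])   # a multiple of the first column
b_out = np.array([1.0, 0.0, 0.0])   # not on that line

def solvable(A, b):
    # b is in C(A) iff [A | b] has the same rank as A
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(solvable(A, b_in))    # True
print(solvable(A, b_out))   # False
```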

Lecture 7

Computing the nullspace ( $\boldsymbol A \boldsymbol x = \boldsymbol 0$ )
Pivot variables - free variables
Special Solutions - $\text{rref }(\boldsymbol A) = \boldsymbol R$
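
SymPy can carry these steps out exactly: `rref` returns $\boldsymbol R$ together with the pivot columns (the remaining columns correspond to free variables), and `nullspace` returns the special solutions. The example matrix is my own choice:

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])

R, pivot_cols = A.rref()      # rref(A) = R and the pivot-column indices
print(pivot_cols)             # (0, 2): columns 1 and 3 are the pivot columns
print(R)                      # the reduced row echelon form

for s in A.nullspace():       # one special solution per free variable
    print(s.T, A * s)         # A s = 0 for each special solution
```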

Lecture 8

Complete solution of $\boldsymbol A \boldsymbol x = \boldsymbol b$
Rank $r \ \ \ \ \boldsymbol x = \boldsymbol x_p + \boldsymbol x_n$
$r = m :$ a solution exists for every $\boldsymbol b$
$r = n :$ the solution, if it exists, is unique
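
A sketch of the $\boldsymbol x = \boldsymbol x_p + \boldsymbol x_n$ structure on a small rectangular system; the particular and special solutions below were worked out by hand for this example and are only verified here:

```python
import numpy as np

A = np.array([[1.0, 2.0, 2.0, 2.0],
              [2.0, 4.0, 6.0, 8.0],
              [3.0, 6.0, 8.0, 10.0]])
b = np.array([1.0, 5.0, 6.0])

x_p = np.array([-2.0, 0.0, 1.5, 0.0])   # particular solution (free variables set to 0)
s1  = np.array([-2.0, 1.0, 0.0, 0.0])   # special solutions spanning N(A)
s2  = np.array([ 2.0, 0.0, -2.0, 1.0])

print(np.allclose(A @ x_p, b))          # True: x_p solves A x = b
x = x_p + 3.0 * s1 - 5.0 * s2           # x_p plus anything from the nullspace
print(np.allclose(A @ x, b))            # True: still a solution
```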

Lecture 9

Linear independence
Spanning a space
BASIS and dimension

Lecture 10

Four Fundamental Subspaces
(for matrix $\boldsymbol A$)
column space : $\text{C}(\boldsymbol A)$ in $\mathbb{R}^{m}$
nullspace : $\text{N}(\boldsymbol A)$ in $\mathbb{R}^{n}$
row space : all combinations of the rows = all combinations of the columns of $\boldsymbol A^\mathrm{T}$ $= \text{C}(\boldsymbol A^\mathrm{T})$ in $\mathbb{R}^{n}$
left nullspace : the nullspace of $\boldsymbol A^\mathrm{T}$ $= \text{N}(\boldsymbol A^\mathrm{T})$ in $\mathbb{R}^{m}$
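
Checking the four dimensions $r$, $n-r$, $r$, $m-r$ with SymPy (the example matrix is my own; the left nullspace is computed as the nullspace of $\boldsymbol A^\mathrm{T}$):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])               # m = 2, n = 3, rank r = 1

r, m, n = A.rank(), A.rows, A.cols

print(len(A.columnspace()), r)        # dim C(A)   = r     = 1
print(len(A.nullspace()), n - r)      # dim N(A)   = n - r = 2
print(len(A.T.columnspace()), r)      # dim C(A^T) = r     = 1  (row space)
print(len(A.T.nullspace()), m - r)    # dim N(A^T) = m - r = 1  (left nullspace)
```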

Lecture 11

Bases of new vector spaces
Rank one matrices
Small world graphs
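
A rank-one matrix is a column times a row, $\boldsymbol A = \boldsymbol u \boldsymbol v^\mathrm{T}$; a tiny illustration with numbers of my own:

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])     # a column
v = np.array([[4.0, 5.0]])              # a row
A = u @ v                               # 3x2 matrix, A = u v^T

print(np.linalg.matrix_rank(A))         # 1: every column is a multiple of u
```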

Lecture 12

Graphs & Networks
Incidence Matrices
Kirchhoff’s Laws
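
A sketch of the incidence matrix of a small directed graph (one row per edge, $-1$ at its start node, $+1$ at its end node; the graph here is my own choice). The all-ones vector of node potentials lies in its nullspace, since only potential differences appear:

```python
import numpy as np

# Incidence matrix of a directed graph with 4 nodes and 5 edges.
A = np.array([[-1,  1,  0,  0],    # edge 1 -> 2
              [ 0, -1,  1,  0],    # edge 2 -> 3
              [-1,  0,  1,  0],    # edge 1 -> 3
              [-1,  0,  0,  1],    # edge 1 -> 4
              [ 0,  0, -1,  1]])   # edge 3 -> 4

x = np.ones(4)                     # equal potentials at every node
print(A @ x)                       # all zeros: potential differences vanish
print(np.linalg.matrix_rank(A))    # 3 = (number of nodes) - 1
```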

Lecture 13

Review for Exam 1
Emphasizes Chapter 3

Lecture 14

Orthogonal vectors & subspaces
nullspace $\perp$ row space
$\text{N}(\boldsymbol A^\mathrm{T} \boldsymbol A) = \text{N}(\boldsymbol A)$
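
A short argument for the last identity (my own note, not on the board): if $\boldsymbol A \boldsymbol x = \boldsymbol 0$ then certainly $\boldsymbol A^\mathrm{T} \boldsymbol A \boldsymbol x = \boldsymbol 0$; conversely,

$ \boldsymbol A^\mathrm{T} \boldsymbol A \boldsymbol x = \boldsymbol 0 \Longrightarrow \boldsymbol x^\mathrm{T} \boldsymbol A^\mathrm{T} \boldsymbol A \boldsymbol x = \| \boldsymbol A \boldsymbol x \|^2 = 0 \Longrightarrow \boldsymbol A \boldsymbol x = \boldsymbol 0 $

so the two nullspaces agree.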

Lecture 15

Projections!
Least squares
PROJECTION MATRIX
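
For reference, projecting $\boldsymbol b$ onto the column space of $\boldsymbol A$ (columns assumed independent, so $\boldsymbol A^\mathrm{T} \boldsymbol A$ is invertible):

$ \hat{\boldsymbol x} = (\boldsymbol A^\mathrm{T} \boldsymbol A)^{-1} \boldsymbol A^\mathrm{T} \boldsymbol b, \quad \boldsymbol p = \boldsymbol A \hat{\boldsymbol x}, \quad \boldsymbol P = \boldsymbol A (\boldsymbol A^\mathrm{T} \boldsymbol A)^{-1} \boldsymbol A^\mathrm{T}, \quad \boldsymbol P^\mathrm{T} = \boldsymbol P, \quad \boldsymbol P^2 = \boldsymbol P $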

Lecture 16

Projections
Least squares and
best straight line
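
A sketch of the best straight line $b = C + Dt$ through a few points, found from the normal equations $\boldsymbol A^\mathrm{T} \boldsymbol A \hat{\boldsymbol x} = \boldsymbol A^\mathrm{T} \boldsymbol b$ (the data points here are my own choice):

```python
import numpy as np

# Fit b = C + D*t through the points (1, 1), (2, 2), (3, 2).
t = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])

A = np.column_stack([np.ones_like(t), t])       # columns: [1, t]
x_hat = np.linalg.solve(A.T @ A, A.T @ b)       # normal equations: A^T A x = A^T b
C, D = x_hat
print(C, D)                                     # 2/3 and 1/2: best line b = 2/3 + t/2

p = A @ x_hat                                   # projection of b onto C(A)
print(np.allclose(A.T @ (b - p), 0))            # True: the error is orthogonal to C(A)
```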

Lecture 17

Orthogonal basis $q_1, q_2, \cdots, q_n$
Orthogonal matrix $\boldsymbol Q$ : square
Gram-Schmidt $\boldsymbol A \to \boldsymbol Q$
Orthonormal vectors $q^\mathrm{T}_i q_j = \begin{cases} 0 & \text{if } i \neq j, \\ 1 & \text{if } i = j.\end{cases}$
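
A sketch of classical Gram-Schmidt, turning independent columns of $\boldsymbol A$ into orthonormal columns of $\boldsymbol Q$ (`numpy.linalg.qr` does the same job; this just spells out the steps, and the example matrix is my own):

```python
import numpy as np

def gram_schmidt(A):
    """Columns of A (assumed independent) -> orthonormal columns of Q."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract projections onto earlier q_i
        Q[:, j] = v / np.linalg.norm(v)          # normalize: q_j has length 1
    return Q

A = np.array([[1.0, 1.0], [1.0, 0.0], [1.0, 2.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))           # True: q_i^T q_j = delta_ij
```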

Lecture 18

Determinants $\text{det } \boldsymbol A$
Properties 1,2,3,4-10
$\pm$ signs

Lecture 19

Formula for $\text{det } \boldsymbol A$
Cofactor formula
Tridiagonal matrices
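
A sketch of the cofactor formula $\det \boldsymbol A = a_{11} C_{11} + \cdots + a_{1n} C_{1n}$, expanded recursively along the first row (my own illustration; exponential cost, fine only for tiny matrices):

```python
import numpy as np

def det_by_cofactors(A):
    """det A by cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)   # remove row 1, column j
        cofactor = (-1) ** j * det_by_cofactors(minor)          # C_1j carries the +/- sign
        total += A[0, j] * cofactor
    return total

A = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -3.0, 4.0]])
print(det_by_cofactors(A), np.linalg.det(A))    # both approximately 6
```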

Lecture 20

Formula for $\boldsymbol A^{-1}$
Cramer's Rule for $\boldsymbol x = \boldsymbol A^{-1} \boldsymbol b$
$|\text{det } \boldsymbol A|$ = volume of a box
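
A quick check of Cramer's Rule, $x_j = \det \boldsymbol B_j / \det \boldsymbol A$ where $\boldsymbol B_j$ is $\boldsymbol A$ with column $j$ replaced by $\boldsymbol b$ (a sketch; `np.linalg.solve` is the practical route), plus the volume interpretation of $|\text{det } \boldsymbol A|$:

```python
import numpy as np

A = np.array([[2.0, -1.0], [-1.0, 2.0]])
b = np.array([0.0, 3.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b                                  # replace column j by b
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)   # Cramer's Rule

print(x, np.linalg.solve(A, b))                   # both give [1, 2]

# |det A| is the volume (here: area) of the box spanned by the columns of A
print(abs(np.linalg.det(A)))                      # 3.0
```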