Matrix Operations
Addition: \((A + B)_{ij} = a_{ij} + b_{ij}\) (requires the same dimensions)
Scalar Multiplication: \((cA)_{ij} = c\,a_{ij}\)
Matrix Multiplication: \((AB)_{ij} = \sum_{k} a_{ik} b_{kj}\) (columns of A must equal rows of B)
Properties:
- Associative: \((AB)C = A(BC)\)
- Distributive: \(A(B+C) = AB + AC\)
- Not commutative in general: \(AB \neq BA\)
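These rules can be checked numerically; a small NumPy sketch with arbitrary example matrices:

```python
import numpy as np

# Arbitrary 2x2 example matrices
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])
C = np.array([[2, 0],
              [1, 1]])

assert np.array_equal((A @ B) @ C, A @ (B @ C))      # associative
assert np.array_equal(A @ (B + C), A @ B + A @ C)    # distributive
assert not np.array_equal(A @ B, B @ A)              # not commutative here
```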
Special Matrices
Identity Matrix: \(I\), ones on the main diagonal and zeros elsewhere; \(AI = IA = A\)
Diagonal Matrix: all off-diagonal entries are zero
Triangular:
- Upper: entries below diagonal = 0
- Lower: entries above diagonal = 0
Symmetric: \(A^T = A\)
Transpose & Properties
Definition: \((A^T)_{ij} = a_{ji}\) (rows and columns swapped)
Properties:
- \((A^T)^T = A\)
- \((A + B)^T = A^T + B^T\)
- \((cA)^T = c(A^T)\)
- \((AB)^T = B^T A^T\)
Trace: \(\text{tr}(A) = \sum_{i} a_{ii}\) (sum of the diagonal entries)
Matrix Inverse
Definition: \(AA^{-1} = A^{-1}A = I\); exists iff \(\det(A) \neq 0\)
2×2 Inverse: \(\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\)
Properties:
- \((A^{-1})^{-1} = A\)
- \((AB)^{-1} = B^{-1}A^{-1}\)
- \((A^T)^{-1} = (A^{-1})^T\)
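A NumPy sketch of the 2×2 inverse formula, checked against the library routine (example values arbitrary):

```python
import numpy as np

A = np.array([[4., 7.],
              [2., 6.]])
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # ad - bc = 24 - 14 = 10
A_inv = (1 / det) * np.array([[ A[1, 1], -A[0, 1]],
                              [-A[1, 0],  A[0, 0]]])

assert np.allclose(A_inv, np.linalg.inv(A))   # matches the library inverse
assert np.allclose(A @ A_inv, np.eye(2))      # A A^{-1} = I
```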
Determinants
2×2 Determinant: \(\det\begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc\)
3×3 Determinant (Rule of Sarrus): \(\det(A) = a_{11}a_{22}a_{33} + a_{12}a_{23}a_{31} + a_{13}a_{21}a_{32} - a_{13}a_{22}a_{31} - a_{11}a_{23}a_{32} - a_{12}a_{21}a_{33}\)
Properties:
- \(\det(AB) = \det(A)\det(B)\)
- \(\det(A^T) = \det(A)\)
- \(\det(A^{-1}) = \frac{1}{\det(A)}\)
- \(\det(cA) = c^n\det(A)\)
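Each of these identities can be verified numerically; a NumPy sketch using random 3×3 matrices (almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
c, n = 2.0, 3   # scalar and matrix size

assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))
assert np.isclose(np.linalg.det(c * A), c**n * np.linalg.det(A))
```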
Cofactor Expansion
Expansion along row i: \(\det(A) = \sum_{j=1}^{n} a_{ij} C_{ij}\)
Cofactor: \(C_{ij} = (-1)^{i+j} M_{ij}\),
where \(M_{ij}\) is the minor (determinant of the submatrix obtained by deleting row i and column j).
Adjugate Matrix: \(\text{adj}(A) = C^T\), giving \(A^{-1} = \frac{1}{\det(A)}\,\text{adj}(A)\)
Gaussian Elimination
Row Operations:
- Swap two rows
- Multiply row by nonzero scalar
- Add multiple of one row to another
Row Echelon Form:
- All zero rows at bottom
- Each leading entry (pivot) lies strictly to the right of the pivot in the row above
- Reduced row echelon form additionally requires each pivot to be 1 and the only nonzero entry in its column
Back Substitution:
Solve \(Ux = b\) where U is upper triangular.
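Back substitution is short to implement directly; a minimal NumPy sketch (the triangular system is an arbitrary example):

```python
import numpy as np

def back_substitute(U, b):
    """Solve Ux = b for upper-triangular U, working from the last row up."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # x_i = (b_i - sum of known terms) / pivot
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[2., 1., 1.],
              [0., 3., 2.],
              [0., 0., 4.]])
b = np.array([5., 8., 8.])
x = back_substitute(U, b)
assert np.allclose(U @ x, b)
```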
Systems of Linear Equations
Matrix Form: \(Ax = b\)
where A is coefficient matrix, x is variables, b is constants.
Solution (if \(\det(A) \neq 0\)): \(x = A^{-1}b\)
Consistency:
- Unique solution if \(\text{rank}(A) = \text{rank}([A|b]) = n\)
- Infinite solutions if \(\text{rank}(A) = \text{rank}([A|b]) < n\)
- No solution if \(\text{rank}(A) < \text{rank}([A|b])\)
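The rank test above translates directly into code; a NumPy sketch with illustrative systems for each case:

```python
import numpy as np

def classify(A, b):
    """Classify Ax = b by comparing rank(A) with rank of the augmented [A|b]."""
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    n = A.shape[1]
    if rank_A < rank_Ab:
        return "no solution"
    return "unique" if rank_A == n else "infinite"

A = np.array([[1., 2.], [3., 4.]])     # invertible
A2 = np.array([[1., 2.], [2., 4.]])    # rank 1 (rows proportional)

assert classify(A, np.array([5., 6.])) == "unique"
assert classify(A2, np.array([1., 3.])) == "no solution"
assert classify(A2, np.array([1., 2.])) == "infinite"
```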
LU Decomposition
Factorization: \(A = LU\)
where L is lower triangular, U is upper triangular.
Solving Ax = b:
- Solve \(Ly = b\) by forward substitution
- Solve \(Ux = y\) by back substitution
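A bare-bones Doolittle-style LU sketch, without the pivoting a production routine would use (assumes nonzero pivots; example matrix arbitrary):

```python
import numpy as np

def lu_decompose(A):
    """LU factorization without pivoting: eliminate below each pivot,
    recording the multipliers in L."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]        # elimination multiplier
            U[i, k:] -= L[i, k] * U[k, k:]     # zero out entry (i, k)
    return L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
L, U = lu_decompose(A)
assert np.allclose(L @ U, A)                          # A = LU
assert np.allclose(np.tril(L), L) and np.allclose(np.triu(U), U)
```

With L and U in hand, Ax = b reduces to the two triangular solves listed above.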
Vector Spaces
Span:
Set of all linear combinations of vectors \(v_1, ..., v_k\): \(\text{span}\{v_1, ..., v_k\} = \{c_1 v_1 + \cdots + c_k v_k : c_i \in \mathbb{R}\}\)
Linear Independence:
\(v_1, ..., v_k\) are linearly independent if \(c_1 v_1 + \cdots + c_k v_k = 0\) only when \(c_1 = \cdots = c_k = 0\).
Basis: Linearly independent set that spans the space.
Rank & Nullity
Rank: Number of linearly independent rows/columns
Nullity: Dimension of null space (solution space of \(Ax = 0\))
Rank-Nullity Theorem: \(\text{rank}(A) + \text{nullity}(A) = n\)
where n is the number of columns.
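A quick numerical check of the theorem, estimating nullity by counting near-zero singular values (example matrix chosen rank-deficient):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],     # = 2 * row 1, so rank < 3
              [1., 0., 1.]])

n = A.shape[1]
rank = np.linalg.matrix_rank(A)

# Nullity = number of (numerically) zero singular values for a square matrix
_, s, _ = np.linalg.svd(A)
nullity = int(np.sum(s < 1e-10))

assert rank == 2 and nullity == 1
assert rank + nullity == n        # rank-nullity theorem
```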
Eigenvalues & Eigenvectors
Definition: \(Av = \lambda v\) for a nonzero vector v (eigenvector) and scalar \(\lambda\) (eigenvalue)
Characteristic Equation: \(\det(A - \lambda I) = 0\)
Eigenspace: \(E_\lambda = \text{null}(A - \lambda I)\)
Trace & Determinant:
- \(\sum \lambda_i = \text{tr}(A)\)
- \(\prod \lambda_i = \det(A)\)
Diagonalization
Diagonal Form: \(A = PDP^{-1}\)
where D is diagonal matrix of eigenvalues, P is matrix of eigenvectors.
Conditions:
- A must have n linearly independent eigenvectors
- Real symmetric matrices are always diagonalizable (with an orthogonal P)
Powers of A: \(A^k = PD^kP^{-1}\)
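A NumPy sketch of computing a matrix power through the eigendecomposition (example matrix symmetric, hence diagonalizable):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors
# A^5 = P D^5 P^{-1}: only the diagonal entries get raised to the power
A5 = P @ np.diag(eigvals**5) @ np.linalg.inv(P)

assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```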
Orthogonality
Dot Product: \(u \cdot v = u^T v = \sum_i u_i v_i\)
Orthogonal Vectors: \(u \cdot v = 0\)
Vector Norm: \(\|u\| = \sqrt{u \cdot u}\)
Projection of u onto v: \(\text{proj}_v u = \frac{u \cdot v}{v \cdot v}\, v\)
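The projection formula in NumPy, with the key property that the residual is orthogonal to v (example vectors arbitrary):

```python
import numpy as np

u = np.array([3., 4.])
v = np.array([1., 0.])

proj = (u @ v) / (v @ v) * v          # proj_v(u) = (u.v / v.v) v
assert np.allclose(proj, [3., 0.])    # component of u along the x-axis

# The residual u - proj is orthogonal to v
assert np.isclose((u - proj) @ v, 0.0)
```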
Gram-Schmidt Process
Convert a linearly independent set \(\{v_1, ..., v_n\}\) into an orthonormal basis.
Algorithm: \(u_k = v_k - \sum_{j=1}^{k-1} \text{proj}_{u_j} v_k\)
Normalization: \(e_k = u_k / \|u_k\|\)
Result: \(\{e_1, e_2, ..., e_n\}\) is orthonormal basis.
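A direct NumPy implementation of the process above (classical Gram-Schmidt; example vectors are arbitrary independent rows):

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the rows of V (assumed linearly independent)."""
    basis = []
    for v in V:
        u = v - sum((v @ e) * e for e in basis)  # subtract projections onto earlier e's
        basis.append(u / np.linalg.norm(u))      # normalize
    return np.array(basis)

V = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.]])
E = gram_schmidt(V)
assert np.allclose(E @ E.T, np.eye(3))   # rows of E are orthonormal
```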
QR Decomposition
Factorization: \(A = QR\)
where Q has orthonormal columns, R is upper triangular.
Computing QR:
Apply Gram-Schmidt to columns of A.
Solving Least Squares: solve \(Rx = Q^T b\) by back substitution
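A NumPy sketch of QR-based least squares, fitting a line to four illustrative data points and checking against the library solver:

```python
import numpy as np

# Overdetermined system: fit y = c0 + c1*t to 4 points (example data)
t = np.array([0., 1., 2., 3.])
y = np.array([1., 2.9, 5.1, 7.0])
A = np.column_stack([np.ones_like(t), t])

Q, R = np.linalg.qr(A)            # A = QR, Q has orthonormal columns
x = np.linalg.solve(R, Q.T @ y)   # solve R x = Q^T y

assert np.allclose(x, np.linalg.lstsq(A, y, rcond=None)[0])
```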
Singular Value Decomposition (SVD)
Factorization: \(A = U\Sigma V^T\)
where U, V orthogonal; \(\Sigma\) diagonal with singular values \(\sigma_i \geq 0\).
Computing:
- Eigenvalues of \(AA^T\) give \(\sigma_i^2\)
- Eigenvectors of \(AA^T\) give columns of U
- Eigenvectors of \(A^TA\) give columns of V
Pseudoinverse: \(A^+ = V\Sigma^+ U^T\), where \(\Sigma^+\) inverts the nonzero singular values
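Building the pseudoinverse from an SVD by hand, checked against NumPy's `pinv` (example 3×2 matrix arbitrary):

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 2.],
              [0., 0.]])   # 3x2, rank 2

U, s, Vt = np.linalg.svd(A)                 # A = U diag(s) V^T
# Sigma^+ has the transposed shape, with nonzero singular values inverted
Sigma_plus = np.zeros((2, 3))
Sigma_plus[:2, :2] = np.diag(1 / s)
A_plus = Vt.T @ Sigma_plus @ U.T            # A^+ = V Sigma^+ U^T

assert np.allclose(A_plus, np.linalg.pinv(A))
```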
Least Squares Approximation
Problem: Minimize \(\|Ax - b\|^2\)
Normal Equations: \(A^TA\hat{x} = A^Tb\)
Solution: \(\hat{x} = (A^TA)^{-1}A^Tb\) (when the columns of A are independent)
Residual: \(r = b - A\hat{x}\)
Optimal approximation: \(A\hat{x}\) is the projection of b onto the column space of A
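The normal equations in NumPy, with a check that the residual is orthogonal to the column space (example system arbitrary):

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([1., 2., 2.])

# Solve A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
r = b - A @ x_hat                           # residual

assert np.allclose(A.T @ r, 0.0)            # r is orthogonal to col(A)
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```

In practice QR or SVD is preferred over forming \(A^TA\), which squares the condition number.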
Markov Chains
Transition Matrix: \(P\) with \(P_{ij} \geq 0\) and columns summing to 1
State at time n: \(x_n = P^n x_0\)
Stationary Distribution: \(P\pi = \pi\) (eigenvector for \(\lambda = 1\), entries summing to 1)
Properties:
- All eigenvalues have \(|\lambda| \leq 1\)
- Always has eigenvalue \(\lambda = 1\)
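Finding the stationary distribution as the \(\lambda = 1\) eigenvector; a NumPy sketch with an illustrative 2-state chain:

```python
import numpy as np

# Column-stochastic transition matrix (columns sum to 1)
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

eigvals, eigvecs = np.linalg.eig(P)
i = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, i])
pi = pi / pi.sum()                     # rescale into a probability vector

assert np.allclose(P @ pi, pi)         # stationary: P pi = pi
assert np.isclose(pi.sum(), 1.0)
```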
Rotations & Computer Graphics
2D Rotation (angle θ): \(R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}\)
3D Rotation about z-axis: \(R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}\)
Homogeneous Coordinates (2D): 3×3 matrices that combine rotation and translation \((t_x, t_y)\): \(\begin{pmatrix} \cos\theta & -\sin\theta & t_x \\ \sin\theta & \cos\theta & t_y \\ 0 & 0 & 1 \end{pmatrix}\)
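The 2D rotation matrix in NumPy, verified on a 90° rotation of the x-axis unit vector:

```python
import numpy as np

def rotation_2d(theta):
    """2D rotation matrix for angle theta (radians, counterclockwise)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation_2d(np.pi / 2)                 # 90-degree rotation
v = np.array([1., 0.])

assert np.allclose(R @ v, [0., 1.])        # x-axis maps to y-axis
assert np.isclose(np.linalg.det(R), 1.0)   # rotations preserve orientation
```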
Matrix Norms
Frobenius Norm: \(\|A\|_F = \sqrt{\sum_{i,j} a_{ij}^2}\)
Spectral Norm: \(\|A\|_2 = \sigma_{\max}(A)\) (largest singular value)
Operator Norms:
- \(\|A\|_1\) = max column sum
- \(\|A\|_\infty\) = max row sum
Condition Number: \(\kappa(A) = \|A\|\,\|A^{-1}\|\); large \(\kappa\) signals an ill-conditioned system
Inner Product Spaces
Inner Product: \(\langle u, v \rangle\), generalizing the dot product (e.g. \(\langle u, v \rangle = u^T v\) on \(\mathbb{R}^n\))
Properties:
- Symmetric: \(\langle u, v \rangle = \langle v, u \rangle\)
- Linear: \(\langle au + bw, v \rangle = a\langle u, v \rangle + b\langle w, v \rangle\)
- Positive: \(\langle u, u \rangle > 0\) for \(u \neq 0\)
Cauchy-Schwarz Inequality: \(|\langle u, v \rangle| \leq \|u\|\,\|v\|\)
Orthogonal Matrices
Definition: \(Q^TQ = QQ^T = I\)
Equivalently: \(Q^{-1} = Q^T\)
Properties:
- \(\det(Q) = \pm 1\)
- \(\|Qx\| = \|x\|\) (preserves length)
- Eigenvalues have \(|\lambda| = 1\)
- Columns/rows are orthonormal
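All four properties can be confirmed on a rotation matrix, which is orthogonal; a NumPy sketch (angle arbitrary):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrices are orthogonal

assert np.allclose(Q.T @ Q, np.eye(2))            # Q^T Q = I (orthonormal columns)
assert np.allclose(np.linalg.inv(Q), Q.T)         # Q^{-1} = Q^T
assert np.isclose(abs(np.linalg.det(Q)), 1.0)     # det = +/- 1
x = np.array([3., 4.])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # length preserved
```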
Computational Applications
Solving Ax = b: Gaussian elimination, LU decomposition
Least Squares: Normal equations, QR decomposition
Eigenvalue Problems: QR algorithm, power method
Low-Rank Approximation: Truncated SVD
Data Analysis: PCA via eigendecomposition
Image Compression: SVD-based rank reduction