NUMERICAL ANALYSIS Complete Formula Reference

GUIDE
16 CORE TOPICS & FORMULAS
Error Analysis • Root Finding • Interpolation • Integration • ODE Solvers • Linear Systems • Eigenvalues
Error Analysis
Absolute Error
$$E_a = |x_{\text{true}} - x_{\text{approx}}|$$
Relative Error
$$E_r = \frac{|x_{\text{true}} - x_{\text{approx}}|}{|x_{\text{true}}|}$$
Percent Error
$$E_p = E_r \times 100\%$$
Significant Digits
Counted from the first non-zero digit
$3.14159$: 6 sig figs
$0.00314$: 3 sig figs
Floating Point (IEEE 754)
Standard Form
$$(-1)^s \times m \times 2^e$$
$s$: sign bit
$m$: mantissa (1.xxx...)
$e$: exponent
Machine Epsilon
$$\epsilon_m = 2^{-(p-1)}$$
$p$ = significand precision in bits (53 for double, 24 for single)
Double: $\epsilon_m \approx 2.22 \times 10^{-16}$
Single: $\epsilon_m \approx 1.19 \times 10^{-7}$
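Machine epsilon can be found experimentally by halving until adding to 1.0 no longer changes it; a minimal sketch (function name is illustrative):

```python
def machine_epsilon():
    """Find the smallest eps with 1.0 + eps != 1.0 by repeated halving."""
    eps = 1.0
    while 1.0 + eps / 2 != 1.0:
        eps /= 2
    return eps  # 2**-52 for IEEE double
```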
Bisection Method
Algorithm
Given $f(a) \cdot f(b) < 0$
Find midpoint: $c = \frac{a+b}{2}$
If $f(c) = 0$: root found
Else: replace $[a,b]$ with $[a,c]$ or $[c,b]$, keeping the half where the sign change persists
Convergence
$$|x_n - r| \leq \frac{b-a}{2^n}$$
Linear convergence: $|e_n| \approx 0.5 |e_{n-1}|$
Guaranteed for continuous $f$
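The loop above can be sketched as follows (a minimal version; name and tolerance are illustrative):

```python
def bisect(f, a, b, tol=1e-12, max_iter=200):
    """Bisection: repeatedly halve [a, b], keeping the sign change."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    for _ in range(max_iter):
        c = (a + b) / 2
        fc = f(c)
        if fc == 0 or (b - a) / 2 < tol:
            return c
        if fa * fc < 0:       # root lies in [a, c]
            b, fb = c, fc
        else:                 # root lies in [c, b]
            a, fa = c, fc
    return (a + b) / 2
```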
Newton-Raphson Method
Iteration Formula
$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
Convergence
Quadratic (order 2)
Error: $|e_{n+1}| \approx C|e_n|^2$
Very fast near root
Challenges
Requires derivative $f'(x)$
May diverge if poor initial guess
Repeated (multiple) roots degrade convergence to linear
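A minimal sketch of the iteration, assuming the derivative is supplied by the caller (names are illustrative):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson: follow the tangent line to its x-intercept."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)   # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x
```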
Secant Method
Iteration Formula
$$x_{n+1} = x_n - \frac{f(x_n)(x_n - x_{n-1})}{f(x_n) - f(x_{n-1})}$$
Convergence
Superlinear ($\approx 1.618$)
No derivative needed
Needs two initial points
vs Newton
Slower convergence
More robust, no derivative
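The secant update replaces $f'$ with a finite-difference slope; a minimal sketch (names illustrative):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: Newton with the derivative replaced by a secant slope."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # secant-line x-intercept
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1
```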
Fixed Point Iteration
Method
Rewrite $f(x)=0$ as $x = g(x)$
$$x_{n+1} = g(x_n)$$
Convergence Criterion
$$|g'(x)| < 1 \text{ near fixed point}$$
Geometric convergence
Error Bound
$$|x_n - x^*| \leq \frac{L}{1-L}|x_n - x_{n-1}|$$
$L = \max|g'(x)|$ in interval
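A minimal sketch of the iteration $x_{n+1} = g(x_n)$ (names illustrative). For example, $g = \cos$ has $|g'(x)| = |\sin x| < 1$ near its fixed point $\approx 0.739$:

```python
import math

def fixed_point(g, x0, tol=1e-12, max_iter=500):
    """Iterate x <- g(x) until successive values agree to tol."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```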
Lagrange Interpolation
Basis Polynomials
$$L_j(x) = \prod_{k=0, k \neq j}^{n} \frac{x - x_k}{x_j - x_k}$$
Interpolating Polynomial
$$P_n(x) = \sum_{j=0}^{n} y_j L_j(x)$$
Error
$$|f(x) - P_n(x)| \leq \frac{M}{(n+1)!}|(x-x_0)\cdots(x-x_n)|$$
$M = \max|f^{(n+1)}(x)|$ on interval
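A direct evaluation of $P_n(x) = \sum_j y_j L_j(x)$ can be sketched as (name illustrative; fine for small $n$, though the Newton form below is preferable numerically):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    n = len(xs)
    for j in range(n):
        Lj = 1.0
        for k in range(n):
            if k != j:
                Lj *= (x - xs[k]) / (xs[j] - xs[k])   # basis polynomial L_j(x)
        total += ys[j] * Lj
    return total
```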
Newton Divided Differences
First Differences
$$f[x_i, x_j] = \frac{f(x_j) - f(x_i)}{x_j - x_i}$$
Higher Order
$$f[x_i, \ldots, x_k] = \frac{f[x_{i+1}, \ldots, x_k] - f[x_i, \ldots, x_{k-1}]}{x_k - x_i}$$
Polynomial Form
$P_n(x) = f[x_0] + f[x_0,x_1](x-x_0) + \ldots$
Better numerical stability
Easy to add new points
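The divided-difference table and the nested (Horner-style) evaluation of the Newton form can be sketched as (names illustrative):

```python
def divided_differences(xs, ys):
    """Return Newton coefficients f[x0], f[x0,x1], ... (table built in place)."""
    coeffs = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - j])
    return coeffs

def newton_eval(xs, coeffs, x):
    """Evaluate the Newton form with nested multiplication."""
    result = coeffs[-1]
    for i in range(len(coeffs) - 2, -1, -1):
        result = result * (x - xs[i]) + coeffs[i]
    return result
```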
Spline Interpolation
Cubic Spline
Piecewise cubic on each interval
$C^2$ continuity (smooth 2nd deriv)
Natural Spline (boundary)
$S''(x_0) = S''(x_n) = 0$
Clamped Spline
$S'(x_0)$ and $S'(x_n)$ specified
On interval $[x_i, x_{i+1}]$
$$S_i(x) = a_i + b_i(x-x_i) + c_i(x-x_i)^2 + d_i(x-x_i)^3$$
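Computing the coefficients $(a_i, b_i, c_i, d_i)$ for a natural spline reduces to a tridiagonal solve; a sketch following the standard algorithm (names illustrative):

```python
def natural_cubic_spline(xs, ys):
    """Coefficients of S_i(x) = a_i + b_i(x-x_i) + c_i(x-x_i)^2 + d_i(x-x_i)^3,
    with natural boundary conditions S''(x_0) = S''(x_n) = 0."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    a = list(ys)
    # Right-hand side of the tridiagonal system for the c coefficients
    alpha = [0.0] * (n + 1)
    for i in range(1, n):
        alpha[i] = 3 * (a[i + 1] - a[i]) / h[i] - 3 * (a[i] - a[i - 1]) / h[i - 1]
    # Forward sweep (Thomas algorithm)
    l = [1.0] + [0.0] * n
    mu = [0.0] * (n + 1)
    z = [0.0] * (n + 1)
    for i in range(1, n):
        l[i] = 2 * (xs[i + 1] - xs[i - 1]) - h[i - 1] * mu[i - 1]
        mu[i] = h[i] / l[i]
        z[i] = (alpha[i] - h[i - 1] * z[i - 1]) / l[i]
    # Back substitution
    c = [0.0] * (n + 1)
    b = [0.0] * n
    d = [0.0] * n
    for j in range(n - 1, -1, -1):
        c[j] = z[j] - mu[j] * c[j + 1]
        b[j] = (a[j + 1] - a[j]) / h[j] - h[j] * (2 * c[j] + c[j + 1]) / 3
        d[j] = (c[j + 1] - c[j]) / (3 * h[j])
    return a[:n], b, c[:n], d
```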
Numerical Differentiation
Forward Difference
$$f'(x) \approx \frac{f(x+h) - f(x)}{h}$$
Error: $O(h)$
Backward Difference
$$f'(x) \approx \frac{f(x) - f(x-h)}{h}$$
Error: $O(h)$
Central Difference
$$f'(x) \approx \frac{f(x+h) - f(x-h)}{2h}$$
Error: $O(h^2)$ - More accurate!
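The three difference quotients can be sketched as one-liners (names and step size illustrative):

```python
def forward_diff(f, x, h=1e-5):
    """O(h) forward difference."""
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h=1e-5):
    """O(h^2) central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)
```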
Trapezoidal Rule
Single Interval
$$\int_a^b f(x)dx \approx \frac{b-a}{2}[f(a) + f(b)]$$
Composite (n subintervals)
$$T_n = \frac{h}{2}\left[f(a) + 2\sum_{i=1}^{n-1}f(x_i) + f(b)\right]$$
$h = \frac{b-a}{n}$
Error
$$E_T = -\frac{(b-a)^3}{12n^2}f''(\xi)$$
Error: $O(h^2)$
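The composite rule $T_n$ can be sketched as (name illustrative):

```python
def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals of width h = (b-a)/n."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))      # endpoints weighted 1/2
    for i in range(1, n):
        total += f(a + i * h)        # interior points weighted 1
    return h * total
```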
Simpson's Rule
Simpson's 1/3 (2 intervals)
$$\int_a^b f(x)dx \approx \frac{b-a}{6}[f(a) + 4f(\frac{a+b}{2}) + f(b)]$$
Simpson's 3/8 (3 intervals, $h = \frac{b-a}{3}$)
$$\int_a^b f(x)dx \approx \frac{3h}{8}[f(x_0) + 3f(x_1) + 3f(x_2) + f(x_3)]$$
Composite Simpson's 1/3
$$S_n = \frac{h}{3}\left[f(a) + 4\sum_{i\ \text{odd}}f(x_i) + 2\sum_{i\ \text{even}}f(x_i) + f(b)\right]$$
(sums over interior points only; $n$ must be even)
Error: $O(h^4)$ - Very accurate!
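The composite 1/3 rule alternates weights 4 and 2 on interior points; a minimal sketch (name illustrative):

```python
def simpson(f, a, b, n):
    """Composite Simpson's 1/3 rule; n must be even."""
    assert n % 2 == 0, "Simpson's 1/3 rule needs an even number of subintervals"
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 == 1 else 2) * f(a + i * h)
    return h * total / 3
```

Note that the rule is exact for cubics, since the $O(h^4)$ error term involves $f^{(4)}$.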
Gaussian Quadrature
General Formula
$$\int_a^b f(x)dx \approx \sum_{i=1}^{n} w_i f(x_i)$$
$w_i$: weights, $x_i$: nodes
Gauss-Legendre ($[-1,1]$)
Optimal node placement
$n$ points: exact for polynomials of degree $\leq 2n-1$
Transform to $[a,b]$
$$\int_a^b f(x)dx = \frac{b-a}{2}\int_{-1}^{1} f\left(\frac{b-a}{2}t + \frac{a+b}{2}\right)dt$$
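For $n = 2$ the Gauss-Legendre nodes are $\pm 1/\sqrt{3}$ with both weights equal to 1; combining this with the transform above gives (name illustrative):

```python
import math

def gauss2(f, a, b):
    """Two-point Gauss-Legendre quadrature on [a, b]; exact for cubics."""
    nodes = (-1 / math.sqrt(3), 1 / math.sqrt(3))   # roots of P_2(t)
    mid, half = (a + b) / 2, (b - a) / 2
    # Both weights are 1; the factor (b-a)/2 comes from the change of variable
    return half * sum(f(half * t + mid) for t in nodes)
```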
LU Decomposition
Factorization
$$A = LU$$
$L$: lower triangular (1s on diag)
$U$: upper triangular
Solve $Ax = b$
1) Solve $Ly = b$ (forward subst.)
2) Solve $Ux = y$ (back subst.)
with Partial Pivoting
$PA = LU$
More numerically stable
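The two-step solve can be sketched with a Doolittle factorization (no pivoting, so it assumes nonzero pivots; names illustrative):

```python
def lu_solve(A, b):
    """Solve Ax = b via A = LU, then Ly = b (forward) and Ux = y (backward)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1.0                                  # unit diagonal on L
        for j in range(i, n):
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    # Forward substitution: Ly = b
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    # Back substitution: Ux = y
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x
```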
Iterative Methods
Jacobi Method
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j \neq i} a_{ij}x_j^{(k)}\right)$$
Gauss-Seidel
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j<i} a_{ij}x_j^{(k+1)} - \sum_{j>i} a_{ij}x_j^{(k)}\right)$$
Convergence
Both: strictly diagonally dominant $A$
Gauss-Seidel: also SPD; usually faster
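Gauss-Seidel uses each updated component immediately (Jacobi would use only the previous iterate); a minimal sketch (names illustrative):

```python
def gauss_seidel(A, b, x0=None, tol=1e-12, max_iter=500):
    """Gauss-Seidel iteration: overwrite x[i] in place within each sweep."""
    n = len(A)
    x = list(x0) if x0 else [0.0] * n
    for _ in range(max_iter):
        x_old = list(x)
        for i in range(n):
            # Uses x[j] for j < i already updated this sweep (the k+1 terms)
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
        if max(abs(x[i] - x_old[i]) for i in range(n)) < tol:
            break
    return x
```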
Eigenvalues & ODE Solvers
Power Method (largest eigenvalue)
$\mathbf{x}^{(k+1)} = A\mathbf{x}^{(k)}$
$\lambda \approx \frac{\mathbf{x}^{(k+1)} \cdot \mathbf{x}^{(k)}}{\mathbf{x}^{(k)} \cdot \mathbf{x}^{(k)}}$ (Rayleigh quotient)
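Power iteration with normalization, plus the Rayleigh quotient for the eigenvalue estimate, can be sketched as (names illustrative; assumes a dominant eigenvalue exists):

```python
import math

def power_method(A, x0, iters=100):
    """Power iteration: repeated A*x converges toward the dominant eigenvector."""
    n = len(A)
    x = list(x0)
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(v * v for v in y))
        x = [v / norm for v in y]               # normalize to avoid overflow
    # Rayleigh quotient (x is unit length, so lambda ~ x . Ax)
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    lam = sum(x[i] * Ax[i] for i in range(n))
    return lam, x
```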
Euler's Method (ODE)
$$y_{n+1} = y_n + h f(t_n, y_n)$$
Simple; global error $O(h)$
Runge-Kutta 4
$k_1 = f(t_n, y_n)$
$k_2 = f(t_n+h/2, y_n+hk_1/2)$
$k_3 = f(t_n+h/2, y_n+hk_2/2)$
$k_4 = f(t_n+h, y_n+hk_3)$
$y_{n+1} = y_n + \frac{h}{6}(k_1+2k_2+2k_3+k_4)$
Global error $O(h^4)$
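Both single-step updates, plus a small driver loop, can be sketched as (names illustrative):

```python
def euler_step(f, t, y, h):
    """One Euler step: y_{n+1} = y_n + h f(t_n, y_n)."""
    return y + h * f(t, y)

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, h, steps, step=rk4_step):
    """Advance y' = f(t, y), y(t0) = y0, by `steps` steps of size h."""
    t, y = t0, y0
    for _ in range(steps):
        y = step(f, t, y, h)
        t += h
    return y
```

For example, on $y' = y$, $y(0)=1$ over $[0,1]$ with $h = 0.01$, RK4 matches $e$ to about eight digits while Euler is off in the second decimal, illustrating the $O(h^4)$ vs $O(h)$ gap.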