
Multistep Methods

Multistep methods use information from several previous steps to advance the solution. They achieve high order with fewer function evaluations per step than Runge-Kutta methods but require careful initialization and have restricted stability regions.


Adams Methods

Definition 7.4 (Adams-Bashforth Methods)

The $k$-step Adams-Bashforth (AB) method is an explicit multistep method: $y_{n+1} = y_n + h \sum_{j=0}^{k-1} b_j f(t_{n-j}, y_{n-j})$, where the coefficients $b_j$ are determined by requiring exactness for polynomials of degree $\leq k-1$. Key instances:

  • AB1 (Euler): $y_{n+1} = y_n + h f_n$
  • AB2: $y_{n+1} = y_n + \frac{h}{2}(3f_n - f_{n-1})$
  • AB3: $y_{n+1} = y_n + \frac{h}{12}(23f_n - 16f_{n-1} + 5f_{n-2})$
  • AB4: $y_{n+1} = y_n + \frac{h}{24}(55f_n - 59f_{n-1} + 37f_{n-2} - 9f_{n-3})$
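The AB4 formula above can be sketched as follows. This is a minimal illustration, not a production integrator; since AB4 needs four previous $f$-values before it can start, the sketch bootstraps the first three steps with classical RK4 (one common choice; any sufficiently accurate starter works). The function names and test problem ($y' = -y$) are chosen here for illustration.

```python
import math

def ab4(f, t0, y0, h, n_steps):
    """Advance y' = f(t, y) with the 4-step Adams-Bashforth method.

    The first three steps are bootstrapped with classical RK4; after
    that, each AB4 step costs only one new function evaluation.
    """
    def rk4_step(t, y):
        k1 = f(t, y)
        k2 = f(t + h/2, y + h/2 * k1)
        k3 = f(t + h/2, y + h/2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

    ts, ys = [t0], [y0]
    for _ in range(3):                      # startup values y_1, y_2, y_3
        ys.append(rk4_step(ts[-1], ys[-1]))
        ts.append(ts[-1] + h)
    fs = [f(t, y) for t, y in zip(ts, ys)]  # history f_0 .. f_3
    for n in range(3, n_steps):
        y_next = ys[-1] + h/24 * (55*fs[-1] - 59*fs[-2]
                                  + 37*fs[-3] - 9*fs[-4])
        ts.append(ts[-1] + h)
        ys.append(y_next)
        fs.append(f(ts[-1], y_next))        # one evaluation per step
    return ts, ys

# Test problem: y' = -y, y(0) = 1, exact solution e^{-t}.
ts, ys = ab4(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
print(abs(ys[-1] - math.exp(-1.0)))        # small O(h^4) error at t = 1
```

Note the contrast with RK4: after startup, AB4 reuses the stored history `fs` instead of recomputing intermediate stages, which is the efficiency advantage mentioned at the top of this section.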
Definition 7.5 (Adams-Moulton Methods)

The $k$-step Adams-Moulton (AM) method is implicit: $y_{n+1} = y_n + h \sum_{j=-1}^{k-1} b_j f(t_{n-j}, y_{n-j})$ (note that $f_{n+1}$ appears, corresponding to $j = -1$). Key instances:

  • AM0 (backward Euler): $y_{n+1} = y_n + h f_{n+1}$
  • AM1 (trapezoidal): $y_{n+1} = y_n + \frac{h}{2}(f_{n+1} + f_n)$
  • AM2: $y_{n+1} = y_n + \frac{h}{12}(5f_{n+1} + 8f_n - f_{n-1})$

AM methods are one order higher than AB methods with the same number of steps and have larger stability regions, at the cost of solving an implicit equation for $y_{n+1}$ at each step.
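The implicit equation can be solved by fixed-point iteration when $h$ is small enough. A minimal sketch for the AM1 (trapezoidal) step, using the explicit Euler value as the starting guess (the function name, tolerances, and test problem are illustrative choices, not a standard API):

```python
import math

def trapezoidal_step(f, t, y, h, tol=1e-12, max_iter=50):
    """One step of AM1 (trapezoidal rule): solve the implicit equation
        y_{n+1} = y_n + h/2 * (f(t+h, y_{n+1}) + f(t, y_n))
    by fixed-point iteration, started from an explicit Euler predictor.
    Converges when h * L / 2 < 1, with L a Lipschitz constant of f.
    """
    fn = f(t, y)
    y_new = y + h * fn                      # explicit Euler guess
    for _ in range(max_iter):
        y_next = y + h/2 * (f(t + h, y_new) + fn)
        if abs(y_next - y_new) < tol:
            return y_next
        y_new = y_next
    return y_new

# Integrate y' = -y from t = 0 to t = 1.
y, t, h = 1.0, 0.0, 0.01
for _ in range(100):
    y = trapezoidal_step(lambda t, y: -y, t, y, h)
    t += h
print(abs(y - math.exp(-1.0)))             # O(h^2) error
```

For stiff problems, fixed-point iteration fails (it requires $h$ as small as an explicit method would), so Newton iteration is used instead, as discussed with BDF methods below.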


Predictor-Corrector Methods

Example (Adams-Bashforth-Moulton Pair)

A predictor-corrector pair uses AB$k$ to predict $y_{n+1}^P$ (explicit), evaluates $f(t_{n+1}, y_{n+1}^P)$, then applies AM$(k-1)$ to correct: $y_{n+1}^C = y_n + h \sum_j b_j f_{n-j}$ using $f_{n+1} \approx f(t_{n+1}, y_{n+1}^P)$. The AB4/AM3 pair in PECE mode (Predict, Evaluate, Correct, Evaluate): predict with AB4, evaluate $f_{n+1}^P$, correct with AM3, evaluate $f_{n+1}^C$. This gives fourth-order accuracy with only two function evaluations per step.
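One PECE step of the AB4/AM3 pair can be sketched as below; the AM3 coefficients $\frac{h}{24}(9f_{n+1} + 19f_n - 5f_{n-1} + f_{n-2})$ are the standard 3-step Adams-Moulton formula. The driver seeds the history with exact values for simplicity (a real code would bootstrap with a one-step method); names and the test problem are illustrative.

```python
import math

def pece_step(f, ts, ys, fs, h):
    """One AB4/AM3 step in PECE mode. Needs the four stored values
    f_{n-3}..f_n in fs; returns (t_{n+1}, y_{n+1}, f_{n+1}).
    Exactly two f evaluations per step.
    """
    t_new = ts[-1] + h
    # P: AB4 predictor
    y_p = ys[-1] + h/24 * (55*fs[-1] - 59*fs[-2] + 37*fs[-3] - 9*fs[-4])
    # E: evaluate f at the predicted value
    f_p = f(t_new, y_p)
    # C: AM3 corrector (3-step Adams-Moulton, order 4)
    y_c = ys[-1] + h/24 * (9*f_p + 19*fs[-1] - 5*fs[-2] + fs[-3])
    # E: evaluate f at the corrected value for the next step's history
    return t_new, y_c, f(t_new, y_c)

# Test problem y' = -y; seed the history with exact values e^{-t}.
h = 0.01
ts = [i * h for i in range(4)]
ys = [math.exp(-t) for t in ts]
fs = [-y for y in ys]
for _ in range(97):                         # advance from t = 0.03 to t = 1
    t, y, fv = pece_step(lambda t, y: -y, ts, ys, fs, h)
    ts.append(t); ys.append(y); fs.append(fv)
print(abs(ys[-1] - math.exp(-1.0)))        # fourth-order accurate
```

The difference $|y_{n+1}^C - y_{n+1}^P|$ is also a cheap local error estimate, which is why production Adams codes run in predictor-corrector form.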


BDF Methods

Definition 7.6 (Backward Differentiation Formulas)

The $k$-step BDF method is $\sum_{j=0}^{k} \alpha_j y_{n+1-j} = h \beta_0 f(t_{n+1}, y_{n+1})$ (only $f_{n+1}$ appears on the right). BDF methods are designed for stiff problems:

  • BDF1 (backward Euler): $y_{n+1} - y_n = h f_{n+1}$
  • BDF2: $\frac{3}{2}y_{n+1} - 2y_n + \frac{1}{2}y_{n-1} = h f_{n+1}$
  • BDF$k$ for $k \leq 6$ is stable; BDF7 and higher are unstable

BDF methods are A($\alpha$)-stable: stable in a sector $|\arg(-\lambda h)| < \alpha$, where $\alpha$ decreases from $90°$ (BDF1-2, which are A-stable) to $\sim 18°$ (BDF6).
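To illustrate why the implicit form pays off on stiff problems, here is a minimal BDF2 sketch for a *linear* ODE $y' = \lambda y + g(t)$, where the implicit equation can be solved in closed form (a nonlinear $f$ would need Newton iteration). The test problem $y' = -1000(y - \cos t) - \sin t$, with exact solution $y = \cos t$, is a standard stiff example; the step $h = 0.01$ gives $|\lambda h| = 10$, far outside any explicit method's stability region. Function names and the startup strategy are illustrative assumptions.

```python
import math

def bdf2_linear(lam, g, t0, y0, y1, h, n_steps):
    """BDF2 for the linear ODE y' = lam*y + g(t). Each step solves
        (3/2) y_{n+1} - 2 y_n + (1/2) y_{n-1} = h (lam*y_{n+1} + g(t_{n+1}))
    for y_{n+1} in closed form. BDF2 needs one startup value y1
    (here supplied by the caller).
    """
    t, y_prev, y = t0 + h, y0, y1
    for _ in range(n_steps - 1):
        t += h
        y_new = (2*y - 0.5*y_prev + h * g(t)) / (1.5 - h * lam)
        y_prev, y = y, y_new
    return t, y

# Stiff test: y' = -1000 (y - cos t) - sin t, exact solution y = cos t.
lam = -1000.0
g = lambda t: 1000.0 * math.cos(t) - math.sin(t)
h = 0.01                                   # |lam * h| = 10: explicit methods blow up
y1 = math.cos(h)                           # startup value (exact, for simplicity)
t, y = bdf2_linear(lam, g, 0.0, 1.0, y1, h, 100)
print(t, abs(y - math.cos(t)))             # accurate despite the large step
```

Running the same problem with any of the explicit AB methods at this step size diverges immediately, since $\lambda h = -10$ lies far outside their stability regions.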

Remark (Stiff ODE Solvers in Practice)

Modern stiff ODE solvers (LSODE, VODE, CVODE, MATLAB's ode15s) are variable-order, variable-step BDF or NDF (numerical differentiation formulas) implementations. They automatically adjust the order $k$ (between 1 and 5) and step size $h$ based on error and stability estimates. Newton iteration solves the implicit equations, with the Jacobian $\partial f / \partial y$ updated only when convergence slows, amortizing the $O(n^3)$ factorization cost.
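The Jacobian-reuse idea can be shown in miniature for a scalar problem, where "reusing the factored Jacobian" reduces to reusing a previously computed derivative value in a simplified Newton iteration. This is a toy sketch of the strategy, not how any of the solvers above is actually structured; the function names and refresh policy are assumptions for illustration.

```python
def bdf1_step_frozen_newton(f, dfdy, t, y, h, J=None, tol=1e-10, max_iter=20):
    """One backward-Euler (BDF1) step: solve
        F(Y) = Y - y - h * f(t + h, Y) = 0
    with a simplified Newton iteration that reuses a cached derivative
    J = df/dy (the scalar analogue of reusing an LU-factored Jacobian).
    J is recomputed only when the caller passes J=None; a real solver
    would also refresh it when the iteration converges too slowly.
    Returns (Y, J) so the caller can pass J along to later steps.
    """
    if J is None:
        J = dfdy(t + h, y)                  # (re)compute the "Jacobian"
    Y = y                                   # initial guess: previous value
    for _ in range(max_iter):
        F = Y - y - h * f(t + h, Y)
        dY = -F / (1.0 - h * J)             # frozen-Jacobian Newton update
        Y += dY
        if abs(dY) < tol:
            break
    return Y, J

# Stiff scalar test: y' = -1000 y. For this linear f the cached J is
# exact, so the simplified Newton iteration converges in one update.
f = lambda t, y: -1000.0 * y
dfdy = lambda t, y: -1000.0
y, t, h, J = 1.0, 0.0, 0.01, None
for _ in range(10):
    y, J = bdf1_step_frozen_newton(f, dfdy, t, y, h, J)
    t += h
print(y)                                    # stiff mode damped toward 0
```

For a system, `J` would be the LU-factored matrix $I - h\beta_0 \, \partial f/\partial y$, and each Newton update would be a cheap back-substitution rather than a fresh $O(n^3)$ factorization.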