The Nordsieck History Array
A step from time \(t_{n-1}\) to \(t_n\) with step size \(h_n = t_{n} - t_{n-1}\) using a method of order \(q\) begins with the predictor polynomial interpolant, \(\pi_{p,n-1}\), of degree \(q\) or less. For Adams methods, the predictor interpolant satisfies the \(q + 1\) conditions
Similarly for BDF methods, the predictor interpolant satisfies
The solution at \(t_n\) is obtained by constructing the corrector polynomial interpolant, \(\pi_{c,n}\), of degree \(q\) or less. For Adams methods, the corrector interpolant satisfies the \(q + 2\) conditions
Similarly for variable coefficient (VC) BDF methods, the corrector interpolant satisfies
For fixed-leading-coefficient (FLC) BDF methods, the corrector interpolant satisfies the conditions
Thus, the FLC corrector interpolates the predictor at evenly spaced past points rather than interpolating the computed values as with the VC corrector.
With either Adams or BDF methods the solution history is represented using the Nordsieck array given by
where the derivatives, \(y^{(j)}_{n-1}\), are approximated by \(\pi^{(j)}_{p,n-1}(t_{n-1})\). With this formulation, computing the time step becomes the problem of constructing \(Z_n\) from \(Z_{n-1}\). This computation is done by forming the predicted array, \(Z_{n(0)}\), with columns \(\frac{1}{j!} h^{j}_{n} \pi^{(j)}_{p,n-1}(t_n)\) and computing the coefficients \(l = [l_0,\, l_1,\, \ldots,\, l_q]\) such that
where \(\Delta_n\) is the correction to the predicted solution, i.e.,
From the second column of (7) we obtain the nonlinear system for computing \(y_n\),
where \(\gamma = h_n / l_1\) and \(a_n = y_{n(0)} - \gamma \dot{y}_{n(0)}\).
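Given these definitions, the system takes the form \(y_n - \gamma f(t_n, y_n) = a_n\) and can be solved with a Newton iteration started from the predicted value. The following is a minimal scalar sketch, not any particular solver's API; the names `solve_corrector`, `f`, and `df` are illustrative:

```python
def solve_corrector(f, df, a_n, gamma, y0, tol=1e-10, max_iter=20):
    """Newton iteration for the corrector equation y - gamma * f(y) = a_n.

    f, df : right-hand side and its derivative (scalar for clarity).
    y0    : predicted value y_{n(0)}, used as the initial guess.
    """
    y = y0
    for _ in range(max_iter):
        r = y - gamma * f(y) - a_n           # residual
        if abs(r) <= tol:
            break
        y -= r / (1.0 - gamma * df(y))       # Newton update
    return y
```

For a linear right-hand side the iteration converges in a single step, which makes the sketch easy to check by hand.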
Changing the State Size
After completing a step from time \(t_{n-1}\) to \(t_n\) using a method of order \(q\), and before starting the next step from \(t_{n}\) to \(t_{n+1}\) with a method of order \(q'\), the size of the state vector may change. To “resize” the integrator without restarting from first order, we require that the user supply, depending on the method type, either the recent solution history or the recent right-hand side history for the new state size.
Continuing the integration with the updated state size requires constructing a new Nordsieck array, \(Z_n\), given the necessary history to evaluate the appropriate predictor interpolating polynomial and its derivatives at \(t_n\). To compute \(\pi_{p,n}(t_n)\) and \(\pi^{(j)}_{p,n}(t_n)\), we use a Newton interpolating polynomial. Given a set of \(k+1\) data points \(\{x_n,\ldots,x_{n-k}\}\) to interpolate and a corresponding set of times, \(\{t_n,\ldots,t_{n-k}\}\), the \(k\)-th degree polynomial is given by
The polynomial coefficients, \(c_j\), are given by the divided differences,
where \([x_n,\ldots,x_{n-j}]\) is defined recursively as
with \([x_i] = x_i\). The basis polynomials, \(N_j(t)\), are given by
for \(j > 0\) with \(N_0(t) = 1\).
Organizing the divided differences in a table illustrates the dependencies in the recursive computation. For example with \(k = 3\), we need to compute \(c_0 = [x_n]\), \(c_1 = [x_n,x_{n-1}]\), \(c_2 = [x_n,x_{n-1},x_{n-2}]\), and \(c_3 = [x_n,x_{n-1},x_{n-2},x_{n-3}]\) which depend on the difference of the entries to the immediate lower and upper left in the table below.
As such, the coefficients can be computed recursively with the following steps:
For \(i = 0,\ldots,k\)
\(c_i = x_{n-i}\)
For \(i = 1,\ldots,k\)
For \(j = k,\ldots,i\)
\(c_j = \frac{c_{j-1} - c_j}{t_{n-j+i} - t_{n-j}}\)
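The in-place recursion above can be sketched directly in code. A minimal illustration (the helper name `divided_differences` is hypothetical; data and times are ordered most recent first, so index \(i\) corresponds to \(x_{n-i}\)):

```python
def divided_differences(ts, xs):
    """In-place Newton divided differences.

    ts, xs : nodes and data, most recent first (ts[i] = t_{n-i}).  On
    return, c[j] holds the divided difference [x_n, ..., x_{n-j}].
    """
    k = len(xs) - 1
    c = list(xs)
    for i in range(1, k + 1):
        for j in range(k, i - 1, -1):        # sweep bottom-up so c[j-1]
            c[j] = (c[j - 1] - c[j]) / (ts[j - i] - ts[j])
    return c
```

Sweeping \(j\) downward ensures each update reads the previous pass's value of \(c_{j-1}\), matching the table dependencies described above.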
Rewriting the Newton polynomial using nested multiplications,
leads to the following iteration for evaluating \(P(t)\):
Let \(P_k(t) = c_k\)
For \(j = k-1, \ldots, 0\)
\(P_{j}(t) = c_j + (t - t_{n-j}) P_{j+1}(t)\)
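A sketch of this nested (Horner-like) evaluation, assuming coefficients computed as above and nodes ordered most recent first (the helper name `newton_eval` is hypothetical):

```python
def newton_eval(ts, c, t):
    """Evaluate the Newton-form polynomial at t via nested multiplication.

    ts : nodes, most recent first (ts[j] = t_{n-j}).
    c  : divided-difference coefficients.
    """
    k = len(c) - 1
    p = c[k]                                 # P_k(t) = c_k
    for j in range(k - 1, -1, -1):
        p = c[j] + (t - ts[j]) * p           # P_j = c_j + (t - t_{n-j}) P_{j+1}
    return p
```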
Utilizing this recursive relationship for \(P(t)\), we can similarly compute its derivatives. The evaluation of the \(d\)-th derivative is given by the following iteration:
Let \(P_k(t) = c_k\) and \(\dot{P}_k(t) = \ddot{P}_k(t) = \ldots = P^{(d)}_k(t) = 0\)
For \(j = k-1, \ldots, 0\)
\(P_{j}(t) = c_j + (t - t_{n-j}) P_{j+1}(t)\)
\(\dot{P}_{j}(t) = P_{j+1}(t) + (t - t_{n-j}) \dot{P}_{j+1}(t)\)
\(\ddot{P}_{j}(t) = 2\, \dot{P}_{j+1}(t) + (t - t_{n-j}) \ddot{P}_{j+1}(t)\)
\(\vdots\)
\(P^{(d)}_{j}(t) = d\, P^{(d-1)}_{j+1}(t) + (t - t_{n-j}) P^{(d)}_{j+1}(t)\)
While only \(P^{(d)}_k(t) = 0\) is needed to start the iteration, note that \(P^{(d)}_{k} = P^{(d)}_{k - 1} = \ldots = P^{(d)}_{k - d + 1} = 0\) so some computations can be skipped when \(j > k - d\). With these pieces in place we can perform the necessary evaluations to build \(Z_n\).
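The derivative iteration can be sketched the same way; for simplicity this illustration carries all \(d+1\) values through every step rather than skipping the known-zero entries (the helper name `newton_eval_derivs` is hypothetical):

```python
def newton_eval_derivs(ts, c, t, d):
    """Evaluate the Newton-form polynomial and its first d derivatives at t.

    ts : nodes, most recent first (ts[j] = t_{n-j}).
    c  : divided-difference coefficients.
    Returns [P(t), P'(t), ..., P^(d)(t)].
    """
    k = len(c) - 1
    P = [c[k]] + [0.0] * d                   # P_k = c_k, derivatives zero
    for j in range(k - 1, -1, -1):
        new = [c[j] + (t - ts[j]) * P[0]]
        for m in range(1, d + 1):
            new.append(m * P[m - 1] + (t - ts[j]) * P[m])
        P = new
    return P
```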
For Adams methods we require \(y_n\) and \(q'\) right-hand side values, \(f_{n-j}\) for \(j = 0,1,\ldots,q'-1\). The first two columns of \(Z_n\) are simply \(y_n\) and \(h_n f_n\). To evaluate the higher-order derivatives that fill the remaining \(q' - 1\) columns of \(Z_n\), we use \(P(t) = \dot{\pi}_{p,n}\) instead of \(\pi_{p,n}\): the term interpolating \(y_n\) vanishes when taking the first derivative, and we already have the data needed for the first column of \(Z_n\). In this case, \(P(t)\) interpolates the \(q'\) right-hand side values, i.e., \(x_{n-j} = f_{n-j}\) above for \(j = 0,1,\ldots,q'-1 = k\). For example, with \(q' = k + 1 = 3\), the table of divided differences is
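A minimal scalar sketch of this construction for Adams methods, combining the divided-difference and derivative-evaluation iterations above (the helper name `adams_nordsieck` is hypothetical):

```python
from math import factorial

def adams_nordsieck(y_n, f_hist, t_hist, h):
    """Build scalar Nordsieck columns for an Adams method of order
    q' = len(f_hist), given the right-hand side history (sketch).

    f_hist, t_hist : values and times, most recent first (f_hist[j] = f_{n-j}).
    P(t) interpolates the f values, so y^(j)(t_n) is approximated by
    P^{(j-1)}(t_n) and column j of Z_n is (h^j / j!) y^(j)(t_n).
    """
    k = len(f_hist) - 1
    t = t_hist[0]
    # in-place divided differences of the f data
    c = list(f_hist)
    for i in range(1, k + 1):
        for j in range(k, i - 1, -1):
            c[j] = (c[j - 1] - c[j]) / (t_hist[j - i] - t_hist[j])
    # nested evaluation of P and its first k derivatives at t_n
    P = [c[k]] + [0.0] * k
    for j in range(k - 1, -1, -1):
        new = [c[j] + (t - t_hist[j]) * P[0]]
        for m in range(1, k + 1):
            new.append(m * P[m - 1] + (t - t_hist[j]) * P[m])
        P = new
    Z = [y_n, h * f_hist[0]]                 # columns 0 and 1
    for j in range(2, k + 2):                # columns 2 .. q'
        Z.append(h ** j / factorial(j) * P[j - 1])
    return Z
```

Because the predictor is exact for polynomial data, a cubic \(y(t) = t^3\) with \(f = \dot{y} = 3t^2\) recovers the exact scaled derivatives, which makes the sketch easy to verify.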
With BDF methods we need \(f_n\) and \(q'\) solution values, \(y_{n-j}\) for \(j = 0,1,\ldots,q'-1\). Again, the first two columns of \(Z_n\) are simply \(y_n\) and \(h_n f_n\). To evaluate the higher-order derivatives that fill the remaining \(q' - 1\) columns of \(Z_n\), we need a slight modification of the Newton polynomial evaluation procedure described above to build a Hermite polynomial that incorporates the derivative interpolation condition, \(\dot{P}(t_n) = f_n\). In this case we duplicate the data point at \(t_n\) and replace the divided difference \([y_n, y_n]\) with the corresponding derivative value, \(f_n\). For example, with \(q' = k = 3\), the table of divided differences is
Other than this adjustment in computing the \(c_1\) polynomial coefficient, the iterations to evaluate \(P(t_n)\) and \(P^{(d)}(t_n)\) to fill the remaining columns of \(Z_n\) remain the same.
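A corresponding scalar sketch for the BDF case, duplicating the node at \(t_n\) and substituting \(f_n\) for the \([y_n, y_n]\) divided difference (the helper name `bdf_nordsieck` is hypothetical):

```python
from math import factorial

def bdf_nordsieck(y_hist, t_hist, f_n, h):
    """Build scalar Nordsieck columns for a BDF method of order
    q' = len(y_hist), given the solution history and f_n (sketch).

    The node t_n is duplicated and the divided difference [y_n, y_n] is
    replaced by f_n, imposing the Hermite condition P'(t_n) = f_n.
    """
    ts = [t_hist[0]] + list(t_hist)          # duplicated node at t_n
    c = [y_hist[0]] + list(y_hist)
    k = len(c) - 1
    for i in range(1, k + 1):
        for j in range(k, i - 1, -1):
            if i == 1 and j == 1:
                c[1] = f_n                   # [y_n, y_n] -> f_n (0/0 slot)
            else:
                c[j] = (c[j - 1] - c[j]) / (ts[j - i] - ts[j])
    # nested evaluation of P and its first k derivatives at t_n
    t = ts[0]
    P = [c[k]] + [0.0] * k
    for j in range(k - 1, -1, -1):
        new = [c[j] + (t - ts[j]) * P[0]]
        for m in range(1, k + 1):
            new.append(m * P[m - 1] + (t - ts[j]) * P[m])
        P = new
    Z = [y_hist[0], h * f_n]                 # columns 0 and 1
    for j in range(2, len(y_hist) + 1):      # columns 2 .. q'
        Z.append(h ** j / factorial(j) * P[j])
    return Z
```

Here \(P\) interpolates the solution values directly, so column \(j\) uses \(P^{(j)}(t_n)\) rather than the \(P^{(j-1)}(t_n)\) of the Adams case.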
Note
With both Adams and BDF methods, the second entry of \(Z_n\) uses the input value of \(f_n\). This differs from a step without resizing, where the entry corresponding to \(f_n\) is obtained from the correction to the predicted Nordsieck array rather than from an evaluation of the right-hand side function at \(y_n\).
Additionally, when the method order is increased in the next step, we use the provided history to directly construct \(Z_n\) at the new order instead of adjusting the order using the correction vector.
Beyond building \(Z_n\), we also need to compute a resized correction vector for the just-completed step, \(\Delta_n\), in order to test for a potential order increase in the next step. This is done by computing a resized prediction, \(y_{n(0)}\), for the just-completed step and subtracting it from the resized solution, \(y_n\). Depending on the method order for the next step, \(q'\), this computation requires one (if \(q' = q+1\)), two (if \(q' = q\)), or three (if \(q' = q-1\)) additional data points beyond what is needed to construct \(Z_n\). For Adams methods, computing the prediction requires resized versions of \(y_{n-1}\) (always), \(f_{n-q}\) (if \(q' = q\) or \(q' = q-1\)), and \(f_{n-q-1}\) (if \(q' = q-1\)). Similarly for BDF methods, computing the prediction requires resized versions of \(f_{n-1}\) (always), \(y_{n-q}\) (if \(q' = q\) or \(q' = q-1\)), and \(y_{n-q-1}\) (if \(q' = q-1\)).