Commit 40598751 by Conor McCoid

Extrap: overview of how to get to the Shanks transformation, Sidi's (b) process

parent 41c59b00
@@ -357,4 +357,63 @@

Let $\vec{r}(\vec{x}_i) = \vec{x}_{i+1} - \vec{x}_i$ be the residual function. Then MPE may be expressed as equation (\ref{eq:uqn}) with $\fxi{i} = \vec{r}(\vec{x}_i)$ and $\vec{v}_i = \vec{r}(\vec{x}_{n+i-1})$. RRE and MMPE may also be expressed as such, using $\vec{v}_i = \vec{r}(\vec{x}_{n+i}) - \vec{r}(\vec{x}_{n+i-1})$ and $\vec{v}_i$ some fixed vector independent of $\vec{r}(\vec{x})$, respectively.

\section{Generalized Shanks Transformation}

Suppose we have a sequence $\set{x_n}$ that tends towards $s$ as
$$x_n \sim s + \sum_{i=1}^\infty a_i \lambda_i^n.$$
Then we say $s_{n,k}$ is an approximation of $s$ such that
\begin{equation*}
x_{n+j} = s_{n,k} + \sum_{i=1}^k a_i \lambda_i^{n+j}
\end{equation*}
for all $0 \leq j \leq 2k$. This gives $2k+1$ equations to solve for the $2k+1$ unknowns ($s_{n,k}$, $a_i$ and $\lambda_i$).

We know that the final result will be a weighted sum of the $x_{n+j}$. Suppose the solution is
\begin{align*}
s_{n,k} = & \sum_{j=0}^m \gamma_j x_{n+j} \\
= & \sum_{j=0}^m \gamma_j \left ( s_{n,k} + \sum_{i=1}^k a_i \lambda_i^{n+j} \right ) \\
= & \left ( \sum_{j=0}^m \gamma_j \right ) s_{n,k} + \sum_{j=0}^m \gamma_j \sum_{i=1}^k a_i \lambda_i^{n+j}.
\end{align*}
Naturally this means $\sum_{j=0}^m \gamma_j = 1$. Suppose also that
\begin{equation*}
s_{n,k} = \sum_{j=0}^m \gamma_j x_{n+j+i}
\end{equation*}
for $0 \leq i \leq 2k-m$. Then
\begin{align*}
s_{n,k} - s_{n,k} = & \sum_{j=0}^m \gamma_j \left ( x_{n+j+i+1} - x_{n+j+i} \right ) \\
0 = & \sum_{j=0}^m \gamma_j r_{n+j+i}
\end{align*}
for $0 \leq i \leq 2k-m-1$, where $r_n = x_{n+1} - x_n$. This gives the following linear system:
\begin{equation*}
\begin{bmatrix} 1 & \dots & 1 \\ r_n & \dots & r_{n+m} \\ \vdots & & \vdots \\ r_{n+2k-m-1} & \dots & r_{n+2k-1} \end{bmatrix}
\begin{bmatrix} \gamma_0 \\ \vdots \\ \gamma_m \end{bmatrix}
= \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.
\end{equation*}
For this system to be square, and hence generically solvable, one requires that $2k-m = m$, or $m=k$.
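As a sanity check on the derivation above, here is a minimal NumPy sketch (the function name `shanks` and the array layout are mine, not from the text) that assembles the $(k+1)\times(k+1)$ system with the row of ones and the residual rows, solves for $\gamma$, and forms $s_{n,k} = \sum_{j=0}^k \gamma_j x_{n+j}$. For a sequence built from exactly $k$ geometric modes, the transform should recover $s$ to machine precision.

```python
import numpy as np

def shanks(x, n, k):
    """Generalized Shanks transform s_{n,k} of the sequence x.

    Solves [1 ... 1; r_n ... r_{n+k}; ...; r_{n+k-1} ... r_{n+2k-1}] gamma = e_1
    and returns sum_j gamma_j * x_{n+j}.  Requires x[n], ..., x[n+2k].
    """
    x = np.asarray(x, dtype=float)
    r = x[1:] - x[:-1]                       # residuals r_i = x_{i+1} - x_i
    A = np.ones((k + 1, k + 1))              # first row: the normalization sum_j gamma_j = 1
    for i in range(k):                       # residual rows: r_{n+i}, ..., r_{n+i+k}
        A[i + 1] = r[n + i : n + i + k + 1]
    e1 = np.zeros(k + 1)
    e1[0] = 1.0
    gamma = np.linalg.solve(A, e1)
    return gamma @ x[n : n + k + 1]

# Sequence with exactly two geometric modes: s = 2, lambda_1 = 0.5, lambda_2 = -0.3
x = [2 + 0.5**n + 0.7 * (-0.3)**n for n in range(6)]
print(shanks(x, 0, 2))                       # recovers s = 2 up to rounding
```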
By Cramer's rule, the solution is
\begin{align*}
\gamma_i = & \frac{\vmat{ 1 & \dots & 1 & 1 & 1 & \dots & 1 \\ r_n & \dots & r_{n+i-1} & 0 & r_{n+i+1} & \dots & r_{n+k} \\ \vdots & & \vdots & \vdots & \vdots & & \vdots \\ r_{n+k-1} & \dots & r_{n+i+k-2} & 0 & r_{n+i+k} & \dots & r_{n+2k-1} }}{\vmat{ 1 & \dots & 1 \\ r_n & \dots & r_{n+k} \\ \vdots & & \vdots \\ r_{n+k-1} & \dots & r_{n+2k-1} }} \\
= & (-1)^i \frac{\vmat{ r_n & \dots & r_{n+i-1} & r_{n+i+1} & \dots & r_{n+k} \\ \vdots & & \vdots & \vdots & & \vdots \\ r_{n+k-1} & \dots & r_{n+i+k-2} & r_{n+i+k} & \dots & r_{n+2k-1} }}{\vmat{ 1 & \dots & 1 \\ r_n & \dots & r_{n+k} \\ \vdots & & \vdots \\ r_{n+k-1} & \dots & r_{n+2k-1} }},
\end{align*}
where the second form follows from expanding the numerator along its $i$th column. Substituting into $s_{n,k} = \sum_{j=0}^k \gamma_j x_{n+j}$ and recognizing a cofactor expansion along the first row gives
\begin{equation*}
s_{n,k} = \frac{\vmat{ x_n & \dots & x_{n+k} \\ r_n & \dots & r_{n+k} \\ \vdots & & \vdots \\ r_{n+k-1} & \dots & r_{n+2k-1} }}{\vmat{ 1 & \dots & 1 \\ r_n & \dots & r_{n+k} \\ \vdots & & \vdots \\ r_{n+k-1} & \dots & r_{n+2k-1} }}.
\end{equation*}
\end{document}
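The determinant ratio can also be checked numerically. A short NumPy sketch (the function name `shanks_det` is mine): for $k=1$ the ratio reduces to Aitken's $\Delta^2$ formula, which is exact on a single geometric mode $x_n = s + a\lambda^n$.

```python
import numpy as np

def shanks_det(x, n, k):
    """s_{n,k} as the ratio of two (k+1)x(k+1) determinants.

    Numerator first row: x_n ... x_{n+k}; denominator first row: 1 ... 1;
    both share the residual rows r_{n+i} ... r_{n+i+k}, i = 0, ..., k-1.
    """
    x = np.asarray(x, dtype=float)
    r = x[1:] - x[:-1]
    num = np.empty((k + 1, k + 1))
    den = np.empty((k + 1, k + 1))
    num[0] = x[n : n + k + 1]
    den[0] = 1.0
    for i in range(k):
        num[i + 1] = den[i + 1] = r[n + i : n + i + k + 1]
    return np.linalg.det(num) / np.linalg.det(den)

# Single geometric mode: s = 2, a = 3, lambda = 0.8; k = 1 is Aitken's Delta^2
x = [2 + 3 * 0.8**n for n in range(4)]
print(shanks_det(x, 0, 1))                   # recovers s = 2 up to rounding
```

In practice one solves the linear system rather than forming determinants, but the ratio form makes the connection to the classical Shanks transformation explicit.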