
3.8: Newton's Method

Whereas quadratic equations \(ax^2 + bx + c = 0\) have solutions given by the Quadratic Formula, other types of functions lack such a well-known formula (or any formula at all). How do we approximate the zeros of such a function? In this section we present Newton's Method, in which we use successive tangent lines to estimate the roots of functions.

Figure 1

Suppose that a function \(f\) is differentiable on some interval containing a root \(r.\) Our goal is to estimate the value of \(r\) in the equation \(f(r) = 0.\) We first approximate the root to be some value \(x_1;\) then we construct a tangent line \(\ell_1\) to \(f\) at \((x_1, f(x_1)).\) The idea of Newton's Method is to approximate the curve's \(x\)-intercept to be the tangent line's \(x\)-intercept, denoted \(x_2.\) (Since \(\ell_1\) is a line, its \(x\)-intercept is simple to calculate.) An equation of \(\ell_1\) is, by point-slope form, \[y - f(x_1) = f'(x_1)(x - x_1) \pd\] The coordinates of the \(x\)-intercept are \((x_2, 0),\) which we substitute to get \[ \ba 0 - f(x_1) &= f'(x_1) (x_2 - x_1) \nl \implies x_2 &= x_1 - \frac{f(x_1)}{f'(x_1)} \cma \ea \] assuming \(f'(x_1) \ne 0.\) Thus, \(x_2\) is the next (usually more accurate) approximation of the root \(r.\) (See Figure 1.) Again, at \((x_2, f(x_2))\) we construct another tangent line \(\ell_2\) to \(f,\) whose \(x\)-intercept \(x_3\) is (usually) a better estimate of \(r.\) An equation of \(\ell_2\) is \[y - f(x_2) = f'(x_2) (x - x_2) \pd\] Substituting \((x_3, 0)\) produces \[ \ba 0 - f(x_2) &= f'(x_2) (x_3 - x_2) \nl \implies x_3 &= x_2 - \frac{f(x_2)}{f'(x_2)} \pd \ea \] By repeating this process, we usually attain increasingly accurate approximations of \(r.\) In general, if the \(n\)th approximation of \(r\) is \(x_n\) and \(f'(x_n) \ne 0,\) then the subsequent approximation is given by \begin{equation} x_{n + 1} = x_n - \frac{f(x_n)}{f'(x_n)} \pd \label{eq:newton-n} \end{equation} Mathematically, we say \(x_n \to r\) as \(n \to \infty\) if our approximations \(x_1,\) \(x_2, \dots\) get closer and closer to the true root \(r.\) Then Newton's Method is said to converge. (See Animation 1.)
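The tangent-line construction above is easy to check numerically. The sketch below uses an example of our own choosing, \(f(x) = x^2 - 2\) with root \(r = \sqrt 2,\) and confirms that the \(x\)-intercept of the tangent line at \(x_1\) is the value given by the update formula, landing closer to the root than \(x_1\) does.

```python
def f(x):  return x**2 - 2        # root r = sqrt(2) ~ 1.41421
def fp(x): return 2 * x           # f'(x)

x1 = 1.5                          # starting approximation
# Tangent line l1: y = f(x1) + fp(x1) * (x - x1).  Setting y = 0 and
# solving for x gives the update formula:
x2 = x1 - f(x1) / fp(x1)
print(x2)                         # 1.4166666666666667, closer to sqrt(2) than x1
```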

Animation 1
NEWTON'S METHOD
If \(f\) is differentiable on some interval containing \(r\) and \(f(r) = 0,\) then using Newton's Method with the starting approximation \(r \approx x_1\) and general approximation \(r \approx x_n\) enables us to estimate \(r\) using \begin{equation} x_{n + 1} = x_n - \frac{f(x_n)}{f'(x_n)} \pd \eqlabel{eq:newton-n} \end{equation}

\(\eqrefer{eq:newton-n}\) represents an iterative process because we use the same rule to attain \(x_{n + 1}\) from \(x_n.\) (Each application of Newton's Method is called an iteration.) Thus, it is easy to write a computer script or calculator program to execute Newton's Method.
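Such a script can be quite short. Here is a minimal sketch in Python (the function names are our own, not from the text); it simply applies the update rule a fixed number of times, raising an error in the degenerate case \(f'(x_n) = 0.\)

```python
def newton(f, fprime, x1, iterations):
    """Iterate x_{n+1} = x_n - f(x_n) / f'(x_n), starting from x1."""
    x = x1
    for _ in range(iterations):
        slope = fprime(x)
        if slope == 0:
            # Horizontal tangent line: it never crosses the x-axis.
            raise ZeroDivisionError("f'(x_n) = 0")
        x = x - f(x) / slope
    return x
```

For example, `newton(lambda x: x**2 - 2, lambda x: 2*x, 1.0, 6)` approximates \(\sqrt 2.\)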

EXAMPLE 1
Use three iterations of Newton's Method, starting with \(x_1 = 0,\) to approximate the root of \[f(x) = x^3 + x - 1 \pd\]
Figure 2
Note that \(f'(x) = 3x^2 + 1.\) Using Newton's Method, as in \(\eqref{eq:newton-n},\) with \(n = 1\) gives \[ \ba x_2 &= x_1 - \frac{f(x_1)}{f'(x_1)} \nl &= 0 - \frac{f(0)}{f'(0)} \nl &= 0 - \frac{(0)^3 + (0) - 1}{3(0)^2 + 1} \nl &= 1 \pd \ea \] (Figure 2 shows the tangent line for the first iteration.) Likewise, with \(n = 2\) we find \[ \ba x_3 &= x_2 - \frac{f(x_2)}{f'(x_2)} \nl &= 1 - \frac{f(1)}{f'(1)} \nl &= 1 - \frac{(1)^3 + (1) - 1}{3(1)^2 + 1} \nl &= 0.75 \pd \ea \] Finally, with \(n = 3\) we get \[ \ba x_4 &= x_3 - \frac{f(x_3)}{f'(x_3)} \nl &= 0.75 - \frac{f(0.75)}{f'(0.75)} \nl &= 0.75 - \frac{(0.75)^3 + (0.75) - 1}{3(0.75)^2 + 1} \nl &\approx \boxed{0.68605} \ea \] The true root is approximately \(r = 0.68233;\) Newton's Method came close with just three iterations!
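The three iterations above can be reproduced with a few lines of Python (a sketch of our own, not part of the text):

```python
def f(x):  return x**3 + x - 1
def fp(x): return 3 * x**2 + 1

x = 0.0                        # x_1 = 0
for n in range(1, 4):          # three iterations: n = 1, 2, 3
    x = x - f(x) / fp(x)       # x_2 = 1, x_3 = 0.75, x_4 ~ 0.68605
print(round(x, 5))             # 0.68605
```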

To estimate a root to a desired accuracy threshold, we generally cease using Newton's Method if \(x_n\) and \(x_{n + 1}\) agree to the same desired number of decimal places. (It is unlikely that further iterations will change the approximation significantly.) The following example demonstrates this idea.

EXAMPLE 2
Use Newton's Method to calculate \(\sqrt[5]{2}\) correct to three decimal places.
Note that \(\sqrt[5] 2\) is the zero of \(f(x) = x^5 - 2.\) Differentiating gives \(f'(x) = 5x^4;\) let's begin with \(x_1 = 1.\) So Newton's Method, as in \(\eqref{eq:newton-n},\) gives \[ \ba x_2 &= x_1 - \frac{f(x_1)}{f'(x_1)} \nl &= 1 - \frac{f(1)}{f'(1)} \nl &= 1 - \frac{(1)^5 - 2}{5(1)^4} \nl &= 1.2 \pd \ea \] Subsequent iterations give \[ \ba x_3 &= 1.1529012 \cma \nl x_4 &= 1.1487289 \cma \nl x_5 &= 1.1486984 \pd \ea \] Because \(x_4\) and \(x_5\) both agree to three decimal places, we cease the approximations and use \(\sqrt[5]{2} \approx \boxed{1.148}.\)
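The stopping rule in this example can be automated. The sketch below (our own; the tolerance `1e-3` is an assumption standing in for "agree to three decimal places") loops until successive approximations differ by less than the tolerance, reproducing the iterates \(x_2, \dots, x_5\) above.

```python
def f(x):  return x**5 - 2
def fp(x): return 5 * x**4

x, x_next = 1.0, 1.0 - f(1.0) / fp(1.0)   # x_1 = 1, x_2 = 1.2
while abs(x_next - x) >= 1e-3:            # stop once successive values agree
    x, x_next = x_next, x_next - f(x_next) / fp(x_next)
print(x_next)                             # ~ 1.1486984, so 2**(1/5) ~ 1.148
```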
Figure 3

Failure
Unfortunately, Newton's Method does not always work. If the approximations \(x_1,\) \(x_2, \dots\) do not converge to \(r,\) then Newton's Method is said to fail. Clearly, if \(f'(x_n) = 0\) and \(f(x_n) \ne 0,\) then Newton's Method breaks down because the tangent line is horizontal and never crosses the \(x\)-axis. Newton's Method may also fail if a function has multiple roots, since successive iterations may converge to a root other than the desired one. Generally, the method fails or converges very slowly if we pick \(x_1\) such that \(f'(x_1)\) is close to \(0.\) In any of these cases, we should select a better starting approximation \(x_1.\) In Figure 3 the choice of \(x_1\) makes \(f'(x_1)\) close to \(0,\) so the tangent line is nearly horizontal and its \(x\)-intercept lies far from the root; the tangent line therefore poorly models the behavior of \(f.\) Accordingly, \(x_2\) is a worse approximation of the root \(r\) than \(x_1.\) (In fact, \(x_2\) is such a poor approximation that it falls outside the domain of \(f \exclam\)) The following examples reveal additional situations in which Newton's Method fails.
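A small numerical illustration of this kind of failure (an example of our own, not the function in Figure 3): for \(f(x) = \sqrt x - 1,\) whose root is \(r = 1,\) choosing a large \(x_1\) makes \(f'(x_1)\) small, and a single iteration lands outside the domain of \(f.\)

```python
import math

def f(x):  return math.sqrt(x) - 1     # root r = 1; domain x >= 0
def fp(x): return 1 / (2 * math.sqrt(x))

x1 = 9.0                               # fp(9) = 1/6 is small
x2 = x1 - f(x1) / fp(x1)
print(x2)                              # ~ -3: outside the domain, so the method breaks down
```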

EXAMPLE 3
Choosing \(x_1 = 1,\) show that Newton's Method fails to converge to the zero of \[f(x) = x^3 - 2x + 2 \pd\]
Figure 4
We have \(f'(x) = 3x^2 - 2.\) With \(n = 1\) and \(x_1 = 1,\) Newton's Method [as in \(\eqref{eq:newton-n}\)] gives \[ \ba x_2 &= x_1 - \frac{f(x_1)}{f'(x_1)} \nl &= 1 - \frac{f(1)}{f'(1)} \nl &= 1 - \frac{(1)^3 - 2(1) + 2}{3(1)^2 - 2} \nl &= 0 \pd \ea \] Then with \(n = 2\) and \(x_2 = 0,\) we get \[ \ba x_3 &= x_2 - \frac{f(x_2)}{f'(x_2)} \nl &= 0 - \frac{f(0)}{f'(0)} \nl &= 0 - \frac{(0)^3 - 2(0) + 2}{3(0)^2 - 2} \nl &= 1 \pd \ea \] Subsequent iterations give \(x_4 = 0,\) \(x_5 = 1,\) and \(x_6 = 0\)—a repeating pattern, as shown in Figure 4. Because the approximations bounce back and forth, never settling to a final value, Newton's Method fails. (Luckily, choosing a different value of \(x_1,\) such as \(x_1 = 2,\) remedies this issue.)
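The oscillation is easy to see in code (a sketch of our own): the iterates bounce between \(0\) and \(1\) exactly, no matter how many iterations we run.

```python
def f(x):  return x**3 - 2*x + 2
def fp(x): return 3 * x**2 - 2

x = 1.0                        # x_1 = 1
seq = []
for _ in range(6):
    x = x - f(x) / fp(x)
    seq.append(x)
print(seq)                     # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0] -- a repeating cycle
```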
EXAMPLE 4
Show that Newton's Method fails to converge to the zero of \(f(x) = \sqrt[3]{x}.\)
Figure 5
It is clear that the root is \(x = 0.\) But with Newton's Method, observe that \(f'(x) = \tfrac{1}{3} x^{-2/3}\) and thus \[ \ba x_{n + 1} &= x_n - \frac{f(x_n)}{f'(x_n)} \nl &= x_n - \frac{\par{x_n}^{1/3}}{\tfrac{1}{3} \par{x_n}^{-2/3}} \nl &= x_n - 3 x_n \nl &= -2 x_n \pd \ea \] In words, each subsequent approximation of the root is double the magnitude of the preceding estimate. For example, if \(x_1 = 1\) then \(x_2 = -2,\) \(x_3 = 4,\) \(x_4 = -8,\) and \(x_5 = 16.\) Thus, Newton's Method produces approximations that lead away from the true root \(x = 0,\) as shown in Figure 5. This is a famous case in which Newton's Method fails completely.
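The runaway behavior can be verified numerically. In the sketch below (our own; note that `x ** (1/3)` fails for negative `x` in Python, so we build a real cube root by hand), each iterate is roughly \(-2\) times the previous one, up to floating-point rounding.

```python
def cbrt(x):
    # Real cube root: x ** (1/3) raises an error for negative x in Python.
    return abs(x) ** (1 / 3) * (1 if x >= 0 else -1)

x = 1.0                        # x_1 = 1
seq = [x]
for _ in range(4):
    # f(x) = x^(1/3), f'(x) = (1/3) x^(-2/3); the update reduces to x - 3x = -2x
    x = x - cbrt(x) / ((1 / 3) * abs(x) ** (-2 / 3))
    seq.append(x)
print(seq)                     # approximately [1, -2, 4, -8, 16]: leading away from 0
```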

Newton's Method enables us to approximate a root of a function by using successive tangent lines, an iterative process. If \(f\) is differentiable on some interval containing a root \(r,\) and we take the starting approximation \(r \approx x_1\) and general approximation \(r \approx x_n,\) then we can approximate \(r\) using \begin{equation} x_{n + 1} = x_n - \frac{f(x_n)}{f'(x_n)} \pd \eqlabel{eq:newton-n} \end{equation} Newton's Method is said to converge to a root \(r\) if \(x_n \to r\) as \(n \to \infty\)—that is, if the approximations get closer and closer to the true value of \(r.\) In general, we can cease the iterative process if \(x_n\) and \(x_{n + 1}\) both agree to a desired number of decimal places. But Newton's Method fails if the approximations do not approach \(r;\) failure may occur in the following scenarios:

- \(f'(x_n) = 0\) but \(f(x_n) \ne 0,\) so the tangent line at \(x_n\) is horizontal and never crosses the \(x\)-axis.
- \(f\) has multiple roots, and successive iterations converge to a root other than the desired one.
- \(f'(x_1)\) is close to \(0,\) so the method converges very slowly or the approximations lead away from \(r.\)

In any of these cases, try using a better choice of \(x_1.\)