 ### Lei Mao

Machine Learning, Artificial Intelligence, Computer Science.

# Newton-Raphson Method

### Introduction

Given a function $f$, sometimes we would like to find one of its roots, i.e., a value of $x$ for which $f(x) = 0$. Computing an analytical root might be difficult, but we can use approximate algorithms to find an approximate root.

In this blog post, I would like to discuss the Newton-Raphson Method, which is commonly used for finding the approximate root of a given function.

### Newton-Raphson Method

The Newton-Raphson method is simple. Given a function $f$, its derivative $f^{\prime}$, and an initial guess $x_0$, the approximate root is refined iteratively:

$x_{1} = x_0 - \frac{f(x_0)}{f^{\prime}(x_0)}$

$x_{n+1} = x_n - \frac{f(x_n)}{f^{\prime}(x_n)}$

To visualize the iterative process: at each step, we follow the tangent line at the current iterate down to the $x$-axis, and the intersection becomes the next iterate. (The animation from the original post is omitted here.)

The derivation of the update function is also simple. At any iteration $n$, the equation of the tangent line to $f$ at $x_n$ is

$y = f^{\prime}(x_n)(x - x_n) + f(x_n)$

When $y = 0$,

$x = x_n - \frac{f(x_n)}{f^{\prime}(x_n)}$
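For instance (a hypothetical example, not from the original post), applying this update repeatedly to $f(x) = x^2 - 2$ with initial guess $x_0 = 1$ converges rapidly to $\sqrt{2}$:

```python
# One Newton-Raphson update: x_{n+1} = x_n - f(x_n) / f'(x_n),
# applied to f(x) = x^2 - 2, whose positive root is sqrt(2) ~= 1.41421356.
def newton_step(f, f_prime, x):
    return x - f(x) / f_prime(x)


f = lambda x: x**2 - 2
f_prime = lambda x: 2 * x

x = 1.0  # initial guess x_0
for _ in range(5):
    x = newton_step(f, f_prime, x)
print(x)  # ~1.414213562373095
```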

### Newton-Raphson Method vs. Binary Search Method

There is another conventional algorithm that could be used for root searching, which is the binary search method. Here, we have implemented both the Newton-Raphson method and the binary search method for finding the root of a specific function.

The Newton-Raphson method and the binary search method have slightly different input signatures, so they suit different scenarios. Although we do not prove it rigorously here, the Newton-Raphson method appears to converge much faster than the binary search method.
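The difference in convergence speed can be illustrated with a quick hypothetical experiment that prints each method's error after every iteration, for the same function $f(x) = x^2 - 4x - 7$ used in this post (its positive root is $2 + \sqrt{11}$):

```python
import math

# Hypothetical experiment: per-iteration error of the Newton-Raphson method
# versus the binary search (bisection) method for f(x) = x^2 - 4x - 7.
f = lambda x: x**2 - 4 * x - 7
f_prime = lambda x: 2 * x - 4
root = 2 + math.sqrt(11)

x = 5.0                    # Newton-Raphson iterate
x_low, x_high = 0.0, 10.0  # binary search bracket
x_mid = (x_low + x_high) / 2
for i in range(1, 6):
    # One Newton-Raphson step.
    x = x - f(x) / f_prime(x)
    # One binary search step: keep the half-bracket containing the sign change.
    x_mid = (x_low + x_high) / 2
    if f(x_low) * f(x_mid) >= 0:
        x_low = x_mid
    else:
        x_high = x_mid
    print(i, abs(x - root), abs(x_mid - root))
```

The Newton-Raphson error roughly squares at each step (quadratic convergence), while each binary search step only halves the bracket (linear convergence).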

```python
# find_root.py
from typing import Callable


def binary_search_approximate_root(f: Callable[[float], float], n: int,
                                   x_low: float, x_high: float) -> float:
    # The bracket [x_low, x_high] must contain a sign change.
    assert f(x_low) * f(x_high) <= 0
    for _ in range(n):
        x_mid = (x_low + x_high) / 2
        if f(x_low) * f(x_mid) >= 0:
            x_low = x_mid
        else:
            x_high = x_mid
    return x_mid


def newton_approximate_root(f: Callable[[float], float],
                            f_prime: Callable[[float], float], n: int,
                            x: float) -> float:
    for _ in range(n):
        x = x - f(x) / f_prime(x)
    return x


def f(x: float) -> float:
    return x**2 - 4 * x - 7


def f_prime(x: float) -> float:
    return 2 * x - 4


newton_root = newton_approximate_root(f=f, f_prime=f_prime, n=4, x=5)
print("Newton approximate root: {}".format(newton_root))
assert abs(f(newton_root)) < 1e-8

bs_root = binary_search_approximate_root(f=f, n=50, x_low=0, x_high=10)
print("Binary search approximate root: {}".format(bs_root))
assert abs(f(bs_root)) < 1e-8
```


To execute the program, we run the following command in the terminal.

```shell
$ python find_root.py
Newton approximate root: 5.3166247903554
Binary search approximate root: 5.3166247903554
```

In order to find one of the roots of $f(x) = x^2 - 4x - 7$, we could see from the example above that, despite the difference in input signatures, it took 4 iterations for the Newton-Raphson method and 50 iterations for the binary search method.

### Newton-Raphson Method for Multiple Variables

```python
# find_multivariate_root.py
from typing import Callable, Tuple


def newton_approximate_root_two_variables(
        f: Callable[[float, float], float],
        f_prime_1: Callable[[float, float], float],
        f_prime_2: Callable[[float, float], float], n: int, x_1: float,
        x_2: float) -> Tuple[float, float]:
    # Update each variable in turn using the corresponding partial derivative.
    for _ in range(n):
        x_1 = x_1 - f(x_1, x_2) / f_prime_1(x_1, x_2)
        x_2 = x_2 - f(x_1, x_2) / f_prime_2(x_1, x_2)
    return x_1, x_2


def g(x_1: float, x_2: float) -> float:
    return x_1**2 - x_1 * x_2 + x_2**2 - 6


def g_prime_1(x_1: float, x_2: float) -> float:
    return 2 * x_1 - x_2


def g_prime_2(x_1: float, x_2: float) -> float:
    return -x_1 + 2 * x_2


newton_root = newton_approximate_root_two_variables(f=g,
                                                    f_prime_1=g_prime_1,
                                                    f_prime_2=g_prime_2,
                                                    n=6,
                                                    x_1=5,
                                                    x_2=5)
print("Newton approximate root: {}".format(newton_root))
assert abs(g(*newton_root)) < 1e-8
```

To execute the program, we run the following command in the terminal.

```shell
$ python find_multivariate_root.py
Newton approximate root: (2.805487102886944, 1.0914051717803375)
```


The Newton-Raphson method still worked very well for finding the root of the two-variable function $f(x_1, x_2) = x_1^2 - x_1 x_2 + x_2^2 - 6$. In principle, it could also be generalized to finding the roots of functions of more variables.
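One hypothetical way to sketch that generalization, following the same coordinate-wise scheme as the two-variable version (rather than the standard Jacobian-based multivariate Newton's method), is to cycle through the variables and update each with its own partial derivative:

```python
from typing import Callable, List


def newton_approximate_root_multi(f: Callable[..., float],
                                  f_primes: List[Callable[..., float]],
                                  n: int, xs: List[float]) -> List[float]:
    # Coordinate-wise Newton-Raphson sketch: update each variable in turn
    # using the corresponding partial derivative, as in the
    # two-variable version.
    xs = list(xs)
    for _ in range(n):
        for i, f_prime in enumerate(f_primes):
            xs[i] = xs[i] - f(*xs) / f_prime(*xs)
    return xs


# Reusing g(x_1, x_2) = x_1^2 - x_1 * x_2 + x_2^2 - 6 from above.
g = lambda x_1, x_2: x_1**2 - x_1 * x_2 + x_2**2 - 6
g_primes = [lambda x_1, x_2: 2 * x_1 - x_2,
            lambda x_1, x_2: -x_1 + 2 * x_2]

root = newton_approximate_root_multi(g, g_primes, n=8, xs=[5.0, 5.0])
print(root)
```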

### Caveats

No algorithm always works, and the Newton-Raphson method is no exception. There are scenarios where the Newton-Raphson method fails. If interested, please check the Wikipedia article on this topic.
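One classic failure mode (a hypothetical example, not from the original post) is a cycle: for $f(x) = x^3 - 2x + 2$ with initial guess $x_0 = 0$, the iterates oscillate between $0$ and $1$ forever, even though $f$ has a real root near $x \approx -1.769$:

```python
# Newton-Raphson cycling: with x_0 = 0, the iterates for
# f(x) = x^3 - 2x + 2 alternate between 0 and 1 and never converge.
f = lambda x: x**3 - 2 * x + 2
f_prime = lambda x: 3 * x**2 - 2

x = 0.0
iterates = []
for _ in range(6):
    x = x - f(x) / f_prime(x)
    iterates.append(x)
print(iterates)  # [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
```

Other well-known failure scenarios include iterates landing where $f^{\prime}(x_n) = 0$ (division by zero) and divergence from a poor initial guess.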