Root-Finding Methods in Python
Introduction
A numerical root-finding algorithm iteratively computes better and better approximations of the zeros, also called "roots", of a continuous function.
This article presents the theory behind three standard root-finding algorithms and their implementation in Python from scratch.

Take Equation 1 as an example:

Figure 1 is a plot of Equation 1 over the interval [-1, 5]. A zero or root is any value of x at which f(x) equals 0, i.e. the location where the curve crosses the x-axis. In this case, a root lies somewhere between x₀ = 1 and x₁ = 2.
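For concreteness, the code sketches below assume Equation 1 is f(x) = x³ − x − 1, which is consistent with Figure 1 and with the root near x ≈ 1.3247 found later in the article.

```python
# Assumed form of Equation 1: f(x) = x^3 - x - 1, with a real root near x ≈ 1.3247
def f(x):
    return x**3 - x - 1
```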

The objective of this article is to approximate that root in Python using numerical methods.
Three methods to achieve this objective are:
- Bisection Method
- Newton’s Method
- Secant Method
Bisection Method
The bisection method approximates the roots of continuous functions by repeatedly dividing the interval at midpoints.
The technique applies when two values of x at which f(x) has opposite signs are known.
By the Intermediate Value Theorem, if f(x₀) and f(x₁) have opposite signs, i.e. f(x₀)f(x₁) < 0, then f(x) has at least one root on the interval [x₀, x₁].
Review this Intermediate Value Theorem article if necessary.
Let x₀ and x₁ be the start and end points of the bracketing interval. Then use Equation 2, x₂ = (x₀ + x₁)/2, to calculate the midpoint x₂ of [x₀, x₁].

The next interval is either [x₀, x₂] or [x₂, x₁], whichever still brackets the root, and the next guess is its midpoint. Because the interval only halves at each step, convergence is slow.
Gist 1 gives an example implementation of the bisection algorithm.
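A minimal sketch of such an implementation, assuming Equation 1 is f(x) = x³ − x − 1 as above, might look like this:

```python
def bisection(f, x0, x1, tol=1e-9, max_iter=100):
    """Approximate a root of f on [x0, x1] by repeated halving."""
    if f(x0) * f(x1) >= 0:
        raise ValueError("f(x0) and f(x1) must have opposite signs")
    for _ in range(max_iter):
        x2 = (x0 + x1) / 2  # Equation 2: midpoint of the current interval
        if abs(f(x2)) < tol or (x1 - x0) / 2 < tol:
            break
        # Keep the half whose endpoints still have opposite signs
        if f(x0) * f(x2) < 0:
            x1 = x2
        else:
            x0 = x2
    return x2


print(bisection(lambda x: x**3 - x - 1, 1, 2))  # ≈ 1.3247
```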
Storing the successive root approximations and plotting Figure 2 illustrates the iterative nature of the algorithm.

Running the bisection method on Equation 1, specifying an interval of [1, 2], returns a solution at x ≈ 1.324718475. Inspection of Figure 1 confirms that this value is in line with expectations.
The procedure took around 18 iterations to achieve this result, with the estimate improving only gradually at each step.
A final note on this algorithm: Equation 3 can be used to estimate the minimum number of iterations required to achieve a specific error bound.

x₀ and x₁ are the start and end points of the interval, for example [1, 2].
Gist 2 implements this equation in Python.
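As a sketch, assuming Equation 3 is the standard bisection bound n ≥ log₂((x₁ − x₀) / ε) for an error tolerance ε, the calculation could look like this:

```python
import math


def min_bisection_iterations(x0, x1, eps):
    """Minimum number of halvings needed to shrink [x0, x1] below eps."""
    return math.ceil(math.log2((x1 - x0) / eps))


print(min_bisection_iterations(1, 2, 1e-6))  # 20 iterations
```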
Newton’s Method
Possibly the most well-known root-finding algorithm, Newton’s method approximates the zeros of real-valued continuous functions. Starting with an initial guess of the solution, Equation 4, xₙ₊₁ = xₙ − f(xₙ)/f'(xₙ), iteratively improves the approximation using the values of the function and its derivative at xₙ.
- n: iteration counter
- xₙ₊₁: next estimate
- xₙ: current best estimate
- f(xₙ): function evaluated at xₙ
- f'(xₙ): derivative of f at xₙ

Clearly, this procedure requires the first derivative of f(x), and therefore f(x) must be differentiable.
Gist 3 provides the Python code to implement an iterative solution for Newton’s method. It uses the SymPy library to evaluate f'(xₙ). On each pass through the loop, the current values are substituted into Equation 4 to determine xₙ₊₁.
A tolerance limit controls when the loop terminates, with the stopping condition triggered once f(xₙ) is sufficiently close to zero.
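A minimal sketch along those lines, again assuming Equation 1 is f(x) = x³ − x − 1 and using SymPy only to obtain the derivative, might be:

```python
import sympy as sp


def newton(expr, x_sym, x_n, tol=1e-5, max_iter=50):
    """Iterate Equation 4, x_{n+1} = x_n - f(x_n)/f'(x_n), until |f(x_n)| < tol."""
    f = sp.lambdify(x_sym, expr)
    f_prime = sp.lambdify(x_sym, sp.diff(expr, x_sym))  # symbolic derivative of f
    for _ in range(max_iter):
        if abs(f(x_n)) < tol:  # tolerance check on f(x_n)
            break
        x_n = x_n - f(x_n) / f_prime(x_n)  # Equation 4 update
    return x_n


x = sp.symbols("x")
print(newton(x**3 - x - 1, x, 9))  # ≈ 1.32472 from an initial guess of 9
```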
As above, saving each guess and plotting Figure 3 shows how Newton’s method approaches the correct solution.

With an initial guess of x = 9, this method returns a root of f(x) = 0 at x = 1.324718834. It takes 8 iterations to reach an accuracy of 1e-5.
Newton’s method converges much faster than the bisection method, but it has limitations that depend on the derivative of the function in question.
Secant Method
The secant method is very similar to the bisection method, except that instead of halving the interval at its midpoint, it takes the next estimate where the secant line connecting the two endpoints crosses the x-axis.
The secant recurrence, xₙ₊₁ = xₙ − f(xₙ)(xₙ − xₙ₋₁) / (f(xₙ) − f(xₙ₋₁)), can be implemented in Python either iteratively or recursively.

The Python method in Gist 4 recursively solves for roots using the Secant method.
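A recursive sketch of that approach, once more assuming Equation 1 is f(x) = x³ − x − 1:

```python
def secant(f, x0, x1, tol=1e-5):
    """Recursively apply the secant update until |f(x1)| is within tol."""
    if abs(f(x1)) < tol:
        return x1
    # x-intercept of the secant line through (x0, f(x0)) and (x1, f(x1))
    x2 = x1 - f(x1) * (x1 - x0) / (f(x1) - f(x0))
    return secant(f, x1, x2, tol)


print(secant(lambda x: x**3 - x - 1, 1, 2))  # ≈ 1.32472
```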
Starting with initial values of x₀ = 1 and x₁ = 2, the procedure outputs a root of f(x) at x = 1.324717957, within 8 recursive calls. Figure 4 visualises the secant lines generated on each pass.

The secant method generally converges more slowly than Newton’s method. It is most applicable in situations that require faster convergence than bisection but where it is too difficult or impossible to take an analytical derivative of the function f(x).
Conclusion
Implementing root-finding algorithms is a valuable exercise for engineering and mathematics students. Furthermore, there are many other methods, such as Brent’s or Steffensen’s method, that readers can explore to expand their knowledge and coding skills.
If you’re interested in Python, engineering, and Data Science, please follow and check out my other articles.