Find the root of a function in a given range

I have a set of functions f_t with multiple roots (actually two). I want to find the "first" root, and doing it with fsolve works great in most cases. The problem is that the two roots converge as t goes to infinity (a simple example of my functions would be f_t(x) = x^2 - 1/t). Thus, the larger t gets, the more errors fsolve makes. Is there a predefined function similar to fsolve to which I can tell that it should only look in a given range (e.g. always find the root in [0, inf))?

The question is basically the same as https://mathematica.stackexchange.com/questions/91784/how-to-find-numerically-all-roots-of-a-function-in-a-given-range?noredirect=1&lq=1 , but the answers there are for Mathematica; I want them for Python.

PS: I could write my own algorithm, but since hand-written ones tend to be slower than the built-in ones, I was hoping to find a built-in function that does the same. In particular, I have already read this post: Find the root of a function in a given interval.
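For reference, a minimal sketch of the kind of call I am currently using (the function and the starting guess are just illustrative):

from scipy.optimize import fsolve

def f(x, t):
    return x**2 - 1.0 / t

t = 1e6          # the two roots +-1/sqrt(t) lie very close together
x0 = 1.0         # a single starting guess; there is no way to tell
                 # fsolve to restrict its search to [0, inf)
root = fsolve(f, x0, args=(t,))[0]
print(root)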



3 answers


For smooth, well-behaved functions, Brent's method is generally considered the fastest root-finding method that is guaranteed to find a root. As with the other bracketing methods, you must provide an interval [a, b] on which the function is continuous and changes sign.

The SciPy implementation, scipy.optimize.brentq, is documented here. Example usage for the function from the question might look like this:



from __future__ import division  # only needed on Python 2
import scipy.optimize

def func(x, t):
    return x**2 - 1/t

t0 = 1
a = 0       # lower end of the bracket
b = 100000  # set b to some sufficiently large value

# args supplies any extra arguments of the function
# besides the variable that is solved for
root = scipy.optimize.brentq(func, a, b, args=(t0,))



You can use scipy.optimize.bisect, which takes two parameters a and b that specify the starting interval (a minimal sketch follows the list below). However, there are several limitations:

  • The interval must be finite. You cannot search in [0, inf].
  • The function must change sign at the root (f(a) and f(b) must have opposite signs), so, for example, you cannot find the root of f(x) = abs(x) (even though it is a "root" in the mathematical sense). It also won't work for f(x) = x**2 - 1 on an interval [a, b] with a < -1 and b > 1.
  • The method is not gradient-based. This can be an advantage if the function is very irregular or expensive to evaluate, but it can be slower for other functions.
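As a minimal sketch for the example function from the question (the bracket [0, 1] is an arbitrary choice that contains the positive root):

from scipy.optimize import bisect

def func(x, t):
    return x**2 - 1.0 / t

t = 5000
# the bracket must be finite and func must change sign on it:
# func(0, t) < 0 and func(1, t) > 0
root = bisect(func, 0, 1, args=(t,))
print(root)  # approximately 0.01414, i.e. 1/sqrt(t)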


An alternative is scipy.optimize.minimize, used to minimize abs(f(x)). This function can accept bounds that include infinity. However, the minimization can end up in a local minimum of the function that is not a root.
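A minimal sketch of that approach for the same example function (the starting guess and the choice of the L-BFGS-B method are arbitrary):

from scipy.optimize import minimize

def func(x, t):
    return x**2 - 1.0 / t

t = 5000
res = minimize(lambda x: abs(func(x[0], t)), x0=[0.5],
               bounds=[(0, None)],  # None means +infinity
               method='L-BFGS-B')
print(res.x[0])  # close to 1/sqrt(t) if the minimizer reached the root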



Classically, you can use root:

import numpy as np
from scipy.optimize import root

def func(x, t):
    return x ** 2 - 1. / t

t = 5000

res = root(func, 0.5, args=(t, )).x[0]
print(res)

This will print the positive root, in this case 0.0141421356237.

If you want to specify a range and find all roots in that interval, you can use chebpy:

from chebpy import chebfun

x = chebfun('x', [-100000, 100000])
t = 5000
f = x ** 2 - 1. / t

rts = f.roots()
print(rts)

This will print both the positive and negative roots, in this case

[-0.01413648  0.01413648]

If you only want to search in the positive range, you can change

x = chebfun('x', [-100000, 100000])

to

x = chebfun('x', [0, 100000])

I am not sure how to use infinity here, but for practical purposes you can just use a sufficiently large number.







