SciPy odeint with banded Jacobian matrix

I am integrating a stiff ODE system using SciPy's scipy.integrate.odeint function. Since the integration is non-trivial and time-consuming, I also supply the corresponding Jacobian. By rearranging the equations, I can make the Jacobian a banded matrix. Following the API documentation, I would like to describe its shape with the mu and ml parameters. Unfortunately, the documentation is ambiguous, so I couldn't figure out how to implement my Jacobian function.

To test how odeint should be called, I used the following (somewhat silly) code:

import numpy as np
from scipy.integrate import odeint

lmax = 5

def f1(y, t):
    # right-hand side: dy_i/dt = y_i**2 - y_i**3
    ydot = np.zeros(lmax)
    for i in range(lmax):
        ydot[i] = y[i]**2 - y[i]**3
    return ydot

def fulljac(y, t):
    # full N x N Jacobian; only the diagonal is non-zero
    J = np.zeros((lmax, lmax))
    J[0,0] = -3*y[0]**2 + 2*y[0]
    J[1,1] = -3*y[1]**2 + 2*y[1]
    J[2,2] = -3*y[2]**2 + 2*y[2]
    J[3,3] = -3*y[3]**2 + 2*y[3]
    J[4,4] = -3*y[4]**2 + 2*y[4]
    return J

## initial conditions and output times
delta = 0.0001
yini  = np.array([delta]*lmax)
times = np.linspace(0, 2/delta, 100)

y, infodict = odeint(f1, yini, times, Dfun=fulljac, full_output=1)
print("f1: nst: {0}, nfe: {1}, nje: {2}".format(infodict["nst"][-1],
                                                infodict["nfe"][-1],
                                                infodict["nje"][-1]))


Using the full NxN Jacobian matrix, the integration is successful. Using only the diagonal with mu = 0 and ml = 0, the integration succeeds as well.
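For reference, here is a minimal sketch of that diagonal-only variant (my reconstruction, not the original code; for mu = 0 and ml = 0 the banded Jacobian is a single row holding the main diagonal):

def diagjac(y, t):
    # banded Jacobian with mu=0, ml=0: one row containing the main diagonal
    J = np.zeros((1, lmax))
    for i in range(lmax):
        J[0,i] = -3*y[i]**2 + 2*y[i]
    return J

y, infodict = odeint(f1, yini, times, Dfun=diagjac, full_output=1, mu=0, ml=0)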

To test the banded use case, I created an artificial 3xN banded Jacobian with mu = 1 and ml = 1, where all derivatives off the diagonal are zero. This causes strange behavior of the solver (similar to what I see in my original problem, where the off-diagonal entries are not zero).

def bandjac(y, t):
    # 3 x N banded Jacobian: row 0 = upper diagonal, row 1 = main diagonal,
    # row 2 = lower diagonal (here the off-diagonal rows stay zero)
    J = np.zeros((3, lmax))
    J[1,0] = -3*y[0]**2 + 2*y[0]
    J[1,1] = -3*y[1]**2 + 2*y[1]
    J[1,2] = -3*y[2]**2 + 2*y[2]
    J[1,3] = -3*y[3]**2 + 2*y[3]
    J[1,4] = -3*y[4]**2 + 2*y[4]
    return J

y, infodict = odeint(f1, yini, times, Dfun=bandjac, full_output=1, mu=1, ml=1)
print("f1: nst: {0}, nfe: {1}, nje: {2}".format(infodict["nst"][-1],
                                                infodict["nfe"][-1],
                                                infodict["nje"][-1]))


What is the correct way to use the banded Jacobian option of scipy.integrate.odeint?

2 answers


For completeness, I'm answering my own question.

As Warren Weckesser pointed out, there is a bug in SciPy < 0.14.0 in the way odeint handles banded Jacobians.

The current documentation of odeint states:

Dfun should return a matrix whose rows contain the non-zero bands (starting with the lowest diagonal).

I think this is wrong: the rows should start with the highest diagonal. Concretely, entry J[i,j] of the full Jacobian goes into row mu + i - j, column j, of the returned array, so row 0 holds the highest superdiagonal and the last row holds the lowest subdiagonal.

The following code snippet shows how Dfun should return the Jacobian (adapted from the test_integrate.py unit tests):

import numpy as np
from scipy.integrate import odeint

def func(y, t, c):
    return c.dot(y)

def jac(y, t, c):
    # full 4 x 4 Jacobian
    return c

def bjac_rows(y, t, c):
    # banded Jacobian: one row per diagonal, highest (upper) diagonal first
    return np.array([[   0,    75,       1,  0.2],  # mu - upper diagonal
                     [ -50,  -0.1,  -1e-04,  -40],  # main diagonal
                     [ 0.2,   0.1,    -0.2,    0]]) # ml - lower diagonal

c = np.array([[-50,   75,     0,    0],
              [0.2, -0.1,     1,    0],
              [  0,  0.1, -1e-4,  0.2],
              [  0,    0,  -0.2,  -40]])

y0 = np.arange(4.0)

t = np.linspace(0, 50, 6)

# using the full Jacobian
sol0, info0 = odeint(func, y0, t, args=(c,), full_output=True, Dfun=jac)
print("full: nst: {0}, nfe: {1}, nje: {2}".format(info0["nst"][-1],
                                                  info0["nfe"][-1],
                                                  info0["nje"][-1]))

# using the row-based banded Jacobian
sol2, info2 = odeint(func, y0, t, args=(c,), full_output=True, Dfun=bjac_rows, ml=1, mu=1)
print("banded: nst: {0}, nfe: {1}, nje: {2}".format(info2["nst"][-1],
                                                    info2["nfe"][-1],
                                                    info2["nje"][-1]))


Note: returning the transposed banded matrix together with col_deriv=True does not work.
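To make that note concrete, this is my reading of the variant that misbehaves (bjac_cols is a hypothetical name, not from the original answer):

def bjac_cols(y, t, c):
    # transpose of the banded rows; per the note above, this does NOT work
    return bjac_rows(y, t, c).T

# sol, info = odeint(func, y0, t, args=(c,), full_output=True,
#                    Dfun=bjac_cols, ml=1, mu=1, col_deriv=True)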



I wrote a function that converts a Jacobian matrix to the banded form expected by odeint and also returns the mu and ml parameters. The input is expected to be an np.array. For best performance, you probably want to convert the resulting list to an np.array, have your Jacobian function return its transpose, and use col_deriv=True. I think BrainGrylls is correct that the highest diagonal comes first, contrary to the current API documentation.



def make_banded_jacobian(matrix):
    '''Return a banded Jacobian list (in odeint format), along with the mu and ml parameters.'''

    # first find the values of mu and ml
    dims = matrix.shape[0]
    assert dims == matrix.shape[1]
    mu = 0
    ml = 0

    for row in range(dims):
        for col in range(dims):
            if matrix[row][col] != 0:
                if col > row:
                    mu = max(mu, col - row)
                else:
                    ml = max(ml, row - col)

    banded = []

    # yoffset = -mu is the highest diagonal; yoffset = ml is the lowest
    for yoffset in range(-mu, ml + 1):
        row = []

        for diag in range(dims):
            x_index = diag
            y_index = diag + yoffset

            if y_index < 0 or y_index >= dims:
                row.append(0.0)  # pad entries that fall outside the matrix
            else:
                row.append(matrix[y_index][x_index])

        banded.append(row)

    return (banded, mu, ml)
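A quick usage sketch (my own example, reusing the matrix c from the first answer):

import numpy as np
from scipy.integrate import odeint

c = np.array([[-50,   75,     0,    0],
              [0.2, -0.1,     1,    0],
              [  0,  0.1, -1e-4,  0.2],
              [  0,    0,  -0.2,  -40]])

banded, mu, ml = make_banded_jacobian(c)
banded = np.array(banded)  # convert to np.array once, up front

def func(y, t):
    return c.dot(y)

def bjac(y, t):
    # the Jacobian is constant here, so just return the precomputed band rows
    return banded

t = np.linspace(0, 50, 6)
y0 = np.arange(4.0)
sol, info = odeint(func, y0, t, Dfun=bjac, mu=mu, ml=ml, full_output=True)
print("nst: {0}, nje: {1}".format(info["nst"][-1], info["nje"][-1]))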

