phacoo

Reputation: 23

Newton method in python / scipy

I am trying to find the root of a function using Newton's algorithm in Python. I get a runtime error even if I change the precision level. Can you kindly help me understand how I can improve it?

Best, GB

Below is my 'simple' code and the root-finding part:

from scipy.stats import norm
import numpy as np 
import matplotlib.pyplot as plt  
import scipy.optimize as opt


def vega_callspread(r,S,T,d,sigma,q,K1,K2):

    d1 = (np.log(S/K1)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))
    d2 = (np.log(S/K2)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))

    u1 = S*np.exp(-q*T)*norm.pdf(d1,0,1)
    u2 = S*np.exp(-q*T)*norm.pdf(d2,0,1)

    return u1-u2;

x0=112
r=0
T=1
d=0
sigma=0.2
q=0
K1=110
K2=130




res2= opt.newton(vega_callspread, x0, args=(r,T,d,sigma,q,K1,K2,),tol=10**(-1),maxiter=1000000)


The error I get is:

res2= opt.zeros.newton(vega_callspread, x0, args=(r,T,d,sigma,q,K1,K2,),tol=10**(-1),maxiter=1000000)
/Users/anaconda/lib/python3.5/site-packages/scipy/optimize/zeros.py:173: RuntimeWarning: Tolerance of 0.011300000000005639 reached
  warnings.warn(msg, RuntimeWarning)

Upvotes: 2

Views: 4338

Answers (2)

JeeyCi

Reputation: 597

S is a constant, so give it as a constant (the stock price for the date being analyzed), divided by K (the strike price) - no, you do not divide by zero (as in the marked answer) - that follows from the essence of the field (option pricing). Try changing your tolerance parameter, e.g. to tol=1e-8.
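
Applied to the call from the question, that literal change would look like this (just a sketch of the suggestion, assuming the rest of the setup from the question is unchanged):

# same call as in the question, only with a much tighter tolerance
res2 = opt.newton(vega_callspread, x0, args=(r, T, d, sigma, q, K1, K2), tol=1e-8, maxiter=1000000)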

Upvotes: 0

sascha

Reputation: 33532

It's hard to give advice with such sparse context. But some remarks:

A:

You are not actually using the Newton-Raphson method, as described here:

The Newton-Raphson method is used if the derivative fprime of func is provided, otherwise the secant method is used.
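
To illustrate the difference, a minimal sketch with a toy function (not the one from the question):

import scipy.optimize as opt

# with fprime supplied, opt.newton runs true Newton-Raphson;
# without it, it silently falls back to the secant method
f = lambda x: x**2 - 2.0
fprime = lambda x: 2.0*x

root_newton = opt.newton(f, 1.0, fprime=fprime)  # Newton-Raphson
root_secant = opt.newton(f, 1.0)                 # secant fallback
print(root_newton, root_secant)                  # both ~1.41421356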

B:

Your error comes from here, I would say:

As those tol-values are hardcoded, I suppose this should not happen!

# Secant method
p0 = x0
if x0 >= 0:
    p1 = x0*(1 + 1e-4) + 1e-4
else:
    p1 = x0*(1 + 1e-4) - 1e-4
q0 = func(*((p0,) + args))
q1 = func(*((p1,) + args))
for iter in range(maxiter):
    if q1 == q0:
        if p1 != p0:
            msg = "Tolerance of %s reached" % (p1 - p0)
            warnings.warn(msg, RuntimeWarning)
        return (p1 + p0)/2.0

Meaning: there is probably something wrong with your code!
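
The number in the warning actually fits this code path: with x0 = 112 the two secant start points differ by p1 - p0 = 112*1e-4 + 1e-4 = 0.0113, which is exactly the 0.011300000000005639 from the question, and the message fires because two successive function values were identical (q1 == q0). A minimal sketch that reproduces the same message with a deliberately flat function (newer scipy versions raise a RuntimeError here instead of warning):

import warnings
import scipy.optimize as opt

def flat(x):
    # every evaluation returns the same value, so q1 == q0 on the very first secant step
    return 1.0

try:
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        root = opt.newton(flat, 112)
    print(root, [str(w.message) for w in caught])  # 'Tolerance of 0.0113... reached'
except RuntimeError as err:                        # newer scipy raises instead of warning
    print(err)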

C:

Let's try the slower but safer bisection method:

# brackets not tuned! it's just some trial!
res2 = opt.bisect(vega_callspread, 0, 200, args=(r,T,d,sigma,q,K1,K2))

Output:

X:\so_newton.py:9: RuntimeWarning: divide by zero encountered in log
  d1 = (np.log(S/K1)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))
X:\so_newton.py:10: RuntimeWarning: divide by zero encountered in log
  d2 = (np.log(S/K2)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))

That's a bad sign.

D (follows C):

Your func reads:

def vega_callspread(r,S,T,d,sigma,q,K1,K2):

    d1 = (np.log(S/K1)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))
    d2 = (np.log(S/K2)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))

    u1 = S*np.exp(-q*T)*norm.pdf(d1,0,1)
    u2 = S*np.exp(-q*T)*norm.pdf(d2,0,1)

    return u1-u2;

and you call with:

x0=112
r=0
T=1
d=0
sigma=0.2
q=0
K1=110
K2=130

args=(r,T,d,sigma,q,K1,K2,)

There is no S!

So either you are dangerously renaming variables or there is some error on your end!

The value of S is the problem in log(S/K1) and co.

It's not about S/K1 but about this:

import numpy as np
np.log(0)
# __main__:1: RuntimeWarning: divide by zero encountered in log
# -inf

which I got from this SO answer.

You can try to interpret what this does to your function, as this log evaluation yields -np.inf.
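
To make that concrete, a small sketch of how the -inf propagates through the formula once S is 0 (same constants as in the question):

import numpy as np
from scipy.stats import norm

S, K1, r, sigma, T, q = 0.0, 110, 0, 0.2, 1, 0

d1 = (np.log(S/K1) + (r + sigma*sigma/2)*T) / (sigma*np.sqrt(T))
print(d1)                                  # -inf (plus the divide-by-zero warning)
print(norm.pdf(d1, 0, 1))                  # 0.0 -- the normal pdf vanishes at -inf
print(S*np.exp(-q*T)*norm.pdf(d1, 0, 1))   # 0.0 -> u1 and u2 are both 0, so u1 - u2 is 0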

E: How is scipy handling this (follows D):

I'm too lazy to read the docs/sources of args-handling, but let's check something (add a print to the func; use bisection as earlier):

def vega_callspread(r,S,T,d,sigma,q,K1,K2):

    print('S: ', S)

    d1 = (np.log(S/K1)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))
    d2 = (np.log(S/K2)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))

    u1 = S*np.exp(-q*T)*norm.pdf(d1,0,1)
    u2 = S*np.exp(-q*T)*norm.pdf(d2,0,1)

    return u1-u2;

Output:

S:  0
X:\so_newton.py:11: RuntimeWarning: divide by zero encountered in log
  d1 = (np.log(S/K1)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))
X:\so_newton.py:12: RuntimeWarning: divide by zero encountered in log
  d2 = (np.log(S/K2)+(r+sigma*sigma/2)*T)/(sigma*np.sqrt(T))
S:  0

So the args are passed purely by position: scipy calls func(x, *args), and since S is missing from args, everything after x shifts by one place and S ends up receiving the value of r, which is 0.
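
In other words (a sketch of the call pattern, using the names and setup from the question):

args = (r, T, d, sigma, q, K1, K2)  # the 7 values that end up in the parameters S..K2
x = 0.0                             # one of the bracket endpoints tried by bisect

# scipy evaluates the objective as func(x, *args), i.e.
# vega_callspread(r=x, S=r, T=T, d=d, sigma=sigma, q=q, K1=K1, K2=K2),
# so the second parameter S silently receives r, which is 0
vega_callspread(x, *args)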

This (S=0) is probably very bad (and not what you want) and this is an error on your end!

EDIT

After your comment, it seems you are trying to optimize S. That made it clear to me that you are using an optimization algorithm that optimizes x while there is no x in your function!

I'm not analyzing your task here, but you probably want to make your function use some x (initialized by x0), as this is the general idea of scipy.optimize. We can keep the name S, but it needs to be the first parameter of the function. This is all explained in the docs.

So:

def vega_callspread(S, r,T,d,sigma,q,K1,K2):  # S now first argument !!!
    ...

res2= opt.newton(vega_callspread, x0, args=(r,T,d,sigma,q,K1,K2,),tol=10**(-1),maxiter=1000000)  # S removed from args; S is init by x0 -> read docs!

Output:

117.214682594
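
As a quick sanity check (a sketch, assuming the reordered signature above): at that S the two vega terms coincide, so the spread's vega is zero:

S_root = res2                                               # ~117.214682594
print(vega_callspread(S_root, r, T, d, sigma, q, K1, K2))   # ~0.0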

Upvotes: 2
