Hrihaan

Reputation: 285

Scipy least squares positional argument issue

I am trying to do a robust non-linear fitting of the following data:

r_fast:

[0.2065 0.2661 0.2026 0.22   0.2065 0.2661 0.264  0.2173 0.2615 0.2682
 0.407  0.4085 0.409  0.4045 0.405  0.3985 0.5235 0.5846 0.5171 0.5385
 0.6415 0.7661 0.699  0.6523 0.7745 0.7332 0.842  0.9085 0.909  0.8445
 0.84   0.8635]

a_fast:

[-43.3  -3.  -86.8 -10.5 -56.2  -2.5  -7.2 -12.2  -4.6  -9.  -21.3  -2.
  -3.2  -2.7  -5.8  -6.8 -15.5  -1.8 -22.1  -0.5  -8.7  -0.8   0.   -3.3
  -0.8  -0.8 -12.5  -0.5  -0.7   0.3  -1.   -1.2]

I tried the following approach. However, I am receiving an error on this line:

res_soft_l1 = least_squares(f, x, loss='soft_l1', f_scale=0.1, args=(r_fast, a_fast))

The error is:

f() missing 1 required positional argument: 'x2'

The full code is as follows:

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.optimize import least_squares
def f(r_fast, x0, x1, x2):
    return x[0] + r_fast**x[1] * x[2]
data= pd.read_table('/Users/Hrihaan/Desktop/Data.txt', dtype=float, header=None, sep='\s+').values
r_fast=data[:,1]
a_fast=data[:,2]
r_min=np.min(r_fast)
r_max=np.max(r_fast)
x = np.array([1.0, 1.0, 0.0])
rr= np.linspace(r_min, r_max, len(r_fast))
res_soft_l1 = least_squares(f, x, loss='soft_l1', f_scale=0.1, args=(r_fast, a_fast))
aa= f(rr, *res_soft_l1.x)
plt.xlabel('r_fast', fontsize=30)
plt.ylabel('a_fast', fontsize=30)
plt.scatter(r_fast, a_fast, c='burlywood', s=10**2)
plt.plot(rr, aa, linewidth=3, label='Power law fit')
plt.legend(fontsize=25, loc=8, framealpha=1.0, edgecolor='maroon') 
plt.show()

I am unable to figure out what I am missing. Any help would be greatly appreciated. Thanks in advance.

Upvotes: 0

Views: 1454

Answers (2)

Siva-Sg

Reputation: 2821

There are a few issues with the code:

  1. The function must return the residuals, i.e. the error between the prediction and the actual values (y), not the prediction itself. I guess a_fast holds the actual values in your case.
  2. The parameters to be optimized must always be the first argument of the function, in this case [x0, x1, x2].
  3. Any other additional parameters of the function should be passed via args to the least_squares call. I believe r_fast is your additional parameter.

The following is minimal code that works:

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.optimize import least_squares

r_fast = np.array([0.2065, 0.2661, 0.2026, 0.22, 0.2065, 0.2661, 0.264, 0.2173, 0.2615, 0.2682,
                   0.407, 0.4085, 0.409, 0.4045, 0.405, 0.3985, 0.5235, 0.5846, 0.5171, 0.5385,
                   0.6415, 0.7661, 0.699, 0.6523, 0.7745, 0.7332, 0.842, 0.9085, 0.909, 0.8445,
                   0.84, 0.8635])
a_fast = np.array([-43.3, -3., -86.8, -10.5, -56.2, -2.5, -7.2, -12.2, -4.6, -9., -21.3, -2.,
                   -3.2, -2.7, -5.8, -6.8, -15.5, -1.8, -22.1, -0.5, -8.7, -0.8, 0., -3.3,
                   -0.8, -0.8, -12.5, -0.5, -0.7, 0.3, -1., -1.2])

def f(X, r_fast):
    # X holds the parameters being optimized; return the residuals
    # (model prediction minus the observed a_fast), not the prediction itself.
    x0, x1, x2 = X
    return x0 + r_fast**x1 * x2 - a_fast

x_init = np.array([1.0, 1.0, 0.0])

res_soft_l1 = least_squares(f, x_init, args=(r_fast,), loss='soft_l1', f_scale=0.1)

Output:

res_soft_l1.x

array([-5.43168803e+03,  1.31665146e-03,  5.43206946e+03])
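
If you want to reproduce the plot from the question, you can evaluate the fitted power law on a grid and overlay it on the data. A minimal sketch, assuming the r_fast and a_fast arrays and the res_soft_l1 result from above:

import numpy as np
import matplotlib.pyplot as plt

# Evaluate the fitted model x0 + r**x1 * x2 on a smooth grid.
x0, x1, x2 = res_soft_l1.x
rr = np.linspace(r_fast.min(), r_fast.max(), 200)
aa = x0 + rr**x1 * x2

plt.scatter(r_fast, a_fast, c='burlywood', s=10**2)
plt.plot(rr, aa, linewidth=3, label='Power law fit')
plt.xlabel('r_fast', fontsize=30)
plt.ylabel('a_fast', fontsize=30)
plt.legend(fontsize=25, loc=8, framealpha=1.0, edgecolor='maroon')
plt.show()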

Upvotes: 2

newkid

Reputation: 1458

This is because f needs 4 arguments but only receives 3. In the line least_squares(f, x, loss='soft_l1', f_scale=0.1, args=(r_fast, a_fast)), least_squares calls f with the following arguments:

f(r_fast=x, x0=r_fast, x1=a_fast, x2=)

As you can see, x2 is missing. In any case, none of these arguments would help, because your function f never uses x0, x1, or x2; it reads the global x instead.
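
For what it's worth, here is a tiny sketch of that calling convention: least_squares invokes your function as fun(x, *args), so the parameter vector always comes first and everything in args is appended after it. The probe function below is just a made-up helper for illustration:

import numpy as np
from scipy.optimize import least_squares

def probe(x, *extra):
    # least_squares calls probe(x, *args): x is the parameter vector,
    # and each item passed via args= arrives as an extra positional argument.
    print("x =", x, "| number of extra args:", len(extra))
    return np.zeros(1)  # dummy residual so the solver stops almost immediately

least_squares(probe, np.array([1.0]), args=(np.arange(5), np.arange(5)))
# first call prints: x = [1.] | number of extra args: 2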

You can change your function definition to:

def f(x, r_fast):
    return x[0] + r_fast**x[1] * x[2]

and the least_squares call to (note the trailing comma, so that args is a one-element tuple):

least_squares(f, x, loss='soft_l1', f_scale=0.1, args=(r_fast,))

Upvotes: 1
