santobedi

Reputation: 858

Kernel parameters of Gaussian Process Regression: How to get them in Scikit-learn?

I use the squared-exponential (RBF) kernel in my regression with scikit-learn's GaussianProcessRegressor. I also use the internally available optimizer 'fmin_l_bfgs_b' (the L-BFGS-B algorithm) to optimize the kernel parameters, which in my case are the length scale and the signal variance. The documentation for log_marginal_likelihood is as follows:

(screenshot of the scikit-learn documentation for log_marginal_likelihood)

I followed this documentation to print the GPML kernel and log_marginal_likelihood. Following is the code snippet:

print("GPML kernel: %s" % gp.kernel_)
print("Log-marginal-likelihood:",
      gp.log_marginal_likelihood(gp.kernel_.theta, eval_gradient=True))

Following value is printed at the console:

GPML kernel: 31.6**2 * RBF(length_scale=1.94)
Log-marginal-likelihood: (-115.33295413296841, array([ 1.01038168e+02, -2.16465175e-07]))

I could not figure out the values printed for the log marginal likelihood. What are the values inside the array?

My code snippet regarding the regression is as follows:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel as C

x_train = np.array([[0, 0], [2, 2], [3, 3]])
y_train = np.array([200, 321, 417])
xvalues = np.array([0, 1, 2, 3])
yvalues = np.array([0, 1, 2, 3])
a, b = np.meshgrid(xvalues, yvalues)
positions = np.vstack([a.ravel(), b.ravel()])
x_test = positions.T
kernel = C(1.0, (1e-3, 1e3)) * RBF(10)
gp = GaussianProcessRegressor(kernel=kernel, optimizer='fmin_l_bfgs_b', alpha=1.5, n_restarts_optimizer=5)
gp.fit(x_train, y_train)
y_pred_test, sigma = gp.predict(x_test, return_std=True)

Is my approach to printing the kernel parameters correct?

Thank you!

Upvotes: 4

Views: 6010

Answers (1)

Dan Reia

Reputation: 1135

As the documentation you attached indicates, gp.log_marginal_likelihood returns a tuple: the first value is the log marginal likelihood evaluated at the parameters you passed (in your case gp.kernel_.theta), and the values in the array are the gradients of that likelihood with respect to the kernel's (log-transformed) hyperparameters.
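To make the tuple concrete, here is a minimal sketch using the training data from the question (the toy arrays are assumptions taken from the original snippet). With eval_gradient=True the call returns the scalar likelihood plus one gradient entry per hyperparameter, in the same order as gp.kernel_.theta (log signal variance, then log length scale for this kernel):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel as C

# Toy training data from the question
x_train = np.array([[0, 0], [2, 2], [3, 3]])
y_train = np.array([200, 321, 417])

gp = GaussianProcessRegressor(kernel=C(1.0, (1e-3, 1e3)) * RBF(10), alpha=1.5)
gp.fit(x_train, y_train)

# Returns a (value, gradient) tuple when eval_gradient=True
lml, grad = gp.log_marginal_likelihood(gp.kernel_.theta, eval_gradient=True)

print(lml)          # scalar log marginal likelihood at theta
print(grad.shape)   # one gradient entry per kernel hyperparameter
```

Note that theta holds the hyperparameters in log space, so the gradients are with respect to the log-transformed values.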

To get the resulting kernel parameters after optimization, use the fitted kernel, either with:

gp.kernel_.get_params()

which returns a dictionary including the parameters, or you can get them individually using:

gp.kernel_.k1

and

gp.kernel_.k2
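Putting this together, here is a minimal sketch (reusing the toy data from the question) of reading the optimized parameters. For a ConstantKernel * RBF product, k1 is the constant part and k2 the RBF part; the attribute names constant_value and length_scale below are standard scikit-learn kernel attributes:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel as C

x_train = np.array([[0, 0], [2, 2], [3, 3]])
y_train = np.array([200, 321, 417])

gp = GaussianProcessRegressor(kernel=C(1.0, (1e-3, 1e3)) * RBF(10),
                              alpha=1.5, n_restarts_optimizer=5)
gp.fit(x_train, y_train)

# All hyperparameters as a nested dict, e.g. 'k1__constant_value'
params = gp.kernel_.get_params()

# Or access the product's factors individually
signal_variance = gp.kernel_.k1.constant_value  # ConstantKernel part
length_scale = gp.kernel_.k2.length_scale       # RBF part

print("signal variance:", signal_variance)
print("length scale:", length_scale)
```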

Upvotes: 9
