Adauto.Almeida

Reputation: 113

How to implement a negative binomial loss function in Python for use with LightGBM?

I have a machine learning problem that I believe a negative binomial loss function would fit well, but the LightGBM package doesn't provide one out of the box, so I'm trying to implement it myself. I managed to write the loss function, but I don't know how to derive the gradient and Hessian. Does anyone know how I can do this?
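For reference, the loss I'm computing is the negative log-likelihood of a negative binomial distribution with size parameter n and success probability p:

$$\ell(n; y) = \log\Gamma(n) + \log\Gamma(y + 1) - \log\Gamma(n + y) - n \log p - y \log(1 - p)$$

Note that these are log-gamma terms, not gamma, which is why the code below uses gammaln.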

import numpy as np
from scipy.special import gammaln  # log-gamma, works elementwise on arrays

def custom_asymmetric_valid(y_pred, dtrain):
    # negative binomial negative log-likelihood, evaluated per sample
    y_true = dtrain.get_label()
    p = 0.5
    n = y_pred
    loss = (gammaln(n) + gammaln(y_true + 1) - gammaln(n + y_true)
            - n * np.log(p) - y_true * np.log(1 - p))
    # LightGBM custom metric signature: (name, value, is_higher_better)
    return "custom_asymmetric_eval", np.mean(loss), False

Now, how do I get the gradient and Hessian?

def custom_asymmetric_train(y_pred, dtrain):
    y_true = dtrain.get_label()
    residual = (y_true - y_pred).astype("float")

    grad = ?   # first derivative of the loss with respect to y_pred
    hess = ?   # second derivative of the loss with respect to y_pred

    return grad, hess

Could anyone help?
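For context, here is how I plan to plug these into training (a sketch; on LightGBM versions before 4.0 the custom objective and metric are passed to lgb.train via the fobj and feval arguments, while newer releases take the objective callable through params["objective"]):

import lightgbm as lgb

# dtrain and dvalid are lgb.Dataset objects built elsewhere;
# on LightGBM < 4.0 the custom objective goes in via fobj
# and the custom metric via feval.
booster = lgb.train(
    {"learning_rate": 0.05},
    dtrain,
    num_boost_round=100,
    fobj=custom_asymmetric_train,    # must return (grad, hess)
    feval=custom_asymmetric_valid,   # must return (name, value, is_higher_better)
    valid_sets=[dvalid],
)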

Upvotes: 4

Views: 1690

Answers (1)

Marco Cerliani

Reputation: 22021

This is possible with SciPy, using numerical differentiation:

import numpy as np
from scipy.misc import derivative  # note: deprecated, removed in SciPy >= 1.12
from scipy.special import gammaln  # log-gamma, vectorized

def custom_asymmetric_train(y_pred, dtrain):

    y_true = dtrain.get_label()
    p = 0.5

    def loss(x, t):
        # same negative binomial negative log-likelihood as the metric above
        return (gammaln(x) + gammaln(t + 1) - gammaln(x + t)
                - x * np.log(p) - t * np.log(1 - p))

    # differentiate the loss numerically with respect to y_pred;
    # a larger dx may be needed for a stable second derivative
    partial_d = lambda x: loss(x, y_true)
    grad = derivative(partial_d, y_pred, n=1, dx=1e-6)
    hess = derivative(partial_d, y_pred, n=2, dx=1e-6)

    return grad, hess
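If exact derivatives are preferred over numerical differentiation, the gradient and Hessian of this loss with respect to y_pred also have closed forms, since the derivative of log Γ(n) is the digamma function ψ(n). A minimal sketch under the same assumptions (fixed p, and the loss defined above):

import numpy as np
from scipy.special import digamma, polygamma

def custom_asymmetric_train_analytic(y_pred, dtrain):
    # Exact derivatives of the negative binomial NLL with respect
    # to n (= y_pred) at fixed p. digamma/polygamma need y_pred > 0,
    # so in practice the raw score may require a positivity transform.
    y_true = dtrain.get_label()
    p = 0.5

    # d/dn: digamma(n) - digamma(n + y) - log(p)
    grad = digamma(y_pred) - digamma(y_pred + y_true) - np.log(p)
    # d2/dn2: trigamma(n) - trigamma(n + y), where trigamma = polygamma(1, .)
    hess = polygamma(1, y_pred) - polygamma(1, y_pred + y_true)

    return grad, hess

This avoids the extra loss evaluations per point that the finite-difference version needs, and it is exact rather than approximate.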

Upvotes: 3
