How can I implement a negative binomial loss function in Python for use with LightGBM?

Problem description (votes: 1, answers: 1)

I have a machine learning problem for which I believe a negative binomial loss function would be a good fit, but the LightGBM package does not provide it out of the box. I would like to implement it myself, but I don't know how to derive the gradient and the hessian. Does anyone know how to do this? I managed to write the loss function, but I can't work out the gradient and the hessian.

import numpy as np
from scipy.special import gamma  # vectorized gamma; math.gamma only handles scalars

def custom_asymmetric_valid(y_pred, dtrain):
    # dtrain is the lightgbm Dataset; get_label() returns the target array
    y_true = dtrain.get_label()
    p = 0.5
    n = y_pred
    # negative binomial loss, evaluated elementwise over the predictions
    loss = (gamma(n) + gamma(y_true + 1) - gamma(n + y_true)
            - n * np.log(p) - y_true * np.log(1 - p))
    return "custom_asymmetric_eval", np.mean(loss), False

Now, how do I get the gradient and the hessian?

def custom_asymmetric_train(y_pred, dtrain):
    residual = (dtrain.get_label() - y_pred).astype("float")

    grad = ?  # first derivative of the loss with respect to y_pred
    hess = ?  # second derivative of the loss with respect to y_pred

    return grad, hess

Can anyone help me?

python machine-learning gradient lightgbm hessian-matrix
1 Answer (2 votes)

This can be done automatically with scipy, which can compute the gradient and hessian numerically:

from scipy.misc import derivative  # note: deprecated in recent SciPy releases
from scipy.special import gamma
import numpy as np

def custom_asymmetric_train(y_pred, dtrain):

    y_true = dtrain.get_label()
    p = 0.5

    # the same loss as in the evaluation function, viewed as a function of the prediction x
    def loss(x, t):
        return gamma(x) + gamma(t + 1) - gamma(x + t) - x * np.log(p) - t * np.log(1 - p)

    # numerical first and second derivatives with respect to y_pred
    partial_d = lambda x: loss(x, y_true)
    grad = derivative(partial_d, y_pred, n=1, dx=1e-6)
    hess = derivative(partial_d, y_pred, n=2, dx=1e-6)

    return grad, hess
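
As a side note (not part of the original answer): if the loss is written with the log-gamma function scipy.special.gammaln, which is the usual form of the negative binomial negative log-likelihood, the gradient and hessian also have closed forms in terms of the digamma and trigamma functions, so no numerical differentiation is needed. A minimal sketch under that assumption, using the hypothetical name nb_train_closed_form:

import numpy as np
from scipy.special import digamma, polygamma

def nb_train_closed_form(y_pred, dtrain):
    # assumes loss = gammaln(n) + gammaln(y+1) - gammaln(n+y) - n*log(p) - y*log(1-p)
    # with n = y_pred and a fixed p = 0.5
    y_true = dtrain.get_label()
    p = 0.5
    n = y_pred

    # d/dn gammaln(n) = digamma(n), and d/dn gammaln(n + y) = digamma(n + y)
    grad = digamma(n) - digamma(n + y_true) - np.log(p)
    # the second derivative of gammaln is the trigamma function, polygamma(1, .)
    hess = polygamma(1, n) - polygamma(1, n + y_true)

    return grad, hess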
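For completeness, here is one way the custom objective and metric could be wired into training. This is a hedged sketch, not from the original post: it uses synthetic count data, assumes the fobj/feval arguments of lgb.train (available in LightGBM versions before 4.0; newer releases take the objective callable through the params dict instead), and sets a positive init_score because the gamma-based loss is undefined at a raw score of 0.

import lightgbm as lgb
import numpy as np

# synthetic count data, purely illustrative
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = rng.poisson(lam=np.exp(X[:, 0]))

# start the raw predictions at 1.0 so gamma(y_pred) is finite at the first iteration
train_set = lgb.Dataset(X, label=y, init_score=np.full(len(y), 1.0))

booster = lgb.train(
    {"learning_rate": 0.1, "verbosity": -1},
    train_set,
    num_boost_round=50,
    fobj=custom_asymmetric_train,   # custom gradient and hessian
    feval=custom_asymmetric_valid,  # custom evaluation metric
    valid_sets=[train_set],
)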