
Custom loss reconstruction for Tweedie/Regression_l1 loss #6160

Closed
hilade opened this issue Oct 29, 2023 · 1 comment
hilade commented Oct 29, 2023

Description

I am working on a project where I need to implement a custom loss function for the LGBMRegressor model.
Specifically, I am interested in reconstructing (as a first step) the Tweedie or Regression_l1 loss that is being used in the model as an objective. This would allow me to have more flexibility in customizing the loss function while retaining the model's compatibility with existing LightGBM functionalities.

I tried looking at the C++ implementation of LightGBM's RegressionTweedieLoss() and implementing the function in Python.
I expected my custom loss function to produce results equivalent to using the tweedie or regression_l1 objective hyperparameter, but the results were very different (±100% difference).

Reproducible example

import numpy as np
from lightgbm import LGBMRegressor

def customLossTweedie(y_true, y_pred):
    # Gradient/hessian of the Tweedie deviance with a log link,
    # following LightGBM's C++ RegressionTweedieLoss
    rho = 1.5
    exp_1_score = np.exp((1 - rho) * y_pred)
    exp_2_score = np.exp((2 - rho) * y_pred)
    grad = -y_true * exp_1_score + exp_2_score
    hess = -y_true * (1 - rho) * exp_1_score + (2 - rho) * exp_2_score
    return grad, hess

def customLossRegression_l1(y_true, y_pred):
    # L1 loss: gradient is sign(residual); the true hessian is zero,
    # so a constant 1.0 is used as a stand-in
    diff = y_pred - y_true
    grad = np.sign(diff)
    hess = np.ones_like(y_pred)
    return grad, hess

model = LGBMRegressor(boosting_type='gbdt',
                      num_leaves=1000,
                      max_depth=20,
                      learning_rate=0.01,
                      n_estimators=100,
                      objective=customLossTweedie,  # or objective=customLossRegression_l1
                      min_split_gain=0.03,
                      min_child_samples=400,
                      subsample=0.5,
                      colsample_bytree=0.8,
                      reg_alpha=0.5,
                      reg_lambda=0.5,
                      nthread=-1,
                      random_state=10)
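One likely source of the discrepancy (an assumption worth verifying, not confirmed in this thread): with the built-in tweedie objective LightGBM applies exp() to the raw score when producing predictions, while a model trained with a custom objective returns the raw (log-scale) score from predict(). A minimal sketch on synthetic data, with purely illustrative data and parameters:

import numpy as np
from lightgbm import LGBMRegressor

def customLossTweedie(y_true, y_pred):
    # Repeated here so the sketch is self-contained;
    # y_pred is the raw (log-scale) score
    rho = 1.5
    exp_1_score = np.exp((1 - rho) * y_pred)
    exp_2_score = np.exp((2 - rho) * y_pred)
    grad = -y_true * exp_1_score + exp_2_score
    hess = -y_true * (1 - rho) * exp_1_score + (2 - rho) * exp_2_score
    return grad, hess

rng = np.random.default_rng(10)
X = rng.normal(size=(500, 5))
y = rng.poisson(lam=np.exp(X[:, 0]))  # non-negative, Tweedie-like targets

model = LGBMRegressor(objective=customLossTweedie, n_estimators=50,
                      min_child_samples=5, random_state=10)
model.fit(X, y)

raw = model.predict(X)   # raw scores on the log scale
preds = np.exp(raw)      # apply the inverse of the log link manually

When comparing against objective='tweedie' directly, the initial score can also differ: built-in objectives may boost from a non-zero average (boost_from_average), whereas a custom objective starts from 0.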

Additional Information

LightGBM Version: 3.3.5
Programming Language: Python
Operating System: macOS

Any guidance or examples on how to achieve this customization would be greatly appreciated.
Thank you for your assistance!

@jameslamb
Collaborator

Thanks for using LightGBM.

Also, I see that you double-posted this here and on Stack Overflow (link).

Please do not do that. Maintainers here also monitor the [lightgbm] tag on Stack Overflow. I could have been spending time preparing an answer here while another maintainer was spending time answering your Stack Overflow post, which would have been a waste of maintainers' limited attention that could otherwise have been spent improving this project. Double-posting also makes it less likely that others with a similar question will find the relevant discussion and answer.

Since you're already getting answers on Stack Overflow, I'm going to close this and encourage anyone who can help to please go answer there.
