
Better documentation for loss functions #4790

Closed
lcrmorin opened this issue Nov 11, 2021 · 4 comments

Comments

@lcrmorin

Summary

The loss function documentation currently just links to Wikipedia and Kaggle. It is not clear how the parameters play (alpha for the Huber and quantile losses, c for the Fair loss), nor what ranges are acceptable for these parameters.

Motivation

Better documentation for loss functions would help their usage and adoption.

Description

Some documentation of each loss, including the formula, theoretical aspects, and parameters, would be nice.
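For concreteness, here is the kind of formula-plus-parameter summary I have in mind, written in terms of the residual r = y - \hat{y}. These are the standard textbook forms; I have not checked them against LightGBM's implementation, so the exact scaling may differ:

% Huber loss, threshold parameter alpha > 0 (quadratic near zero, linear beyond alpha):
L_{\alpha}(r) =
\begin{cases}
  \tfrac{1}{2} r^{2} & \text{if } |r| \le \alpha \\
  \alpha \left( |r| - \tfrac{1}{2}\alpha \right) & \text{otherwise}
\end{cases}

% Quantile (pinball) loss, quantile level alpha in (0, 1):
L_{\alpha}(r) = \max\bigl( \alpha r,\; (\alpha - 1) r \bigr)

% Fair loss, scale parameter c > 0; smaller c downweights large residuals more:
L_{c}(r) = c^{2} \left( \frac{|r|}{c} - \log\!\left( 1 + \frac{|r|}{c} \right) \right)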

PS: I am willing to help write this documentation, but I am not familiar with the process for updating it.

@jameslamb
Collaborator

jameslamb commented Nov 12, 2021

Thanks very much for using LightGBM and for your interest in this.

Are you looking for LightGBM to maintain documentation on each objective function with a similar level of detail to scikit-learn's?

For example, https://scikit-learn.org/stable/modules/model_evaluation.html#mean-tweedie-deviance


Separate from that general question, we'd be very grateful for contributions that update the documentation for other parameters that are specific to individual objectives.

I agree with your assessment...documentation for fair_c, for example, doesn't currently explain these key things:

  • what is the scale of this parameter?
  • what are the implications of increasing or decreasing it?

I'd be happy to help you through the contribution process, if you'd like! The process involves updating comments in a header file, which we use to code-generate the reStructuredText used on the documentation site.

For example, for fair_c (docs link) you would add to this block:

// check = >0.0
// desc = used only in ``fair`` ``regression`` application
// desc = parameter for `Fair loss <https://www.kaggle.com/c/allstate-claims-severity/discussion/24520>`__
double fair_c = 1.0;

then run python helpers/parameter_generator.py
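Just as an illustration (the two extra desc lines below are a sketch of the kind of detail I mean, not text that currently exists in config.h), the updated block might end up looking like:

// check = >0.0
// desc = used only in ``fair`` ``regression`` application
// desc = parameter for `Fair loss <https://www.kaggle.com/c/allstate-claims-severity/discussion/24520>`__
// desc = controls the residual scale at which the loss switches from roughly quadratic to roughly linear behavior
// desc = smaller values reduce the influence of large residuals (outliers) on training
double fair_c = 1.0;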

If you choose to help with this, we'd prefer one pull request per parameter, to keep the PRs small and easy to review.

@StrikerRUS StrikerRUS changed the title Documentation of loss functions Better documentation for loss functions Dec 16, 2021
@StrikerRUS
Collaborator

Closed in favor of #2302; we decided to keep all feature requests in one place.

You are welcome to contribute this feature! Please re-open this issue (or post a comment if you are not the topic starter) if you are actively working on implementing it.

@github-actions

This issue has been automatically locked because there has not been any recent activity since it was closed.
To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues
including a reference to this.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 16, 2023
@microsoft microsoft unlocked this conversation Aug 17, 2023
@jameslamb
Collaborator

Sorry, this was locked accidentally. Just unlocked it. We'd still love help with this feature!
