Custom objective does not have the same loss curves. #5350
Comments
Hi @nd7141, I believe this is due to the different init scores, as explained in #5114 (comment). If you set the `init_score` yourself, the two curves should match.
Thank you @jmoralez. Is there a way to set up the `init_score`?
Yes, there's an example in #5114 (comment) passing it to Dataset, but you can also pass it to the fit method in the scikit-learn API. Please let us know if you have further questions.
This issue has been automatically closed because it has been awaiting a response for too long. When you have time to work with the maintainers to resolve this issue, please post a new comment and it will be re-opened. If the issue has been locked for editing by the time you return to it, please open a new issue and reference this one. Thank you for taking the time to improve LightGBM!
This issue has been automatically locked because there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this one.
I see that if I use a custom L2 implementation instead of the default one, I get much slower convergence. Why is that? I would expect the two curves to be exactly the same.
Plot:
Code:
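The original plot and script are not preserved in this scrape, but the gap at the start of training can be illustrated with plain numpy: the built-in L2 objective boosts from the label mean, whereas a custom objective boosts from 0, so its initial loss is higher by exactly the squared label mean. This is a hypothetical reproduction of the effect, not the reporter's code.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=10.0, scale=1.0, size=1000)  # target with nonzero mean

# Built-in L2 starts from the label mean; a custom objective starts from 0.
mse_from_mean = np.mean((y - y.mean()) ** 2)  # = Var(y)
mse_from_zero = np.mean((y - 0.0) ** 2)       # = Var(y) + mean(y)^2

# The custom run therefore starts mean(y)^2 higher on the loss curve and
# spends early iterations just recovering the constant offset.
gap = mse_from_zero - mse_from_mean
```

This is why the two loss curves only converge to each other later in training rather than overlapping from iteration 0.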