[R-package] Temporary workaround for not modifying global OMP config #5134

Closed
wants to merge 2 commits

Conversation

david-cortes
Contributor

ref #4705

This is a temporary workaround to avoid modifying the global OpenMP user configuration for the whole R process when running lgb.train. This PR should be reverted once issue #4705 is fixed properly.
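
For context, the approach is the usual save-and-restore pattern around the threaded call. The sketch below is illustrative only (it is not this PR's diff) and assumes the RhpcBLASctl package for reading and setting the process-wide OpenMP thread count; the helper name `run_with_omp_threads()` is hypothetical.

```r
# Illustrative sketch only, not code from this PR.
# Assumes RhpcBLASctl is installed; run_with_omp_threads() is a hypothetical helper.
run_with_omp_threads <- function(num_threads, fn) {
  # record the user's current OpenMP thread setting
  old_threads <- RhpcBLASctl::omp_get_max_threads()
  # restore it when this function exits, even on error
  on.exit(RhpcBLASctl::omp_set_num_threads(old_threads), add = TRUE)
  # apply the temporary setting only for the duration of fn()
  RhpcBLASctl::omp_set_num_threads(num_threads)
  fn()
}

# Hypothetical usage: the user's global OMP configuration is left unchanged afterwards.
# bst <- run_with_omp_threads(4L, function() lgb.train(params, dtrain, nrounds = 10L))
```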

@jameslamb
Collaborator

Thanks for your proposal, but I don't support introducing a new required dependency to the R package for this purpose.

Instead of this fix that benefits only the R package, and even then only one part of it (i.e. not other multi-threaded operations like prediction and Dataset construction), if you're interested in seeing LightGBM's behavior here change, please work with @guolinke and @shiyu1994 on #4705 to implement something at the C++ level that all users of LightGBM will benefit from.

@david-cortes
Contributor Author

In #4705, @guolinke suggested a workaround like this. By the way, it does fix the issue for Dataset construction when it happens inside lgb.train.

@jameslamb
Collaborator

If you mean #4705 (comment)

A quick solution is to have a context manager. For example, when start running LightGBM, it records current omp nthreads, then when exiting LightGBM, it resets the omp nthreads by the previous recorded values.

I believe that was a recommendation for how LightGBM's C++ code would manage getting and setting the number of threads, not a recommendation that every wrapper of that code do this itself.

@shiyu1994
Collaborator

I agree that such changes should be made in C++ so that both R and Python packages can leverage the workaround. I can help to do that.

@jameslamb
Collaborator

both R and Python packages

Thanks @shiyu1994 . For the benefit of everyone reading this PR, I just want to clarify...when I refer to wrappers of LightGBM's C++ code, I don't only mean the R and Python package in this repo. I also mean SynapseML (formerly MMLSpark), the Java interface generated with SWIG, LightGBM.NET, and the other projects listed at https://github.com/microsoft/LightGBM#external-unofficial-repositories.

Based on this discussion, I'm going to close this PR.

@david-cortes thanks very much for documenting #4705 and for helping this project try to address it!

@jameslamb closed this Apr 9, 2022
@github-actions

This pull request has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.

@github-actions bot locked as resolved and limited conversation to collaborators Aug 23, 2023