
added check to prevent CUDA driver activation if CUDA is not functional #23

Merged
merged 1 commit into jw3126:main on Oct 30, 2022

Conversation

rssdev10
Contributor

This check is required when CUDA is imported somewhere but the actual drivers are unavailable. Without that check, the package fails while attempting to load the CUDA libraries.
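For context, a minimal sketch of the kind of guard this PR adds. It assumes the package wires up CUDA support through Requires.jl's @require hook and probes for a working driver with CUDA.functional(); the included file name is illustrative, not necessarily what the package actually uses.

```julia
using Requires

function __init__()
    # Only activate the CUDA integration when CUDA.jl is loaded *and* a working
    # driver is present; otherwise loading the CUDA libraries would fail.
    @require CUDA="052768ef-5323-5732-b1bb-66c8b64840ba" begin
        if CUDA.functional()
            include("cuda.jl")  # illustrative: the CUDA-specific part of the package
        end
    end
end
```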

@codecov-commenter

codecov-commenter commented Oct 30, 2022

Codecov Report

Base: 87.98% // Head: 87.98% // No change to project coverage 👍

Coverage data is based on head (0e3df1d) compared to base (f79d863).
Patch coverage: 100.00% of modified lines in pull request are covered.

Additional details and impacted files
@@           Coverage Diff           @@
##             main      #23   +/-   ##
=======================================
  Coverage   87.98%   87.98%           
=======================================
  Files           3        3           
  Lines         383      383           
=======================================
  Hits          337      337           
  Misses         46       46           
Impacted Files Coverage Δ
src/ONNXRunTime.jl 100.00% <100.00%> (ø)

☔ View full report at Codecov.

Owner

@jw3126 left a comment


Thanks a lot! Should we throw a warning if CUDA is not functional?
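The suggested warning could look roughly like this (a sketch only, placed inside the @require block from the PR; the message text is illustrative):

```julia
if CUDA.functional()
    include("cuda.jl")  # illustrative CUDA-specific code
else
    @warn "CUDA.jl is loaded but not functional; ONNXRunTime's CUDA support stays disabled."
end
```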

@rssdev10
Contributor Author

Actually, I'm not sure the warning message is required. The issue happened because it is hard to control what gets loaded by third-party components. We didn't use CUDA ourselves, but still got those driver issues: CUDA.jl was loaded internally by Transformers.jl... The specific thing about @require is that it unconditionally activates the code if the package was loaded anywhere beforehand.
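To illustrate that point about @require, a hypothetical load sequence; only the load order matters here:

```julia
using ONNXRunTime    # registers an @require hook for CUDA.jl in its __init__
using Transformers   # loads CUDA.jl internally as one of its dependencies

# As soon as CUDA.jl is loaded, even indirectly as above, the @require block in
# ONNXRunTime fires. Without the functionality check it would attempt to load the
# CUDA driver libraries on a machine without working drivers and fail.
```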

@jw3126 jw3126 merged commit 30a32df into jw3126:main Oct 30, 2022