This repository has been archived by the owner on Jan 7, 2025. It is now read-only.

Bugfix - Torch and CUDA_VISIBLE_DEVICES #1130

Merged — 1 commit, Oct 4, 2016

Conversation

lukeyeager
Member

I think this has been broken forever. But it's a bigger deal now that #1091 encourages users to use CUDA_VISIBLE_DEVICES to choose the GPUs that will be used in DIGITS.

To test:

  1. export CUDA_VISIBLE_DEVICES=1,0
  2. ./digits-devserver
  3. Train a model with torch
  4. You'll see 0 utilization for the selected GPU. nvidia-smi will tell you that Torch is using the other GPU.
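The bug comes down to index remapping: CUDA renumbers whatever devices `CUDA_VISIBLE_DEVICES` exposes starting from 0, so visible index `i` must be translated back to the `i`-th entry of the original comma-separated list. A minimal sketch of that mapping (using the `old_cvd` name from the patch below, with the value from the repro steps):

```python
# Stand-in for the original CUDA_VISIBLE_DEVICES value read by DIGITS.
old_cvd = "1,0"

# Visible device i corresponds to the i-th ID in the original list:
# with CUDA_VISIBLE_DEVICES=1,0, visible GPU 0 is real GPU 1 and vice versa.
map_visible_to_real = {visible: int(real)
                       for visible, real in enumerate(old_cvd.split(','))}

print(map_visible_to_real)  # {0: 1, 1: 0}
```

Without this translation, a job told to use "GPU 1" would address real GPU 1 directly, bypassing the renumbering, which is why `nvidia-smi` shows Torch on the wrong device.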

/cc @gheinrich

Contributor

@gheinrich gheinrich left a comment


Verified the bug and fix. Thanks!

map_visible_to_real = {}
for visible, real in enumerate(old_cvd.split(',')):
    map_visible_to_real[visible] = int(real)
print map_visible_to_real
Contributor


That little print was forgotten here :-)

Member Author


Oops, thanks for the catch!

@lukeyeager force-pushed the fix-torch-cudavisibledevices branch from 9358423 to b647c20 on October 3, 2016 at 17:23
@lukeyeager
Member Author

I can't get Travis to pass because of #1134.

@lukeyeager merged commit 38825d1 into NVIDIA:master on Oct 4, 2016
@lukeyeager deleted the fix-torch-cudavisibledevices branch on October 4, 2016 at 17:19
SlipknotTN pushed a commit to cynnyx/DIGITS that referenced this pull request on Mar 30, 2017: Bugfix - Torch and CUDA_VISIBLE_DEVICES