I happened to run into this problem too. After reading the code and testing, I did not need to modify the original inference code. I just added one line before the run_inference method in the main function: `os.environ['CUDA_VISIBLE_DEVICES'] = '7'`, so that PyTorch runs on the graphics card numbered 7, i.e. the eighth card.
This way I can choose which graphics card to use without setting parameters such as rank, gpu_num, and so on.
The following is sample code for inference.py. I only added the `os.environ['CUDA_VISIBLE_DEVICES'] = '7'` line.
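A minimal sketch of the idea (the body of run_inference here is a hypothetical stand-in, not the repository's actual routine). The key point is that `CUDA_VISIBLE_DEVICES` must be set before PyTorch initializes CUDA:

```python
import os

# CUDA_VISIBLE_DEVICES must be set before PyTorch initializes CUDA,
# so it goes at the top of the script, ahead of the run_inference call.
# '7' exposes only the eighth physical card; PyTorch then sees it as
# the sole device, cuda:0.
os.environ['CUDA_VISIBLE_DEVICES'] = '7'


def run_inference():
    # Stand-in for the repository's actual inference routine
    # (hypothetical body): every .cuda() / .to('cuda') call inside
    # would land on physical GPU 7.
    print("visible devices:", os.environ["CUDA_VISIBLE_DEVICES"])


if __name__ == "__main__":
    run_inference()
```

Note that the remapping means any device index used inside the script should stay at 0; the masking is done entirely by the environment variable.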
Hello there, I'm trying to change the CUDA selection. I saw this:
I tried changing rank to 3 so I can use my third CUDA device, and also added gpu_no=3 as a parameter:
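In case it helps, the environment-variable trick above sidesteps rank/gpu_no entirely, and it can also be applied from the shell when launching the script. A sketch (the inline python one-liner only echoes the variable; in practice you would launch inference.py the same way):

```shell
# Expose only the fourth physical card (index 3); PyTorch inside the
# process then sees it as cuda:0, so no rank/gpu_no changes are needed.
# In practice: CUDA_VISIBLE_DEVICES=3 python inference.py
CUDA_VISIBLE_DEVICES=3 python3 -c 'import os; print(os.environ["CUDA_VISIBLE_DEVICES"])'
```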