user3873107

Reputation: 11

What is the alternative to a CUDA GPU for model training with CPU support?

I don't have a CUDA-enabled GPU, but I have an i7 processor, 16 GB of RAM, and a 1 GB AMD graphics card.

I want to disable the GPU option and train the model on the CPU instead.

My code is:

import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--gpu", dest='gpu', type=str, default='0', help='Set CUDA_VISIBLE_DEVICES environment variable, optional')
args = parser.parse_args()  # parse the command-line arguments before using them
os.environ['CUDA_VISIBLE_DEVICES'] = args.gpu
params = vars(args)

How can I change this to run on the CPU?

Upvotes: 1

Views: 609

Answers (1)

RightmireM

Reputation: 2492

So, the code above is just the argument parser, which tells Python which values to accept on the command line. It only sets variable values within the code; changing it on its own wouldn't change how the code runs.

It depends on how the code that actually runs the ML is written, but running on the CPU is the default. Your code has to explicitly tell the framework to run on the GPU.
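For example, in PyTorch (an assumption here, since your question doesn't say which framework you're using), everything stays on the CPU unless the code explicitly moves it to a CUDA device:

import torch

# Tensors and models live on the CPU by default; nothing touches the GPU
# unless the code explicitly asks for it.
x = torch.randn(3, 3)
print(x.device)        # prints: cpu

# Only an explicit call like the one below would push work onto a CUDA GPU
# (and it would fail on a machine without one):
# x = x.to("cuda")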

With the line os.environ['CUDA_VISIBLE_DEVICES'] = args.gpu you're setting the environment variable CUDA_VISIBLE_DEVICES to the gpu argument passed in on the command line, which the code that calls the GPU will then use.
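If all you want is to make sure no CUDA device is ever visible to the framework, a common trick (just a sketch, not specific to your code) is to set that same environment variable to an empty string or "-1" before the framework is imported:

import os

# Hide all CUDA devices: with CUDA_VISIBLE_DEVICES set to an empty string
# (or "-1"), CUDA-aware libraries see no GPUs and fall back to the CPU.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# Import the ML framework only after this variable is set; some frameworks
# enumerate the available GPUs at import time.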

But you need to change the code where the ML processes are actually called.
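As a rough sketch (again assuming PyTorch, since the question doesn't show the training code), the part of the code that currently picks a CUDA device could be changed to fall back to the CPU when no GPU is available:

import torch

# Use a CUDA device only if one is actually available; otherwise fall back
# to the CPU. The model and the data are then moved to that device.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(10, 2).to(device)   # toy model, just for illustration
data = torch.randn(4, 10).to(device)
output = model(data)
print(output.shape, "on", device)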

Maybe you can post more code?

Upvotes: 1
