Diver

Reputation: 11

YOLO - TensorFlow works on CPU but not on GPU

I've been running YOLO detection with a pre-trained model on my GPU (Nvidia GTX 1060 3 GB), and everything worked fine.

Now I am trying to train my own model with the option --gpu 1.0. TensorFlow can see my GPU, since at startup it prints: "name: GeForce GTX 1060 major: 6 minor: 1 memoryClockRate(GHz): 1.6705" "totalMemory: 3.00GiB freeMemory: 2.43GiB"

However, later on, when the program has loaded the data and tries to start training, I get the following error: "failed to allocate 832.51M (872952320 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY"

I've checked whether it is trying to use my other GPU (Intel 630), but it isn't.

When I run the training without the --gpu option it works fine, but slowly. (I've also tried --gpu 0.8, 0.4, etc.)
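For reference, the training command I'm running looks roughly like this (the cfg, weights, and data paths below are placeholders for my own files):

```shell
# darkflow training invocation: --gpu is the fraction of GPU memory
# TensorFlow is allowed to claim for this process.
flow --model cfg/tiny-yolo-voc.cfg --load bin/tiny-yolo-voc.weights \
     --train --annotation train/annotations --dataset train/images \
     --gpu 1.0
```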

Any idea how to fix it?

Upvotes: 1

Views: 1521

Answers (2)

Danh Nguyen

Reputation: 318

It looks like your custom model uses too much memory and the graphics card cannot support it. You just need to use the --batch option to reduce the batch size, which controls how much memory is used.
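GPU memory usage grows roughly linearly with batch size, so halving --batch roughly halves the memory needed per training step. A quick back-of-envelope sketch (the sizes are illustrative, assuming YOLO's common 416x416 float32 input; real usage is dominated by layer activations, but those scale with batch size the same way):

```python
# Rough, illustrative estimate: bytes needed just to hold one batch of
# input images. Layer activations and gradients add much more on top,
# but they also scale linearly with the batch dimension.
def batch_input_bytes(batch, height=416, width=416, channels=3, dtype_bytes=4):
    return batch * height * width * channels * dtype_bytes

for b in (16, 8, 4):
    print(f"--batch {b:2d}: ~{batch_input_bytes(b) / 2**20:.1f} MiB of input tensors")
```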

Upvotes: 0

Diver

Reputation: 11

Problem solved. Changing the batch size and image size in the config file didn't seem to help, as they weren't loaded correctly. I had to go into the defaults.py file and lower them there so that my GPU could handle the training steps.
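As background on why tuning the --gpu fraction alone wasn't enough: as I understand it, that flag only caps how much GPU memory TensorFlow may claim, roughly equivalent to this TF 1.x session configuration (a sketch, not darkflow's actual code) — it doesn't shrink the model itself, which is what lowering the batch and image size does:

```python
import tensorflow as tf  # TF 1.x API

# Cap this process at ~80% of GPU memory and let TensorFlow grow
# allocations lazily instead of grabbing everything up front.
config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.8
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
```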

Upvotes: 0
