
Reputation: 163

How to verify Tensorflow Serving is using GPUs on a GPU instance?

While running TensorFlow Serving, how can I verify that it is using GPUs for serving? I configured TensorFlow to use GPUs during ./configure.

I tried monitoring nvidia-smi while running inference, but it shows "No running processes found".

Upvotes: 1

Views: 1290

Answers (1)

aforwardz

Reputation: 3389

First, of course, you need to enable CUDA when you run ./configure.

Second, you should compile TF Serving with

bazel build -c opt --config=cuda tensorflow/...

and

bazel build -c opt --config=cuda --spawn_strategy=standalone //tensorflow_serving/model_servers:tensorflow_model_server

Lastly, if the server is using the GPU, you will see log output like this when you serve the model:

I external/org_tensorflow/tensorflow/core/common_runtime/gpu/gpu_device.cc:965] Found device 0 with properties:
name: GeForce GTX 1070 major: 6 minor: 1 memoryClockRate(GHz): 1.721
pciBusID: 0000:01:00.0
totalMemory: 7.92GiB freeMemory: 7.76GiB
I external/org_tensorflow/tensorflow/core/common_runtime/gpu/gpu_device.cc:1055] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: GeForce GTX 1070, pci bus id: 0000:01:00.0, compute capability: 6.1)
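The device-creation messages above can also be checked programmatically, for example by scanning the server's log output. A minimal sketch in Python (the helper name and regex are mine, not part of TF Serving; the sample string is the log line shown above):

```python
import re

def gpu_devices_in_log(log_text):
    """Return the GPU device names that TF Serving reported creating,
    by matching lines like:
    '... Creating TensorFlow device (/device:GPU:0) -> (device: 0, ...)'
    """
    return re.findall(r"Creating TensorFlow device \((/device:GPU:\d+)\)", log_text)

sample = ("I external/org_tensorflow/tensorflow/core/common_runtime/gpu/"
          "gpu_device.cc:1055] Creating TensorFlow device (/device:GPU:0) -> "
          "(device: 0, name: GeForce GTX 1070, pci bus id: 0000:01:00.0, "
          "compute capability: 6.1)")
print(gpu_devices_in_log(sample))  # ['/device:GPU:0']
```

An empty list means no GPU device was created, which usually indicates a CPU-only build or a CUDA setup problem.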

and check nvidia-smi at the same time:

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0      1215      G   /usr/lib/xorg/Xorg                           59MiB  |
|    0      7341      C   ...ing/model_servers/tensorflow_model_server 7653MiB|
+-----------------------------------------------------------------------------+

The tensorflow_model_server process now appears with substantial GPU memory usage, which confirms the server is using the GPU.
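The same check can be scripted instead of reading the full table, using nvidia-smi's CSV query mode (the `--query-compute-apps` and `--format` flags are standard nvidia-smi options). A minimal sketch; the parsing helper is mine, and the sample output below is illustrative rather than captured from a real run:

```python
def model_server_gpu_usage(csv_text):
    """Parse the output of
    `nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv,noheader`
    and return (pid, process_name, used_memory) rows whose process name
    mentions tensorflow_model_server."""
    rows = []
    for line in csv_text.strip().splitlines():
        pid, name, mem = [field.strip() for field in line.split(",")]
        if "tensorflow_model_server" in name:
            rows.append((int(pid), name, mem))
    return rows

# Illustrative sample; on a live machine you would capture the real output:
#   nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv,noheader
sample = "7341, .../model_servers/tensorflow_model_server, 7653 MiB\n"
print(model_server_gpu_usage(sample))
```

If this returns no rows while the server is handling requests, the server process is not registered with the GPU at all, matching the "No running processes found" symptom in the question.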

Upvotes: 1
