Reputation: 13
We can train deep learning models on a GPU. I know how to enable an NVIDIA graphics card for training; I just want to ask how we can use an AMD Radeon graphics card to train deep learning models in a Jupyter Notebook.
Upvotes: 1
Views: 3339
Reputation: 253
Usually, deep learning models are run on NVIDIA graphics cards because of their support for CUDA and cuDNN.
As far as I know, ROCm, AMD's GPU compute platform, now supports TensorFlow, Caffe, MXNet, and other frameworks.
You can try this platform (ROCm), but according to experiments some users have run, training speed and model performance are not as good as on an NVIDIA GPU.
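A minimal sketch of how this could look in a Jupyter Notebook, assuming a ROCm-supported AMD GPU and the tensorflow-rocm package installed (`pip install tensorflow-rocm`); the device check below is standard TensorFlow and works the same way on CUDA:

```python
import tensorflow as tf

# List the GPUs TensorFlow can see; on a working ROCm install the
# AMD card shows up just like a CUDA device would.
gpus = tf.config.list_physical_devices('GPU')
print("GPUs visible to TensorFlow:", gpus)

if gpus:
    # Place a small computation on the GPU to confirm it is usable.
    with tf.device('/GPU:0'):
        a = tf.random.normal((1024, 1024))
        b = tf.random.normal((1024, 1024))
        c = tf.matmul(a, b)
    print("Matrix multiply ran on:", c.device)
```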
As an alternative, you can use Google Colab, which provides free access to NVIDIA GPUs.
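In Colab, you switch the runtime in the UI (Runtime → Change runtime type → GPU) and can then verify the device from the notebook, for example:

```python
import tensorflow as tf

# Prints something like '/device:GPU:0' when a GPU runtime is attached,
# or an empty string otherwise.
print(tf.test.gpu_device_name())
```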
Upvotes: 2