Reputation: 53916
This question is related to Will scikit-learn utilize GPU? but I don't think it offers the same answer. I'm executing scikit-learn algorithms against an Nvidia GPU without error, so I assume scikit-learn is running on the underlying hardware. As scikit-learn is not designed to execute on a GPU, what is the process that enables the algorithms to run?
For example, I'm executing scikit-learn algorithms using a Gigabyte Nvidia GTX 1060 WF2 3GB GDDR5 PCI-E card with these specs:
1152 NVIDIA CUDA Cores
1582MHz Base/1797MHz Boost Clock (OC Mode) or 1556MHz Base/1771MHz Boost Clock (Gaming Mode)
3GB GDDR5 8008MHz Memory
When running scikit-learn, are these CUDA cores simply not being used?
Update:
I use the Nvidia Docker container to run containers on the GPU, as described at https://github.com/NVIDIA/nvidia-docker. I've installed scikit-learn in this container, so are the scikit-learn algorithms being executed on the GPU?
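A minimal sketch for testing this (the estimator and dataset below are illustrative, not from my actual workload): run a training job inside the container and watch nvidia-smi in a second terminal to see whether GPU utilization rises during the fit.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Illustrative workload: a moderately heavy fit to observe while monitoring
# GPU utilization with `nvidia-smi` in another terminal.
X, y = make_classification(n_samples=20000, n_features=40, random_state=0)
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)  # n_jobs parallelizes over CPU cores
clf.fit(X, y)
print(clf.score(X, y))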
Upvotes: 1
Views: 11524
Reputation: 492
From my experience, I use the Intel(R) Extension for Scikit-learn (sklearnex) package to utilize the GPU for some sklearn algorithms.
The code I use:
from sklearnex import patch_sklearn
from daal4py.oneapi import sycl_context

patch_sklearn()  # replaces supported scikit-learn estimators with Intel-optimized versions
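For completeness, a minimal usage sketch following the linked Intel documentation (the DBSCAN estimator and the sample data are illustrative, not part of my original code): estimators fitted inside a sycl_context("gpu") block are offloaded to a SYCL-capable GPU.

import numpy as np
from sklearnex import patch_sklearn
from daal4py.oneapi import sycl_context

patch_sklearn()  # patch before importing the estimator

from sklearn.cluster import DBSCAN  # resolves to the patched implementation

X = np.array([[1., 2.], [2., 2.], [2., 3.],
              [8., 7.], [8., 8.], [25., 80.]])

# Computations inside this context run on the GPU
with sycl_context("gpu"):
    clustering = DBSCAN(eps=3.0, min_samples=2).fit(X)
print(clustering.labels_)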
Source: oneAPI and GPU support in Intel(R) Extension for Scikit-learn
Upvotes: 3
Reputation: 1565
scikit-learn does not and cannot run on the GPU. See this answer in the scikit-learn FAQ.
Upvotes: 4