Reputation: 119
Has anyone had luck installing GPU support for lightgbm on Google Colab using the notebooks there?
Upvotes: 8
Views: 10264
Reputation: 2670
Updated answer for lightgbm>=4.0.0 (released in July 2023)

lightgbm (the Python package for LightGBM) comes with GPU support already included. As long as you are on a Windows or Linux system where a wheel is available (as on Google Colab as of this writing), no re-installation or manual compilation is required.
In your Colab notebook, select Edit -> Notebook Settings and choose T4 GPU (or whatever other NVIDIA GPU is available).
Run the following in a notebook cell to tell LightGBM to use the NVIDIA OpenCL installable client driver (ICD). For more details on what that is, see the OpenCL docs (link).
!mkdir -p /etc/OpenCL/vendors && echo "libnvidia-opencl.so.1" > /etc/OpenCL/vendors/nvidia.icd
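For intuition, the ICD mechanism is just a directory of one-line text files, each naming a vendor's OpenCL library. A stdlib-only sketch of that layout (it writes to a temporary directory rather than `/etc`, purely for illustration):

```python
import tempfile
from pathlib import Path

# Mimic what the shell one-liner above does: an .icd file is a one-line
# text file naming the vendor's OpenCL shared library.
vendors = Path(tempfile.mkdtemp()) / "OpenCL" / "vendors"
vendors.mkdir(parents=True)
(vendors / "nvidia.icd").write_text("libnvidia-opencl.so.1\n")

# The OpenCL ICD loader enumerates files like these at runtime to
# discover which vendor implementations are installed.
found = [p.read_text().strip() for p in vendors.glob("*.icd")]
print(found)  # ['libnvidia-opencl.so.1']
```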
Then use lightgbm with GPU support by passing "device": "gpu", for example like this:
import lightgbm as lgb
from sklearn.datasets import make_regression
X, y = make_regression(n_samples=10_000)
dtrain = lgb.Dataset(X, label=y)
bst = lgb.train(
    params={
        "objective": "regression",
        "device": "gpu",
        "verbose": 1
    },
    train_set=dtrain,
    num_boost_round=5
)
You will see something like the following in the logs, confirming that the GPU is being used for training:
[LightGBM] [Info] This is the GPU trainer!!
[LightGBM] [Info] Total Bins 25500
[LightGBM] [Info] Number of data points in the train set: 10000, number of used features: 100
[LightGBM] [Info] Using GPU Device: Tesla T4, Vendor: NVIDIA Corporation
[LightGBM] [Info] Compiling OpenCL Kernel with 256 bins...
[LightGBM] [Info] GPU programs have been built
[LightGBM] [Info] Size of histogram bin entry: 8
[LightGBM] [Info] 100 dense feature groups (0.95 MB) transferred to GPU in 0.001878 secs. 0 sparse feature groups
[LightGBM] [Info] Start training from score 1.025020
NOTE: as of v4.0.0 of LightGBM, there is also a second way to use GPU acceleration for NVIDIA GPUs. Recompile lightgbm with CUDA support by running the following:
!pip install \
--force-reinstall \
--no-binary lightgbm \
--config-settings=cmake.define.USE_CUDA=ON \
lightgbm
After restarting the notebook runtime, you can use the CUDA version by passing {"device": "cuda"} in the parameters, like this:
import lightgbm as lgb
from sklearn.datasets import make_regression
X, y = make_regression(n_samples=10_000)
dtrain = lgb.Dataset(X, label=y)
bst = lgb.train(
    params={
        "objective": "regression",
        "device": "cuda",
        "verbose": 1
    },
    train_set=dtrain,
    num_boost_round=5
)
Upvotes: 3
Reputation: 450
Make sure you followed the installation steps correctly:
!git clone --recursive https://github.com/Microsoft/LightGBM
%cd LightGBM
!mkdir build
%cd build
!cmake -DUSE_GPU=1 ..
!make -j4
After this, you have to execute the setup file in the LightGBM python-package folder:
%cd ../python-package
!python3 setup.py install --gpu
Once that's done, you're all set. PS: make sure you have cmake installed; if not, just run
!pip install cmake
Upvotes: 5
Reputation: 816
Very simple: just run
!pip install lightgbm --install-option=--gpu
or
!pip install lightgbm --install-option=--gpu --install-option="--opencl-include-dir=/usr/local/cuda/include/" --install-option="--opencl-library=/usr/local/cuda/lib64/libOpenCL.so"
Remember to enable GPU support in your notebook and add 'device': 'gpu' to the lightgbm parameters. And don't forget to first uninstall any version of lightgbm that doesn't support the GPU.
Upvotes: 6
Reputation: 119
Most of it was following the documentation provided here, with two small tweaks to make it work on Google Colab.
Since the instances are renewed after 12 hours of usage, I put this at the beginning of my notebook to reinstall lightgbm with GPU support:
!apt-get -qq install --no-install-recommends nvidia-375
!apt-get -qq install --no-install-recommends nvidia-opencl-icd-375 nvidia-opencl-dev opencl-headers
!apt-get -qq install --no-install-recommends git cmake build-essential libboost-dev libboost-system-dev libboost-filesystem-dev
!pip3 install -qq lightgbm --install-option=--gpu
Upvotes: 2