Reputation: 69
I followed the instructions here. I have installed all of the packages from http://www.lfd.uci.edu/~gohlke/pythonlibs/ (all the latest ones).
The installation seems to have succeeded. I ran the code below in IPython:
import pycuda.gpuarray as gpuarray
import pycuda.driver as cuda
import pycuda.autoinit
import numpy
a_gpu = gpuarray.to_gpu(numpy.random.randn(4,4).astype(numpy.float32))  ## passes
a_doubled = (2*a_gpu).get()  ## this line fails in IPython
and got this error:
File "C:\Python27\lib\site-packages\pycuda\compiler.py", line 137, in compile_plain
lcase_err_text = (stdout+stderr).decode("utf-8").lower() File "C:\Python27\lib\encodings\utf_8.py", line 16, in decode return codecs.utf_8_decode(input, errors, True) UnicodeDecodeError: 'utf8' codec can't decode byte 0xb8 in position 109: invalid start byte
How can I solve this issue? I have been struggling with it for several days.
Upvotes: 1
Views: 617
Reputation: 72349
This appears to be caused by an error handling issue inside PyCUDA: when the compiler's output contains bytes that cannot be decoded as UTF-8, the decode on line 137 of compiler.py itself raises, masking the real message. The bug was fixed in late 2013 and the fix should be included in the PyCUDA 2014.1 release, so upgrading should resolve it.
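A minimal sketch of the failure mode, not PyCUDA's actual code path: the assumption here is that on a localized Windows system nvcc writes its messages in the ANSI codepage (GBK is used as an example, and the message text is hypothetical), so the bytes it emits are not valid UTF-8.

# -*- coding: utf-8 -*-
# Hypothetical nvcc output in a non-UTF-8 codepage (GBK as an example);
# byte values like 0xb4/0xb8 are invalid UTF-8 start bytes.
stderr = u"错误：无法打开文件".encode("gbk")

try:
    stderr.decode("utf-8")  # what pycuda/compiler.py line 137 attempted
except UnicodeDecodeError as e:
    print(e)  # "invalid start byte", the same error class as above

# A lenient decode avoids the crash (at the cost of mangled characters):
print(stderr.decode("utf-8", "replace").lower())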
[This answer was added as a community wiki entry to get this question off the unanswered list for the CUDA and PyCUDA tags]
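For anyone hitting this, one quick sanity check after upgrading is to confirm the installed PyCUDA is 2014.1 or later; this sketch assumes the pycuda.VERSION / pycuda.VERSION_TEXT attributes shipped in releases of that era:

import pycuda
print(pycuda.VERSION_TEXT)  # expect "2014.1" or later
assert pycuda.VERSION >= (2014, 1), "upgrade PyCUDA to pick up the fix"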
Upvotes: 1