Reputation: 3166
I get this error when trying to load a saved SVM model. I have tried uninstalling sklearn, NumPy and SciPy, and reinstalling the latest versions all together again (using pip), but I am still getting this error. Why?
In [1]: import sklearn; print sklearn.__version__
0.18.1
In [3]: import numpy; print numpy.__version__
1.11.2
In [5]: import scipy; print scipy.__version__
0.18.1
In [7]: import pandas; print pandas.__version__
0.19.1
In [10]: clf = joblib.load('model/trained_model.pkl')
---------------------------------------------------------------------------
RuntimeWarning Traceback (most recent call last)
<ipython-input-10-5e5db1331757> in <module>()
----> 1 clf = joblib.load('sentiment_classification/model/trained_model.pkl')
/usr/local/lib/python2.7/dist-packages/sklearn/externals/joblib/numpy_pickle.pyc in load(filename, mmap_mode)
573 return load_compatibility(fobj)
574
--> 575 obj = _unpickle(fobj, filename, mmap_mode)
576
577 return obj
/usr/local/lib/python2.7/dist-packages/sklearn/externals/joblib/numpy_pickle.pyc in _unpickle(fobj, filename, mmap_mode)
505 obj = None
506 try:
--> 507 obj = unpickler.load()
508 if unpickler.compat_mode:
509 warnings.warn("The file '%s' has been generated with a "
/usr/lib/python2.7/pickle.pyc in load(self)
862 while 1:
863 key = read(1)
--> 864 dispatch[key](self)
865 except _Stop, stopinst:
866 return stopinst.value
/usr/lib/python2.7/pickle.pyc in load_global(self)
1094 module = self.readline()[:-1]
1095 name = self.readline()[:-1]
-> 1096 klass = self.find_class(module, name)
1097 self.append(klass)
1098 dispatch[GLOBAL] = load_global
/usr/lib/python2.7/pickle.pyc in find_class(self, module, name)
1128 def find_class(self, module, name):
1129 # Subclasses may override this
-> 1130 __import__(module)
1131 mod = sys.modules[module]
1132 klass = getattr(mod, name)
/usr/local/lib/python2.7/dist-packages/sklearn/svm/__init__.py in <module>()
11 # License: BSD 3 clause (C) INRIA 2010
12
---> 13 from .classes import SVC, NuSVC, SVR, NuSVR, OneClassSVM, LinearSVC, \
14 LinearSVR
15 from .bounds import l1_min_c
/usr/local/lib/python2.7/dist-packages/sklearn/svm/classes.py in <module>()
2 import numpy as np
3
----> 4 from .base import _fit_liblinear, BaseSVC, BaseLibSVM
5 from ..base import BaseEstimator, RegressorMixin
6 from ..linear_model.base import LinearClassifierMixin, SparseCoefMixin, \
/usr/local/lib/python2.7/dist-packages/sklearn/svm/base.py in <module>()
6 from abc import ABCMeta, abstractmethod
7
----> 8 from . import libsvm, liblinear
9 from . import libsvm_sparse
10 from ..base import BaseEstimator, ClassifierMixin
__init__.pxd in init sklearn.svm.libsvm (sklearn/svm/libsvm.c:10207)()
RuntimeWarning: numpy.dtype size changed, may indicate binary incompatibility. Expected 96, got 80
UPDATE: OK, by following the suggestion here and running
pip uninstall -y scipy scikit-learn
pip install --no-binary scipy,scikit-learn scipy scikit-learn
The error has now gone, though I still have no idea why it occurred in the first place...
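A quick way to confirm the rebuild actually fixed it (a sketch; it reuses the model path from above and turns the warning into an error) is:
# Run in a fresh interpreter so the sklearn.svm import happens inside the guarded block.
import warnings
from sklearn.externals import joblib  # joblib is bundled with sklearn 0.18.x

with warnings.catch_warnings():
    # Escalate the binary-incompatibility warning to an exception so a bad
    # rebuild fails loudly instead of just printing a warning.
    warnings.simplefilter("error", RuntimeWarning)
    clf = joblib.load('model/trained_model.pkl')

print(type(clf))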
Upvotes: 190
Views: 247408
Reputation: 2788
I had the same issue in my context. Someone was installing a certain module:
pip install biopython==1.81
In the old CI (Docker container build) logs, I saw that this used to implicitly pull in numpy as a transitive dependency:
Downloading numpy-1.26.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)
Recently, for whatever reason, it started downloading
Downloading numpy-2.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (19.5 MB)
Probably the dependency spec in biopython is too loose. In any case, this then breaks at runtime with the OP's error.
Anyway, I solved it by pinning numpy to the old version:
pip install biopython==1.81 numpy==1.26.4
and another day where everything breaks all the time and needs to be fixed all the time ends :)
Upvotes: 0
Reputation: 27980
We are getting this error:
ValueError: numpy.dtype size changed, may indicate binary incompatibility.
with the latest release of numpy 2.0 as well.
https://github.com/numpy/numpy/releases
We expect that some modules will need time to support NumPy 2
Until then:
pip install numpy==1.26.4
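If you cannot pin right away, a fail-fast guard at start-up makes the mismatch obvious instead of failing deep inside an import (a sketch; the suggested bound simply mirrors the pin above):
# Abort early if the environment resolved to NumPy 2.x, which the compiled
# packages in this environment are not yet built against.
import numpy

if int(numpy.__version__.split(".")[0]) >= 2:
    raise RuntimeError(
        "NumPy %s found; pin numpy==1.26.4 (or another 1.x release) until "
        "the compiled dependencies ship NumPy 2 compatible wheels"
        % numpy.__version__
    )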
Upvotes: 24
Reputation: 97
If you are in an Anaconda environment, simply use:
conda update --all
or:
conda update numpy
Upvotes: 7
Reputation: 97
Just upgrade your numpy module; right now it is 1.15.4. For Windows, use:
pip install numpy --upgrade
Upvotes: 8
Reputation: 2573
Note that as of Cython 0.29 there is a new check_size option that eliminates the warning at the source, so no workarounds should be needed once that version percolates to the various packages.
Upvotes: 0
Reputation: 738
This is an issue with the new numpy version (1.15.0).
You can downgrade numpy and this problem will be fixed:
sudo pip uninstall numpy
sudo pip install numpy==1.14.5
Finally, numpy 1.15.1 has been released, so the warning issues are fixed there:
sudo pip install numpy==1.15.1
This works.
Upvotes: 40
Reputation: 38307
Meta-information: The recommended way to install sklearn
If you already have a working installation of numpy and scipy, the easiest way to install scikit-learn is using pip:
pip install -U scikit-learn
or conda:
conda install scikit-learn
[... do not compile from source using pip]
If you don’t already have a python installation with numpy and scipy, we recommend to install either via your package manager or via a python bundle. These come with numpy, scipy, scikit-learn, matplotlib and many other helpful scientific and data processing libraries.
Upvotes: 0
Reputation: 182
I tried the above-mentioned ways, but nothing worked. The issue only went away after I installed the libraries through apt install.
For Python 3:
pip3 uninstall -y numpy scipy pandas scikit-learn
sudo apt update
sudo apt install python3-numpy python3-scipy python3-pandas python3-sklearn
For Python 2:
pip uninstall -y numpy scipy pandas scikit-learn
sudo apt update
sudo apt install python-numpy python-scipy python-pandas python-sklearn
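To check that the interpreter is now picking up the distro packages rather than leftover pip copies, something like this can help (a sketch; the paths are the usual Debian/Ubuntu locations):
# apt packages land under /usr/lib/pythonX.Y/dist-packages (or
# /usr/lib/python3/dist-packages), while pip installs typically go to
# /usr/local/lib/pythonX.Y/dist-packages.
import numpy
import sklearn

print(numpy.__file__)    # expect a path under /usr/lib/...
print(sklearn.__file__)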
Hope that helps.
Upvotes: 9
Reputation: 36126
According to MAINT: silence Cython warnings about changes dtype/ufunc size. - numpy/numpy:
These warnings are visible whenever you import scipy (or another package) that was compiled against an older numpy than is installed.
and the checks are inserted by Cython (hence are present in any module compiled with it).
Long story short, these warnings should be benign in the particular case of numpy, and these messages have been filtered out since numpy 1.8 (the branch this commit went onto), while scikit-learn 0.18.1 is compiled against numpy 1.6.1.
To filter these warnings yourself, you can do the same as the patch does:
import warnings
warnings.filterwarnings("ignore", message="numpy.dtype size changed")
warnings.filterwarnings("ignore", message="numpy.ufunc size changed")
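Note that the filters have to be in place before the compiled module is first imported, because the warning is emitted at import time (a minimal sketch):
import warnings

# Register the filters first...
warnings.filterwarnings("ignore", message="numpy.dtype size changed")
warnings.filterwarnings("ignore", message="numpy.ufunc size changed")

# ...and only then import the compiled modules; in the traceback above it is
# the import of sklearn.svm that triggers the warning.
from sklearn.externals import joblib
import sklearn.svm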
Of course, you can just recompile all affected modules from source against your local numpy with pip install --no-binary :all:¹ instead, if you have the tools for that.
Longer story: the patch's proponent claims there should be no risk specifically with numpy, and 3rd-party packages are intentionally built against older versions:
[Rebuilding everything against current numpy is] not a feasible solution, and certainly shouldn't be necessary. Scipy (as many other packages) is compatible with a number of versions of numpy. So when we distribute scipy binaries, we build them against the lowest supported numpy version (1.5.1 as of now) and they work with 1.6.x, 1.7.x and numpy master as well.
The real correct fix would be for Cython to only issue warnings when the size of dtypes/ufuncs has changed in a way that breaks the ABI, and be silent otherwise.
As a result, Cython's devs agreed to trust the numpy team with maintaining binary compatibility by hand, so we can probably expect that using versions with breaking ABI changes would yield a specially-crafted exception or some other explicit show-stopper.
¹ The previously available --no-use-wheel option has been removed since pip 10.0.0.
Upvotes: 171
Reputation: 3009
This error occurs because the installed packages were built against a different version of numpy.
We need to rebuild scipy and scikit-learn against the local numpy.
For new pip (in my case pip 18.0) this worked:
pip uninstall -y scipy scikit-learn
pip install --no-binary scipy,scikit-learn -I scipy scikit-learn
--no-binary takes a list of names of packages for which you want to ignore binaries. In this case we passed --no-binary scipy,scikit-learn, which ignores binaries for the packages scipy and scikit-learn.
Upvotes: 1
Reputation: 1
My environment is Python 2.7.15.
I tried
pip uninstall
pip install --no-use-wheel
but it did not work. It showed the error:
no such option: --no-use-wheel
Then I tried:
pip uninstall
pip install --user --install-option="--prefix=" -U scikit-learn
And it worked: the useless warnings no longer show.
Upvotes: -4
Reputation: 11
When importing scipy, the error shows: RuntimeWarning: builtin.type size changed, may indicate binary incompatibility. Expected zd, got zd
I solved this problem by updating Python from 2.7.2 to 2.7.13.
Upvotes: -6