Reputation: 2491
Importing from pyxdameraulevenshtein gives the following error. I have:
pyxdameraulevenshtein==1.5.3
pandas==1.1.4
scikit-learn==0.20.2
numpy==1.16.1
It works well in Python 3.6 but fails in Python 3.7. Has anyone faced similar issues with Python 3.7 (3.7.9), docker image python:3.7-buster?
from pyxdameraulevenshtein import normalized_damerau_levenshtein_distance as norm_dl_dist
__init__.pxd:242: in init pyxdameraulevenshtein
???
E ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
Upvotes: 239
Views: 385930
Reputation: 41206
This is a more detailed version of [SO]: ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject - (@FZeiser's answer) (great answer BTW).
In short, it's an ABI change (in NumPy).
As a consequence, under no circumstances should two versions (one with and one without the change) be mixed together.
The error message (misleading for some users) comes from PyxDamerauLevenshtein.
[NumPy]: NumPy 1.20.0 Release Notes - Size of np.ndarray and np.void_ changed states:
The size of the PyArrayObject and PyVoidScalarObject structures have changed. The following header definition has been removed:
#define NPY_SIZEOF_PYARRAYOBJECT (sizeof(PyArrayObject_fields))
since the size must not be considered a compile time constant: it will change for different runtime versions of NumPy.
The most likely relevant use are potential subclasses written in C which will have to be recompiled and should be updated. Please see the documentation for PyArrayObject for more details and contact the NumPy developers if you are affected by this change.
NumPy will attempt to give a graceful error but a program expecting a fixed structure size may have undefined behaviour and likely crash.
[GitHub]: numpy/numpy - (v1.19.5) numpy/numpy/core/include/numpy/ndarraytypes.h#726:
#define NPY_SIZEOF_PYARRAYOBJECT (sizeof(PyArrayObject_fields))
is the last one that has it; in subsequent versions, starting with v1.20.0 (and its preceding RCs), the line is commented out.
Also worth mentioning here that between the same versions, a new member has been added to (the end of) PyArrayObject_fields structure (~20 lines above):
void *_buffer_info; /* private buffer info, tagged to allow warning */
The numbers in the error message (80 and 88) are beginning to make some sense.
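These struct sizes can also be inspected from Python, since every type exposes the size of its underlying C structure as __basicsize__. A quick check (the 80 vs. 88 values assume a pc064 build; exact numbers vary by platform and NumPy version):

```python
import numpy as np

# tp_basicsize of the ndarray type: the size in bytes of the C-level
# PyArrayObject structure, as seen by the NumPy loaded at runtime.
# On pc064 builds this is 80 for NumPy <= v1.19.5 and 88 for NumPy >= v1.20.0
# (the difference being the appended void *_buffer_info member).
print(np.__version__, np.ndarray.__basicsize__)
```

The import-time check in Cython-generated modules compares their compile-time constant against this runtime value.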
There's a boundary between the (above) 2 versions: anything built with one version and run with the other is an unsupported scenario. It's just like misusing any other library: building with (headers and the .so (.lib) from) one version and running with (the .so (.dll) from) another, when the 2 versions are API / ABI incompatible.
2 cases emerge:
Build: v >= v1.20.0, Run: v <= v1.19.5: unsupported
Build: v <= v1.19.5, Run: v >= v1.20.0: technically also unsupported, but since there's no more value computed at compile (preprocess) time ([SO]: LNK2005 Error in CLR Windows Form (@CristiFati's answer)), it would work
[GitHub]: numpy/numpy - ENH,API: Store exported buffer info on the array (1) is the commit that introduced the changes.
[GitHub]: lanl/pyxDamerauLevenshtein - pyxDamerauLevenshtein is an extension module (C + Cython) that depends on NumPy. The source code (indirectly) includes ndarraytypes.h.
Browsing [PyPI]: pyxDamerauLevenshtein (latest versions at least) there's only one binary: for OSX pc064, and one Python version (probably the LTS at package publish time).
As a consequence, on all other platforms it is built from source when installed via pip.
For simplicity's sake, I will use v1.5.3 (as in the question).
As a side note, although setup.py requires NumPy >= v1.16.1, there is a requirements.txt file that pins v1.16.1 as a dependency (which is a bit misleading, especially on a shallow investigation).
The error message was added in v1.5.1 (specifically for this NumPy ABI change), and it was then removed in v1.7.0 when trying to move away from depending on NumPy at build time. I suppose it is because of the nightmare generated by the frequency of running into this situation (other packages depending on specific NumPy versions).
Removal commit: [GitHub]: lanl/pyxDamerauLevenshtein - first attempt of moving away from NumPy.
Once the theory is figured out, it's not hard to reproduce the problem.
Steps:
Install NumPy <= v1.19.5 (v1.19.5)
Install PyxDamerauLevenshtein v1.5.3
I prepared my "demo" on Win, as it happens to be the OS that I currently booted into.
There are also some additional commands to capture the state (at several points) in the process. Also some (unimportant) output will be stripped out:
(py_pc064_03.07_test1_q066060487) [cfati@CFATI-5510-0:e:\Work\Dev\StackOverflow\q066060487]> :: Python 3.7 console (py_pc064_03.07_test1_q066060487) [cfati@CFATI-5510-0:e:\Work\Dev\StackOverflow\q066060487]> sopr.bat ### Set shorter prompt to better fit when pasted in StackOverflow (or other) pages ### [prompt]> [prompt]> python -c "import sys;print(\"\n\".join((sys.executable, sys.version)))" E:\Work\Dev\VEnvs\py_pc064_03.07_test1_q066060487\Scripts\python.exe 3.7.9 (tags/v3.7.9:13c94747c7, Aug 17 2020, 18:58:18) [MSC v.1900 64 bit (AMD64)] [prompt]> [prompt]> python -m pip freeze [prompt]> [prompt]> python -m pip install numpy==1.19.5 Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com Collecting numpy==1.19.5 Downloading numpy-1.19.5-cp37-cp37m-win_amd64.whl (13.2 MB) ---------------------------------------- 13.2/13.2 MB 8.5 MB/s eta 0:00:00 Installing collected packages: numpy Successfully installed numpy-1.19.5 [prompt]> [prompt]> python -m pip freeze numpy==1.19.5 [prompt]> [prompt]> :: Pass -v to see what is actually going on. 
I`ll truncate some output [prompt]> [prompt]> python -m pip install -v pyxdameraulevenshtein==1.5.3 Using pip 22.3.1 from E:\Work\Dev\VEnvs\py_pc064_03.07_test1_q066060487\lib\site-packages\pip (python 3.7) Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com Collecting pyxdameraulevenshtein==1.5.3 Downloading pyxDamerauLevenshtein-1.5.3.tar.gz (58 kB) ---------------------------------------- 58.5/58.5 kB 1.0 MB/s eta 0:00:00 Running command pip subprocess to install build dependencies Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com, https://pypi.ngc.nvidia.com Collecting setuptools>=40.8.0 Downloading setuptools-66.0.0-py3-none-any.whl (1.3 MB) ---------------------------------------- 1.3/1.3 MB 5.0 MB/s eta 0:00:00 Collecting wheel>=0.33.1 Downloading wheel-0.38.4-py3-none-any.whl (36 kB) Collecting numpy>=1.16.1 Downloading numpy-1.21.6-cp37-cp37m-win_amd64.whl (14.0 MB) ---------------------------------------- 14.0/14.0 MB 9.2 MB/s eta 0:00:00 Installing collected packages: wheel, setuptools, numpy # @TODO - cfati: !!! Check NumPy version used at build time !!! Successfully installed numpy-1.21.6 setuptools-66.0.0 wheel-0.38.4 Installing build dependencies ... done Running command Getting requirements to build wheel running egg_info # @TODO - cfati: Truncated output Preparing metadata (pyproject.toml) ... 
done Requirement already satisfied: numpy>=1.16.1 in e:\work\dev\venvs\py_pc064_03.07_test1_q066060487\lib\site-packages (from pyxdameraulevenshtein==1.5.3) (1.19.5) Building wheels for collected packages: pyxdameraulevenshtein Running command Building wheel for pyxdameraulevenshtein (pyproject.toml) running bdist_wheel running build running build_ext building 'pyxdameraulevenshtein' extension creating build creating build\temp.win-amd64-cpython-37 creating build\temp.win-amd64-cpython-37\Release creating build\temp.win-amd64-cpython-37\Release\pyxdameraulevenshtein C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\bin\HostX86\x64\cl.exe /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\cfati\AppData\Local\Temp\pip-build-env-o_18jg2s\overlay\Lib\site-packages\numpy\core\include -IE:\Work\Dev\VEnvs\py_pc064_03.07_test1_q066060487\include -Ic:\Install\pc064\Python\Python\03.07\include -Ic:\Install\pc064\Python\Python\03.07\Include -IC:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\include -IC:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\ATLMFC\include -IC:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Auxiliary\VS\include "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22000.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\um" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\shared" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\winrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" /Tcpyxdameraulevenshtein/pyxdameraulevenshtein.c /Fobuild\temp.win-amd64-cpython-37\Release\pyxdameraulevenshtein/pyxdameraulevenshtein.obj pyxdameraulevenshtein.c C:\Users\cfati\AppData\Local\Temp\pip-build-env-o_18jg2s\overlay\Lib\site-packages\numpy\core\include\numpy\npy_1_7_deprecated_api.h(14) : Warning Msg: Using deprecated NumPy 
API, disable it with #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION pyxdameraulevenshtein/pyxdameraulevenshtein.c(2240): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2271): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2358): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2434): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2957): warning C4244: '=': conversion from 'double' to 'float', possible loss of data creating C:\Users\cfati\AppData\Local\Temp\pip-install-t79ijgrr\pyxdameraulevenshtein_4dee1beb9a9542bb89a45fc96b191728\build\lib.win-amd64-cpython-37 C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\bin\HostX86\x64\link.exe /nologo /INCREMENTAL:NO /LTCG /DLL /MANIFEST:EMBED,ID=2 /MANIFESTUAC:NO /LIBPATH:E:\Work\Dev\VEnvs\py_pc064_03.07_test1_q066060487\libs /LIBPATH:c:\Install\pc064\Python\Python\03.07\libs /LIBPATH:c:\Install\pc064\Python\Python\03.07 /LIBPATH:E:\Work\Dev\VEnvs\py_pc064_03.07_test1_q066060487\PCbuild\amd64 /LIBPATH:C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\ATLMFC\lib\x64 /LIBPATH:C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\lib\x64 "/LIBPATH:C:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\lib\um\x64" "/LIBPATH:C:\Program Files (x86)\Windows Kits\10\lib\10.0.22000.0\ucrt\x64" "/LIBPATH:C:\Program Files (x86)\Windows Kits\10\\lib\10.0.22000.0\\um\x64" /EXPORT:PyInit_pyxdameraulevenshtein build\temp.win-amd64-cpython-37\Release\pyxdameraulevenshtein/pyxdameraulevenshtein.obj /OUT:build\lib.win-amd64-cpython-37\pyxdameraulevenshtein.cp37 -win_amd64.pyd 
/IMPLIB:build\temp.win-amd64-cpython-37\Release\pyxdameraulevenshtein\pyxdameraulevenshtein.cp37-win_amd64.lib Creating library build\temp.win-amd64-cpython-37\Release\pyxdameraulevenshtein\pyxdameraulevenshtein.cp37-win_amd64.lib and object build\temp.win-amd64-cpython-37\Release\pyxdameraulevenshtein\pyxdameraulevenshtein.cp37-win_amd64.exp Generating code Finished generating code installing to build\bdist.win-amd64\wheel # @TODO - cfati: Truncated output running install_scripts creating build\bdist.win-amd64\wheel\pyxDamerauLevenshtein-1.5.3.dist-info\WHEEL creating 'C:\Users\cfati\AppData\Local\Temp\pip-wheel-g6cfcj3b\.tmp-n2i1vw8v\pyxDamerauLevenshtein-1.5.3-cp37-cp37m-win_amd64.whl' and adding 'build\bdist.win-amd64\wheel' to it adding 'pyxdameraulevenshtein.cp37-win_amd64.pyd' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/METADATA' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/WHEEL' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/top_level.txt' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/RECORD' removing build\bdist.win-amd64\wheel C:\Users\cfati\AppData\Local\Temp\pip-build-env-o_18jg2s\overlay\Lib\site-packages\wheel\bdist_wheel.py:83: RuntimeWarning: Config variable 'Py_DEBUG' is unset, Python ABI tag may be incorrect if get_flag("Py_DEBUG", hasattr(sys, "gettotalrefcount"), warn=(impl == "cp")): C:\Users\cfati\AppData\Local\Temp\pip-build-env-o_18jg2s\overlay\Lib\site-packages\wheel\bdist_wheel.py:89: RuntimeWarning: Config variable 'WITH_PYMALLOC' is unset, Python ABI tag may be incorrect warn=(impl == "cp" and sys.version_info < (3, 8)), Building wheel for pyxdameraulevenshtein (pyproject.toml) ... 
done Created wheel for pyxdameraulevenshtein: filename=pyxDamerauLevenshtein-1.5.3-cp37-cp37m-win_amd64.whl size=24372 sha256=ced6c506896c3b1d98f8ddd165b4bf8a399287fd9f5543f2398953b479173e86 Stored in directory: C:\Users\cfati\AppData\Local\Temp\pip-ephem-wheel-cache-6epkbo0t\wheels\b4\f5\9e\39cf91e589064ceb8a4db3b6d9b2c7f267af79f9542f2ddbb3 Successfully built pyxdameraulevenshtein Installing collected packages: pyxdameraulevenshtein Successfully installed pyxdameraulevenshtein-1.5.3 [prompt]> [prompt]> :: Existing NumPy version (used at runtime) [prompt]> python -m pip freeze numpy==1.19.5 pyxDamerauLevenshtein==1.5.3 [prompt]> [prompt]> python -c "from pyxdameraulevenshtein import normalized_damerau_levenshtein_distance;print(\"Done.\")" Traceback (most recent call last): File "<string>", line 1, in <module> File "__init__.pxd", line 242, in init pyxdameraulevenshtein ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject [prompt]> [prompt]> python -m pip install numpy==1.20.0 Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com Collecting numpy==1.20.0 Downloading numpy-1.20.0-cp37-cp37m-win_amd64.whl (13.6 MB) ---------------------------------------- 13.6/13.6 MB 7.0 MB/s eta 0:00:00 Installing collected packages: numpy Attempting uninstall: numpy Found existing installation: numpy 1.19.5 Uninstalling numpy-1.19.5: Successfully uninstalled numpy-1.19.5 Successfully installed numpy-1.20.0 [prompt]> [prompt]> python -m pip freeze numpy==1.20.0 pyxDamerauLevenshtein==1.5.3 [prompt]> [prompt]> python -c "from pyxdameraulevenshtein import normalized_damerau_levenshtein_distance;print(\"Done.\")" Done.
The cause is pretty clear (although it is hidden in the verbose output): when building PyxDamerauLevenshtein, pip (silently and temporarily) downloads (and uses for the build) NumPy v1.21.6 (in this case), ignoring the existing installed version. Then, at runtime, v1.19.5 is used, hence the error.
Note: The problem is no longer reproducible with the newest PyxDamerauLevenshtein version (v1.7.1 at answer time).
As mentioned in the question, things seem to work for Python 3.6.
At first, I thought that this part of the commit (#1) was the culprit:
+ if (_buffer_info_free(fa->_buffer_info, (PyObject *)self) < 0) {
+ PyErr_WriteUnraisable(NULL);
+ }
[Python.3.7.Docs]: void PyErr_WriteUnraisable(PyObject *obj) states (for v3.7, but not for v3.6):
An exception must be set when calling this function.
From there, in [GitHub]: python/cpython - (v3.7.0) cpython/Python/errors.c#206, PyErr_GivenExceptionMatches seems to be stricter than in v3.6.15, and that's why no exception is raised when passing NULL.
But I was wrong, things are waaay simpler:
(py_pc064_03.06_test1_q066060487) [cfati@CFATI-5510-0:e:\Work\Dev\StackOverflow\q066060487]> :: Python 3.6 console (py_pc064_03.06_test1_q066060487) [cfati@CFATI-5510-0:e:\Work\Dev\StackOverflow\q066060487]> sopr.bat ### Set shorter prompt to better fit when pasted in StackOverflow (or other) pages ### [prompt]> [prompt]> python -c "import sys;print(\"\n\".join((sys.executable, sys.version)))" e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\Scripts\python.exe 3.6.8 (tags/v3.6.8:3c6b436a57, Dec 24 2018, 00:16:47) [MSC v.1916 64 bit (AMD64)] [prompt]> [prompt]> python -m pip freeze [prompt]> [prompt]> python -m pip install numpy==1.19.5 Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com Collecting numpy==1.19.5 Downloading numpy-1.19.5-cp36-cp36m-win_amd64.whl (13.2 MB) |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 13.2 MB 1.6 MB/s Installing collected packages: numpy Successfully installed numpy-1.19.5 [prompt]> [prompt]> python -m pip freeze numpy==1.19.5 [prompt]> [prompt]> :: Pass -v (again) to see what is actually going on. 
I`ll truncate some output [prompt]> [prompt]> python -m pip install -v pyxdameraulevenshtein==1.5.3 Using pip 21.3.1 from e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\lib\site-packages\pip (python 3.6) Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com Collecting pyxdameraulevenshtein==1.5.3 Downloading pyxDamerauLevenshtein-1.5.3.tar.gz (58 kB) |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 58 kB 1.6 MB/s Running command 'e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\Scripts\python.exe' 'e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\lib\site-packages\pip' install --ignore-installed --no-user --prefix 'C:\Users\cfati\AppData\Local\Temp\pip-build-env-h6hif14f\overlay' --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple --extra-index-url https://pypi.ngc.nvidia.com --trusted-host pypi.ngc.nvidia.com -- 'setuptools>=40.8.0' 'wheel>=0.33.1' 'numpy>=1.16.1' Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com, https://pypi.ngc.nvidia.com Collecting setuptools>=40.8.0 Downloading setuptools-59.6.0-py3-none-any.whl (952 kB) Collecting wheel>=0.33.1 Downloading wheel-0.37.1-py2.py3-none-any.whl (35 kB) Collecting numpy>=1.16.1 Downloading numpy-1.19.5-cp36-cp36m-win_amd64.whl (13.2 MB) Installing collected packages: wheel, setuptools, numpy Successfully installed numpy-1.19.5 setuptools-59.6.0 wheel-0.37.1 Installing build dependencies ... done Running command 'e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\Scripts\python.exe' 'e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py' get_requires_for_build_wheel 'C:\Users\cfati\AppData\Local\Temp\tmpfdchqls9' running egg_info # @TODO - cfati: Truncated output Preparing metadata (pyproject.toml) ... 
done Requirement already satisfied: numpy>=1.16.1 in e:\work\dev\venvs\py_pc064_03.06_test1_q066060487\lib\site-packages (from pyxdameraulevenshtein==1.5.3) (1.19.5) Building wheels for collected packages: pyxdameraulevenshtein Running command 'e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\Scripts\python.exe' 'e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py' build_wheel 'C:\Users\cfati\AppData\Local\Temp\tmp0_edf2js' running bdist_wheel running build running build_ext building 'pyxdameraulevenshtein' extension creating build creating build\temp.win-amd64-3.6 creating build\temp.win-amd64-3.6\Release creating build\temp.win-amd64-3.6\Release\pyxdameraulevenshtein C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\bin\HostX86\x64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -IC:\Users\cfati\AppData\Local\Temp\pip-build-env-h6hif14f\overlay\Lib\site-packages\numpy\core\include -Ic:\Install\pc064\Python\Python\03.06.08\include -Ic:\Install\pc064\Python\Python\03.06.08\include -IC:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\include -IC:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\ATLMFC\include -IC:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Auxiliary\VS\include "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22000.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\um" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\shared" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\winrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22000.0\\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\in clude\um" /Tcpyxdameraulevenshtein/pyxdameraulevenshtein.c /Fobuild\temp.win-amd64-3.6\Release\pyxdameraulevenshtein/pyxdameraulevenshtein.obj pyxdameraulevenshtein.c 
C:\Users\cfati\AppData\Local\Temp\pip-build-env-h6hif14f\overlay\Lib\site-packages\numpy\core\include\numpy\npy_1_7_deprecated_api.h(14) : Warning Msg: Using deprecated NumPy API, disable it with #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION pyxdameraulevenshtein/pyxdameraulevenshtein.c(2240): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2271): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2358): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2434): warning C4244: '=': conversion from 'Py_ssize_t' to 'unsigned long', possible loss of data pyxdameraulevenshtein/pyxdameraulevenshtein.c(2957): warning C4244: '=': conversion from 'double' to 'float', possible loss of data creating C:\Users\cfati\AppData\Local\Temp\pip-install-jbo5i6wm\pyxdameraulevenshtein_f0a231227bfc404898102cc0b821c01c\build\lib.win-amd64-3.6 C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\bin\HostX86\x64\link.exe /nologo /INCREMENTAL:NO /LTCG /DLL /MANIFEST:EMBED,ID=2 /MANIFESTUAC:NO /LIBPATH:c:\Install\pc064\Python\Python\03.06.08\Libs /LIBPATH:e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\libs /LIBPATH:e:\Work\Dev\VEnvs\py_pc064_03.06_test1_q066060487\PCbuild\amd64 /LIBPATH:C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\ATLMFC\lib\x64 /LIBPATH:C:\Install\pc064\Microsoft\VisualStudioCommunity\2022\VC\Tools\MSVC\14.34.31933\lib\x64 "/LIBPATH:C:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\lib\um\x64" "/LIBPATH:C:\Program Files (x86)\Windows Kits\10\lib\10.0.22000.0\ucrt\x64" "/LIBPATH:C:\Program Files (x86)\Windows Kits\10\\lib\10.0.22000.0\\um\x64" /EXPORT:PyInit_pyxdameraulevenshtein 
build\temp.win-amd64-3.6\Release\pyxdameraulevenshtein/pyxdameraulevenshtein.obj /OUT:build\lib.win-amd64-3.6\pyxdameraulevenshtein.cp36-win_amd64.pyd /IMPLIB:build\temp.win-amd64-3.6\Release\p yxdameraulevenshtein\pyxdameraulevenshtein.cp36-win_amd64.lib Creating library build\temp.win-amd64-3.6\Release\pyxdameraulevenshtein\pyxdameraulevenshtein.cp36-win_amd64.lib and object build\temp.win-amd64-3.6\Release\pyxdameraulevenshtein\pyxdameraulevenshtein.cp36-win_amd64.exp Generating code Finished generating code installing to build\bdist.win-amd64\wheel # @TODO - cfati: Truncated output running install_scripts creating build\bdist.win-amd64\wheel\pyxDamerauLevenshtein-1.5.3.dist-info\WHEEL creating 'C:\Users\cfati\AppData\Local\Temp\pip-wheel-wpe1zd_h\tmpdhak06i5\pyxDamerauLevenshtein-1.5.3-cp36-cp36m-win_amd64.whl' and adding 'build\bdist.win-amd64\wheel' to it adding 'pyxdameraulevenshtein.cp36-win_amd64.pyd' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/METADATA' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/WHEEL' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/top_level.txt' adding 'pyxDamerauLevenshtein-1.5.3.dist-info/RECORD' removing build\bdist.win-amd64\wheel C:\Users\cfati\AppData\Local\Temp\pip-build-env-h6hif14f\overlay\Lib\site-packages\wheel\bdist_wheel.py:82: RuntimeWarning: Config variable 'Py_DEBUG' is unset, Python ABI tag may be incorrect warn=(impl == 'cp')): C:\Users\cfati\AppData\Local\Temp\pip-build-env-h6hif14f\overlay\Lib\site-packages\wheel\bdist_wheel.py:87: RuntimeWarning: Config variable 'WITH_PYMALLOC' is unset, Python ABI tag may be incorrect sys.version_info < (3, 8))) \ Building wheel for pyxdameraulevenshtein (pyproject.toml) ... 
done Created wheel for pyxdameraulevenshtein: filename=pyxDamerauLevenshtein-1.5.3-cp36-cp36m-win_amd64.whl size=27385 sha256=e17febb7db9cbe5e7726c486367b189bbd8b07d93c845ab580ee69f652eed002 Stored in directory: C:\Users\cfati\AppData\Local\Temp\pip-ephem-wheel-cache-g6lraow9\wheels\ab\e3\f3\34dfd385a44f053693d576e00ea4a6f4beb73366f7237271cf Successfully built pyxdameraulevenshtein Installing collected packages: pyxdameraulevenshtein Successfully installed pyxdameraulevenshtein-1.5.3 Link requires a different Python (3.6.8 not in: '>=3.7'): https://files.pythonhosted.org/packages/9f/8b/a094f5da22d7abf5098205367b3296dd15b914f4232af5ca39ba6214d08c/pip-22.0-py3-none-any.whl#sha256=6cb1ea2bd7fda0668e26ae8c3e45188f301a7ef17ff22efe1f70f3643e56a822 (from https://pypi.org/simple/pip/) (requires-python:>=3.7) # @TODO - cfati: Truncated output [prompt]> [prompt]> python -m pip freeze numpy==1.19.5 pyxDamerauLevenshtein==1.5.3 [prompt]> [prompt]> python -c "from pyxdameraulevenshtein import normalized_damerau_levenshtein_distance;print(\"Done.\")" Done.
So, on Python 3.6 this scenario can't be encountered, since the newest NumPy version available for it (and thus the one used at build time) is v1.19.5.
I ran into this (simplified) scenario (MCVE) intentionally, to prove a point.
A normal user would typically reach it when installing a package that depends on an older NumPy version. But since newer package versions (with newer dependency versions) keep appearing, the chance of running into it fades as time passes.
However, if someone does run into it, here are a bunch of (generic) guidelines to overcome it:
Install (upgrade) NumPy >= v1.20.0 (preferably) before installing PyxDamerauLevenshtein (or better: don't use an older version anymore)
Install PyxDamerauLevenshtein >= v1.7.0
Instruct PIP ([PyPA.PIP]: pip install) not to upgrade dependents
Build PyxDamerauLevenshtein "manually" (python setup.py build), as in that case I noticed that the existing NumPy version is used (which is what one would expect, since pip is not involved)
Note that this may be applicable to other packages as well, and also there might be additional restrictions involved
All in all, check the NumPy version (python -m pip freeze):
After installing all required packages (any of which might downgrade NumPy to <= v1.19.5)
Constantly (if package installation is a continuous process)
and, if it's the case, upgrade it (python -m pip install --upgrade numpy).
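That version check can be sketched as a small startup guard (a hypothetical helper; a real project would use packaging.version.Version rather than this minimal parser, which only handles plain release strings like "1.19.5"):

```python
import numpy as np

def version_tuple(version):
    # Minimal parser for plain release strings like "1.19.5".
    return tuple(int(part) for part in version.split(".")[:3])

ABI_CHANGE = (1, 20, 0)  # NumPy release that changed PyArrayObject's size

if version_tuple(np.__version__) < ABI_CHANGE:
    print("NumPy", np.__version__, "predates the v1.20.0 ABI change;")
    print("extensions built against >= v1.20.0 will fail to import")
else:
    print("NumPy", np.__version__, "is at or past the v1.20.0 ABI change")
```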
Related (more or less):
[SO]: How to fix error during pythonnet installation (@CristiFati's answer)
[SO]: How to install a package for a specific Python version on Windows 10? (@CristiFati's answer)
Upvotes: 10
Reputation: 1027
Indeed, (building and) installing with numpy>=1.20.0 should work, as pointed out e.g. by this answer. However, I thought some background might be interesting, and it also provides alternative solutions.
There was a change in the C API in numpy 1.20.0. In some cases, pip seems to download the latest version of numpy for the build stage, but then the program is run with the installed version of numpy. If the version used for the build is >=1.20 but the installed (runtime) version is <1.20, this will lead to an error. (The other way around, building with <1.20 and running with >=1.20, should not matter, because the new member was appended at the end of the structure, keeping it backwards compatible. A module built against numpy<1.20 simply could not anticipate the upcoming change.)
This leads to several possible ways to solve the problem:
- Use numpy>=1.20.0 (for both building and running)
- Pin the numpy version used for the build in pyproject.toml (e.g. via oldest-supported-numpy)
- Install the package with --no-binary
- Install the package with --no-build-isolation
For a more detailed discussion of potential solutions, see https://github.com/scikit-learn-contrib/hdbscan/issues/457#issuecomment-773671043.
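The pyproject.toml route pins the build-time NumPy so the wheel is never compiled against a newer ABI than the runtime one. A minimal sketch of what such a build-system table might look like (the oldest-supported-numpy meta-package resolves to the oldest NumPy wheel available for each Python version; the other entries are illustrative):

```toml
[build-system]
# Build against the oldest NumPy that supports the target Python version,
# so the resulting extension runs with that NumPy or anything newer.
requires = ["setuptools>=40.8.0", "wheel", "oldest-supported-numpy"]
build-backend = "setuptools.build_meta"
```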
Upvotes: 46
Reputation: 4041
While upgrading the numpy version would often solve the issue, it's not always viable. A good example is when you're using tensorflow==2.6.0, which isn't compatible with the newest numpy version (it requires ~=1.19.2).
As already mentioned in FZeiser's answer, there was a change in numpy's C API in version 1.20.0. There are packages that rely on this C API when they are being built, e.g. pyxdameraulevenshtein. Given that pip's dependency resolver doesn't guarantee any order for installing the packages, the following might happen:
1. pip figures out that it needs to install numpy, and it chooses the latest version (1.21.2 as of the time of writing this answer).
2. It then builds a package that relies on numpy's C API, e.g. pyxdameraulevenshtein. This package is now compatible with the numpy 1.21.2 C API.
3. At a later point, pip needs to install a package that requires an older version of numpy, e.g. tensorflow==2.6.0, which would try to install numpy==1.19.5. As a result, numpy==1.21.2 is uninstalled and the older version is installed.
4. When running code that uses pyxdameraulevenshtein, its current installation relies on the newer numpy C API, yet the numpy version was downgraded, which results in the error.
The fix is to rebuild the package that uses the outdated numpy C API against the currently installed version of numpy. E.g. for pyxdameraulevenshtein:
pip uninstall pyxdameraulevenshtein
pip install pyxdameraulevenshtein --no-binary pyxdameraulevenshtein
Upvotes: 31
Reputation: 80428
This worked for me (when nothing else on this page did):
# Create environment with conda or venv.
# Do *not* install any other packages here.
pip install numpy==1.21.5
# Install all other packages here.
# This works as a package may build against the currently installed version of numpy.
This solved a particularly brutal issue that was unresolvable by all other answers on this page as of 2022-04-11. Other answers try to fix the problem after it has occurred; this fixes the problem before it occurs.
In addition, experiment with different versions of Python, e.g. 3.8, 3.9, 3.10.
Reference: Excellent answer by @FZeiser that explains why this works.
Upvotes: 6
Reputation: 5266
What worked for me was:
pip uninstall numpy
conda install -y -c conda-forge numpy
As bizarre as it might sound, I didn't even have to uninstall numpy with conda first, which seemed odd to me. I am using Python 3.9.
Upvotes: 9
Reputation: 125
I had this issue but could not update numpy because of an incompatibility conflict with another package that I needed, which requires numpy<=1.21.0. The error numpy.ndarray size changed, may indicate binary incompatibility was generated from a personal package. The solution was to modify the pyproject.toml file of my package and set:
requires = ["numpy==1.21.0", <other packages>]
which was previously set to "numpy>=1.21.0", causing the error.
Upvotes: 1
Reputation: 63
I encountered the same problem with Python 3.10.4 and numpy 1.21.5. I solved it only after updating numpy to 1.22.3 via pip uninstall numpy followed by pip install numpy; running only pip install --upgrade numpy didn't work.
PS D:\quant\vnpy-master\examples\veighna_trader> python .\run.py
Traceback (most recent call last):
  File "D:\quant\vnpy-master\examples\veighna_trader\run.py", line 31, in <module>
    from vnpy_optionmaster import OptionMasterApp
  File "D:\it_soft\python3.10.4\Lib\site-packages\vnpy_optionmaster\__init__.py", line 26, in <module>
    from .engine import OptionEngine, APP_NAME
  File "D:\it_soft\python3.10.4\Lib\site-packages\vnpy_optionmaster\engine.py", line 34, in <module>
    from .pricing import binomial_tree_cython as binomial_tree
  File "binomial_tree_cython.pyx", line 1, in init binomial_tree_cython
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject
Upvotes: 3
Reputation: 101
I was facing the same issue on a Raspberry Pi 3. The error was actually with pandas: although tensorflow needs numpy~=1.19.2, pandas is not compatible with it. So I upgraded my numpy to the latest version (since downgrading was not an option) and everything works fine!
root@raspberrypi:/home/pi# python3
Python 3.7.3 (default, Jan 22 2021, 20:04:44)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy as np
>>> np.__version__
'1.21.5'
>>> import pandas as pd
>>> pd.__version__
'1.3.5'
>>> import tensorflow as tf
>>> tf.__version__
'2.4.0'
>>> tf.keras.__version__
'2.4.0'
>>> tf.keras.layers
<module 'tensorflow.keras.layers' from '/usr/local/lib/python3.7/dist-packages/tensorflow/keras/layers/__init__.py'>
Same issue here also - https://github.com/bitsy-ai/tensorflow-arm-bin/issues/5
Tensorflow source: https://github.com/bitsy-ai/tensorflow-arm-bin
Upvotes: 6
Reputation: 146
Install an older version of gensim; it works!
pip install gensim==3.5.0
or
conda install gensim==3.5.0
Upvotes: 0
Reputation: 3519
I'm on Python 3.8.5. It sounds too simple to be real, but I had this same issue and all I did was reinstall numpy. Gone.
pip install --upgrade numpy
or
pip uninstall numpy
pip install numpy
Upvotes: 341
Reputation: 83
Use Python virtual environments and install gensim using:
pip install gensim==3.8.3
Upvotes: 3
Reputation: 41
After you pip install any package, make sure you restart the kernel; then it should work. Usually packages get upgraded automatically, and all you need is a quick restart. At least, this is what worked in my situation: I was getting the same error when I tried to install and use pomegranate.
Upvotes: 4
Reputation: 1368
For anyone using Poetry, it is necessary to have experimental.new-installer set to true for an application with a numpy<1.20 dependency to be built correctly, i.e.:
poetry config experimental.new-installer true
It is true by default, but if (as was the case for me) it has been changed, it can catch you out.
My application uses TensorFlow, so I did not have the option of upgrading to >1.20. Poetry also does not support --no-binary dependencies.
Upvotes: 6
Reputation: 377
I had this issue when using the TensorFlow Object Detection API. TensorFlow is currently NOT compatible with numpy==1.20 (although the issue does not become apparent until later). In my case, the issue was caused by pycocotools; I fixed it by installing an older version:
pip install pycocotools==2.0.0
Upvotes: 23
Reputation: 521
Try with numpy==1.20.0. This worked here, even though the circumstances were different (Python 3.8 on Alpine 3.12).
Upvotes: 52
Reputation: 54
For almost the same image, python:3.7-slim-buster:
I started to have this problem just today; it was nonexistent before.
I solved it by removing numpy from the requirements.txt file and doing the following in my Dockerfile instead:
RUN pip3 install --upgrade --no-binary numpy==1.18.1 numpy==1.18.1 \
&& pip3 install -r requirements.txt
I use some old versions of Keras and its libraries, and upgrading to numpy 1.20.0 didn't work for them. The key is the first command: pip's --no-binary option tells it not to use a pre-compiled wheel, so numpy is built from source to match what the rest of the environment expects.
The trick in the command is that you might find people telling you to use pip's --no-binary option to solve the problem without specifying how, and it can be tricky (as it was for me): you have to write the package twice in the command for it to work, or else pip will throw an error.
I think the --upgrade option in the first command isn't necessary.
Upvotes: 1