Reputation: 96976
I am trying to compile numpy v1.12 in order to get support for ATLAS/LAPACK routines.
The problem
The settings I am using for compilation do not appear to bring the ATLAS/LAPACK libraries into numpy.
The setup
I do not have admin privileges on the host(s) I am working on (a computational cluster).
However, the nodes offer access to gcc 4.7.2 and 5.3.0, glibc 2.17 and 2.22, and ATLAS/LAPACK libraries and headers v3.10.2 via GNU modules.
For compatibility reasons, I am working with a virtual environment that contains Python 2.7.16, and for the same reason I am installing an older version of numpy. If things work, I may explore newer versions of numpy, but at this time that is what I am working with.
My numpy source directory has a configuration file called site.cfg, which includes these directives:
[ALL]
library_dirs = /usr/local/lib:/net/module/sw/glibc/2.22/lib64:/net/module/sw/atlas-lapack/3.10.2/lib
include_dirs = /usr/local/include:/net/module/sw/glibc/2.22/include:/net/module/sw/atlas-lapack/3.10.2/include
[atlas]
libraries = lapack,f77blas,cblas,atlas
library_dirs = /net/module/sw/atlas-lapack/3.10.2/lib
include_dirs = /net/module/sw/atlas-lapack/3.10.2/include
I am compiling numpy via the following command:
$ CFLAGS="${CFLAGS} -std=c99 -fPIC" LDFLAGS="-L/home/areynolds/.conda/envs/genotyping_environment/lib -Wl,-rpath=/home/areynolds/.conda/envs/genotyping_environment/lib -Wl,--no-as-needed -Wl,--sysroot=/,-L/net/module/sw/glibc/2.22/lib64" python setup.py build --fcompiler=gnu95
I am using --fcompiler=gnu95 because the ATLAS/LAPACK libraries were compiled with GNU Fortran. I am overriding the CFLAGS and LDFLAGS variables so that the GCC toolchain can compile and link properly.
The question
After compilation, I test the numpy library to see what is installed via one method:
$ python
...
>>> import numpy.distutils.system_info as sysinfo
>>> sysinfo.get_info('atlas')
ATLAS version 3.10.2 built by root on Wed Jun 1 15:39:08 PDT 2016:
UNAME : Linux module0.altiusinstitute.org 3.10.0-327.10.1.el7.x86_64 #1 SMP Tue Feb 16 17:03:50 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
INSTFLG : -1 0 -a 1 -l 1
ARCHDEFS : -DATL_OS_Linux -DATL_ARCH_UNKNOWNx86 -DATL_CPUMHZ=2876 -DATL_AVXMAC -DATL_AVX -DATL_SSE3 -DATL_SSE2 -DATL_SSE1 -DATL_USE64BITS -DATL_GAS_x8664
F2CDEFS : -DAdd_ -DF77_INTEGER=int -DStringSunStyle
CACHEEDGE: 229376
F77 : /net/module/sw/gcc/5.3.0/bin/gfortran, version GNU Fortran (GCC) 5.3.0
F77FLAGS : -O -mavx2 -mfma -m64 -fPIC
SMC : /usr/bin/x86_64-redhat-linux-gcc, version x86_64-redhat-linux-gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-4)
SMCFLAGS : -O -fomit-frame-pointer -mavx2 -mfma -m64 -fPIC
SKC : /usr/bin/x86_64-redhat-linux-gcc, version x86_64-redhat-linux-gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-4)
SKCFLAGS : -O -fomit-frame-pointer -mavx2 -mfma -m64 -fPIC
{'libraries': ['lapack', 'f77blas', 'cblas', 'atlas', 'f77blas', 'cblas'], 'library_dirs': ['/net/module/sw/atlas-lapack/3.10.2/lib'], 'define_macros': [('ATLAS_INFO', '"\\"3.10.2\\""')], 'language': 'f77', 'include_dirs': ['/net/module/sw/atlas-lapack/3.10.2/include']}
This looks okay, maybe?
But when I check via another method, I get a different answer:
>>> np.show_config()
lapack_opt_info:
libraries = ['openblas', 'openblas']
library_dirs = ['/usr/local/lib']
define_macros = [('HAVE_CBLAS', None)]
language = c
blas_opt_info:
libraries = ['openblas', 'openblas']
library_dirs = ['/usr/local/lib']
define_macros = [('HAVE_CBLAS', None)]
language = c
openblas_info:
libraries = ['openblas', 'openblas']
library_dirs = ['/usr/local/lib']
define_macros = [('HAVE_CBLAS', None)]
language = c
blis_info:
NOT AVAILABLE
openblas_lapack_info:
libraries = ['openblas', 'openblas']
library_dirs = ['/usr/local/lib']
define_macros = [('HAVE_CBLAS', None)]
language = c
lapack_mkl_info:
NOT AVAILABLE
blas_mkl_info:
NOT AVAILABLE
Despite the manual setup described in site.cfg, there is no mention of ATLAS, nor does LAPACK appear to point to the correct module directory (/net/module/sw/atlas-lapack/3.10.2).
How do I correctly compile ATLAS/LAPACK support into numpy, or genuinely test that I have a working ATLAS/LAPACK setup integrated into numpy, in a way that gives me a consistent (and reliable) answer?
Upvotes: 19
Views: 1622
Reputation: 121
An orthogonal suggestion, but it may be helpful in general, not just for your particular problem.
Have a look at Spack. It is a package manager that builds packages from source. It is a very interesting and promising project that lets you build a vast variety of libraries and software in just a few steps.
I've just checked, and py-numpy is a supported package (see spack list numpy). If you want to install it with the default options (check them with spack info py-numpy), you can install it with a simple spack install py-numpy and it will be built along with any missing dependencies.
If you want to change something, e.g. use a particular implementation or version of BLAS/LAPACK, you can easily specify the desired dependency (e.g. spack install py-numpy ^openblas).
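For reference, here is a minimal sketch of that workflow, assuming you bootstrap Spack from its Git repository (the clone location is up to you):
# Get Spack and make its commands available in the current shell
git clone https://github.com/spack/spack.git
. spack/share/spack/setup-env.sh

# Browse available packages and their options
spack list numpy
spack info py-numpy

# Build numpy (and any missing dependencies) with the default options
spack install py-numpy

# Or build it against a specific BLAS/LAPACK provider
spack install py-numpy ^openblas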
I can assure you it will save you a lot of headaches, as it has for me many times. I use it both on the HPC systems I work on and on my local machines.
I've only shown you a few commands, but that just scratches the surface of what you can easily do with it: you can have multiple variants of numpy built with different BLAS implementations, or the same implementation with different options, and so on. I suggest you start by looking at the "spec syntax".
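For example (the exact versions and variants available depend on your Spack release; these specs are illustrative), the spec syntax lets you pin versions and swap BLAS providers:
# Build numpy against ATLAS instead of the default BLAS provider
spack install py-numpy ^atlas

# Pin specific versions of both numpy and the BLAS implementation
spack install py-numpy@1.16.4 ^openblas@0.3.5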
Upvotes: 1
Reputation: 511
I did it with numpy 1.11.1 (which means my answer may not be 100% accurate in your case) with the following recipe:
export ATLAS=<folder with the atlas/lapack libraries>
export LAPACK=$ATLAS
cat > site.cfg <<EOF
[atlas]
atlas_libs = lapack, f77blas, cblas, atlas
EOF
python setup.py bdist_wheel
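After the build, one way to sanity-check the result (a hedged sketch; the wheel filename under dist/ depends on your platform and numpy version) is to install the wheel and inspect the build-time configuration:
# Install the freshly built wheel into your user site-packages
pip install --user dist/numpy-*.whl

# Confirm which BLAS/LAPACK the installed numpy was built against
python -c "import numpy; numpy.show_config()"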
Upvotes: 0
Reputation: 461
BLAS/LAPACK is an optional dependency for numpy. So, depending on what you are trying to do, you might leave ATLAS out completely.
If you want to make sure that your numpy was compiled against the correct libraries, I would go outside of Python and numpy and use ldd on the compiled libraries.
I'm on the latest numpy and Python 3.7, so the filenames will look different for you.
> cd <numpy_dir>
> find . -name "*.so"
./core/_dummy.cpython-37m-darwin.so
./core/_multiarray_tests.cpython-37m-darwin.so
./core/_multiarray_umath.cpython-37m-darwin.so
./core/_operand_flag_tests.cpython-37m-darwin.so
./core/_rational_tests.cpython-37m-darwin.so
./core/_struct_ufunc_tests.cpython-37m-darwin.so
./core/_umath_tests.cpython-37m-darwin.so
./fft/fftpack_lite.cpython-37m-darwin.so
./linalg/_umath_linalg.cpython-37m-darwin.so
./linalg/lapack_lite.cpython-37m-darwin.so
./random/mtrand.cpython-37m-darwin.so
Then I ran ldd (I used otool -L since I'm on macOS) on each file. The following three files were linked against the BLAS library.
core/_multiarray_umath.cpython-37m-darwin.so
linalg/_umath_linalg.cpython-37m-darwin.so
linalg/lapack_lite.cpython-37m-darwin.so
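On Linux, a hedged equivalent sketch (the filenames and grep hits will vary with your build) would be:
# Locate the installed numpy package and show what each extension module links against
cd "$(python -c 'import numpy, os; print(os.path.dirname(numpy.__file__))')"
find . -name "*.so" | while read -r lib; do
    echo "== $lib"
    ldd "$lib" | grep -Ei 'atlas|blas|lapack' || echo "   (no BLAS/LAPACK dependency)"
done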
Upvotes: 0
Reputation: 2175
Since you can get the numpy source onto the machine, I would assume you can get any files you like into your user space. Have you considered installing numpy from a wheel?
Numpy 1.16.4 has support for Python 2.7. You haven't said what architecture your nodes use, but I would be a bit surprised if there isn't a wheel available. You should be able to download it directly from PyPI yourself:
https://pypi.org/project/numpy/1.16.4/#files
Once you've downloaded the wheel file and transferred it, assuming you've already installed pip, etc., you can install it:
pip install --no-index --user (file).whl
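For example (the filenames below are illustrative; pick the wheel matching your Python 2.7 build and architecture from the PyPI page):
# On a machine with internet access, fetch just the numpy wheel
pip download numpy==1.16.4 --no-deps --only-binary=:all: -d ./wheels

# After transferring it to the cluster, install it into your user space
pip install --no-index --user ./wheels/numpy-1.16.4-*.whl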
Also, I would be reluctant to say that ATLAS/LAPACK is the best option. BLAS implementations have been benchmarked here, and it looks like OpenBLAS is just fine: https://markus-beuckelmann.de/blog/boosting-numpy-blas.html.
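If you want a rough sanity check of whichever BLAS numpy ends up linked against, here is a hedged sketch along the lines of the benchmarks in that post (the matrix size is arbitrary and the timing is only indicative):
# Time a large matrix multiplication; an optimized, multithreaded BLAS
# should be dramatically faster here than numpy's unoptimized fallback.
python -c "
import time
import numpy as np
a = np.random.rand(2000, 2000)
b = np.random.rand(2000, 2000)
t0 = time.time()
np.dot(a, b)
print('matmul took %.2f s' % (time.time() - t0))
"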
Upvotes: -1