Cannot find BLAS on a machine with MKL when installing scipy - scipy

I installed Intel MKL and other libraries for a customized numpy. Here is my ~/.numpy-site.cfg:
[DEFAULT]
library_dirs = /usr/lib:/usr/local/lib
include_dirs = /usr/include:/usr/local/include
[mkl]
library_dirs = /opt/intel/mkl/lib/intel64/
include_dirs = /opt/intel/mkl/include/
mkl_libs = mkl_rt
lapack_libs =
[amd]
amd_libs = amd
[umfpack]
umfpack_libs = umfpack
[djbfft]
include_dirs = /usr/local/djbfft/include
library_dirs = /usr/local/djbfft/lib
This configuration file worked fine during the installation of numpy, but when I installed scipy via pip3 install scipy, it reported:
numpy.distutils.system_info.BlasNotFoundError:
Blas (http://www.netlib.org/blas/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [blas]) or by setting
the BLAS environment variable.
In my mind MKL is an implementation of BLAS, so just mentioning MKL should be fine. I've tried:
export LD_LIBRARY_PATH=/opt/intel/mkl/lib/intel64:$LD_LIBRARY_PATH
export BLAS=/opt/intel/mkl/lib/intel64
Copy the content of the [mkl] section and paste it into a [blas] section in ~/.numpy-site.cfg
But none of these worked. So what is going wrong? Does scipy respect ~/.numpy-site.cfg? Thank you.

What is the extension on your libs in ..../intel64? I had a similar problem where, because the extensions were .so.3.0, the setup script was not finding the libraries. My solution was to create symlinks: https://stackoverflow.com/a/23325759/1430829. Maybe this will work for you too?
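For reference, a minimal sketch of that symlink approach, assuming the only copies of the MKL libraries in that directory carry versioned suffixes (the exact file names below are hypothetical, so check with ls first):
ls /opt/intel/mkl/lib/intel64/libmkl_rt*     # see which names actually exist
cd /opt/intel/mkl/lib/intel64
sudo ln -s libmkl_rt.so.1 libmkl_rt.so       # give the build the unversioned name it looks for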

Related

Distributing pybind11 extension linked to third party libraries

I'm working on a pybind11 extension written in C++, but I'm having a hard time understanding how it should be distributed.
The project links to a number of third-party libraries (e.g. libpng, glew etc.).
The project builds fine with CMake and generates a .so file. Now I am not sure what the right way of installing this extension is. The extension seems to work: if I copy the file into the Python lib directories it is picked up (I can import it, and it works correctly). However, this is clearly not the way to go, I think.
I also tried the setuptools route (from https://pybind11.readthedocs.io/en/stable/compiling.html) by creating a setup.py file like this:
import sys
# Available at setup time due to pyproject.toml
from pybind11 import get_cmake_dir
from pybind11.setup_helpers import Pybind11Extension, build_ext
from setuptools import setup
from glob import glob
files = sorted(glob("*.cpp"))
__version__ = "0.0.1"
ext_modules = [
    Pybind11Extension("mylib",
        files,
        # Example: passing in the version to the compiled code
        define_macros = [('VERSION_INFO', __version__)],
    ),
]

setup(
    name="mylib",
    version=__version__,
    author="fab",
    author_email="fab#fab",
    url="https://github.com/pybind/python_example",
    description="mylib",
    long_description="",
    ext_modules=ext_modules,
    extras_require={"test": "pytest"},
    cmdclass={"build_ext": build_ext},
    zip_safe=False,
    python_requires=">=3.7",
)
and now I can build the extension by simply calling
pip3 install .
However, it looks like all the links are broken, because whenever I try importing the extension in Python I get linkage errors, as if setuptools does not correctly link the extension with the 3rd-party libs. For instance, errors in linking with libpng as in:
>>> import mylib
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: /home/fabrizio/.local/lib/python3.8/site-packages/mylib.cpython-38-x86_64-linux-gnu.so: undefined symbol: png_sig_cmp
However, I have no clue how to add this link info to setuptools, and don't even know if that's possible (it should be the setuptools equivalent of CMake's target_link_libraries).
I am really at a loss after weeks of reading documentation, forum threads and failed attempts. If anyone is able to point me in the right way or to clear some of the fog it would be really appreciated!
Thanks!
Fab
/home/fabrizio/.local/lib/python3.8/site-packages/mylib.cpython-38-x86_64-linux-gnu.so: undefined symbol: png_sig_cmp
This line pretty much says it clearly. Your local shared object file .so can't find the libpng.so against which it is linked.
You can confirm this by running:
ldd /home/fabrizio/.local/lib/python3.8/site-packages/mylib.cpython-38-x86_64-linux-gnu.so
There is no equivalent of target_link_libraries() in setuptools, because that wouldn't make any sense here: the library is already built and you've already linked it. This is your system more or less telling you that it can't find the libraries it needs, and those most likely need to be installed.
This is also one of the reasons why Linux distributions provide their own package managers and why you should use the developer packages provided by said distributions.
So how do you fix this? Well, your .so file needs to find the other .so files against which you linked; to understand how this works I will refer you to this link.
My main guess, based on the fact that it works when you manually copy the files, is that during the build process you probably specify the rpath to a local directory. Hence what you most likely need to do is tell setuptools that it needs to copy those files when installing.
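If you do end up shipping the third-party .so files next to the extension, one way to let the loader find them is to embed a relative rpath at link time. This is only a sketch, not part of the original answer: the libs/ subdirectory and the libpng link flag are assumptions you would adapt to your project.
ext_modules = [
    Pybind11Extension("mylib",
        files,
        define_macros = [('VERSION_INFO', __version__)],
        libraries = ["png"],                            # ask setuptools to pass -lpng at link time
        extra_link_args = ["-Wl,-rpath,$ORIGIN/libs"],  # search a libs/ dir next to mylib*.so at run time
    ),
]
The bundled .so files in libs/ still need to be shipped with the package so that they exist at run time; how you include them depends on your package layout.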

How to write a minimally working pyproject.toml file that can install packages?

Pip supports the pyproject.toml file, but so far all practical usage of the new schema seems to require a 3rd-party tool that auto-generates these files (e.g., poetry or flit). Unlike setup.py, which is already human-writeable, pyproject.toml is not (yet).
From setuptools docs,
[build-system]
requires = [
"setuptools >= 40.9.0",
"wheel",
]
build-backend = "setuptools.build_meta"
However, this file does not include package dependencies (as outlined in PEP 621). Pip does support installing packages using pyproject.toml, but nowhere does the PEP specify how to write package dependencies in pyproject.toml for the official build system, setuptools.
How do I write package dependencies in pyproject.toml?
Related StackOverflow Questions:
How to init the pyproject.toml file
That question asks for a method to auto-generate pyproject.toml; my question differs because I ask for a human-written pyproject.toml.
my question differs because I ask for a human-written pyproject.toml
First, the pyproject.toml file is always "human-writable".
Then, it is important to know that in this context setuptools and Poetry take the role of what are called "PEP 517 build back-ends". There are many such back-ends available today; setuptools and Poetry are just two examples.
As of today, it seems like most (if not all) of the build back-ends I know of expect their configuration (including dependencies) to be written in pyproject.toml.
PEP 621
There is a standard called PEP 621 that specifies how a project's metadata, including dependencies, should be laid out in the pyproject.toml file.
Here is a list of build back-ends I know of that have support for PEP 621:
enscons
flit_core (see flit)
hatchling (see hatch)
pdm-pep517 (see pdm)
setuptools (experimental support since version 61.0.0)
trampolim
whey
For all PEP 621 compatible build back-ends, the dependencies should be written in pyproject.toml file like in the following:
[project]
name = "Thing"
version = "1.2.3"
# ...
dependencies = [
    "SomeLibrary ~= 2.2",
]
References:
https://peps.python.org/pep-0621/
https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html (experimental support for PEP 621 has been added to setuptools version 61.0.0, released 2022-03-24)
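For completeness, when setuptools itself is used as the PEP 621 back-end, the [project] table above is typically paired with a [build-system] table along these lines (a sketch; the version pin simply reflects the 61.0.0 support note above):
[build-system]
requires = ["setuptools >= 61.0.0"]
build-backend = "setuptools.build_meta"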
setuptools (before version 61.0.0)
In setuptools before version 61.0.0 there is no support for writing the configuration in pyproject.toml (in other words: no PEP 621 support). You have to either write a setup.cfg, or a setup.py, or a combination of both.
My recommendation is to write as much as possible in setup.cfg. Such a setup.cfg could look like this:
[metadata]
name = Thing
version = 1.2.3
[options]
install_requires =
    SomeLibrary ~= 2.2
packages = find:
and in most cases the setup.py can be omitted completely or it can be as short as:
import setuptools
setuptools.setup()
References about the dependencies specifically:
https://setuptools.readthedocs.io/en/latest/userguide/dependency_management.html
https://www.python.org/dev/peps/pep-0508/
https://www.python.org/dev/peps/pep-0440/
Again, note that in most cases it is possible to omit the setup.py file entirely; one of the conditions is that the setup.cfg file and a pyproject.toml file are present and contain all the necessary information. Here is an example of a pyproject.toml that works well with the setuptools build backend:
[build-system]
build-backend = 'setuptools.build_meta'
requires = [
    'setuptools >= 43.0.0',
]
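With pyproject.toml and setup.cfg in place, the project can then be built and installed with the usual front-end tools, for example from the project root:
pip install .            # builds with the declared back-end and installs
pip wheel . --no-deps    # or just produce a wheel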
poetry
In poetry everything is defined in pyproject.toml, but it uses poetry-specific sections. In other words, Poetry does not currently use the PEP 621 standard, but there are some plans to move to this standard in the future.
This file can be hand-written. As far as I can tell, there is no strict need to ever explicitly install poetry itself (commands such as pip install and pip wheel can get you far enough).
The pyproject.toml file can be as simple as:
[tool.poetry]
name = 'Thing'
version = '1.2.3'
[tool.poetry.dependencies]
python = '^3.6'
SomeLibrary = '~2.2'
[build-system]
requires = ['poetry-core~=1.0']
build-backend = 'poetry.core.masonry.api'
References:
https://python-poetry.org/docs/pyproject/
https://python-poetry.org/docs/dependency-specification/

Installing Cairo, Helm on Windows

How do I install Helm (https://hackage.haskell.org/package/helm) on Windows 7 (64-bit)?
(Update: I had posted a lot of error messages here, but I've moved them to my answer to not clutter up the question.)
Installation for Windows 64-bit:
I'm including the error messages you'll see if you follow all the steps up to a given point and then just try to install directly. This is a conglomeration of a bunch of ad-hoc steps from following many different posts. Any simplification would be appreciated!
Note: Do all work in directories without spaces. I'm doing all work in C:/PF; modify this to your directory.
Download MSYS2-x86_64 from https://msys2.github.io/ and install it. Cabal install cairo (or helm) will give something like:
Configuring cairo-0.13.1.0...
setup.exe: Missing dependencies on foreign libraries:
Missing C libraries: z, cairo, z, gobject-2.0, ffi, pixman-1, fontconfig,
expat, freetype, iconv, expat, freetype, z, bz2, harfbuzz, glib-2.0, intl,
ws2_32, ole32, winmm, shlwapi, intl, png16, z
Download C libraries. In MINGW64 (NOT MSYS2 - I had trouble with MSYS2 at random stages in the process), use the package manager:
pacman -Ss cairo
to search for the Cairo package. You'll find "mingw64/mingw-w64-x86_64-cairo", so install that:
pacman -S mingw64/mingw-w64-x86_64-cairo
*.pc files should have been added to C:\PF\msys64\mingw64\lib\pkgconfig and C:\PF\msys64\usr\lib\pkgconfig. (pkg-config needs to be able to find these files. It looks in PKG_CONFIG_PATH, which by default should have the lib/pkgconfig folder above. Moving the file here is easiest. See Can't install sdl2 via cabal) If you get
The pkg-config package ... version ... cannot be found
errors then check your *.pc files.
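A quick way to confirm that pkg-config can actually see a package (run from the MINGW64 shell; the path below is the default mentioned above):
pkg-config --modversion cairo    # prints a version number if cairo.pc is found
echo $PKG_CONFIG_PATH            # should include /mingw64/lib/pkgconfig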
Repeat with other required libraries, like atk
pacman -S mingw64/mingw-w64-x86_64-atk
(I don't know the complete list, but error messages later on will let you know what to get.)
Get the development files for these libraries (as suggested by How to install cairo on Windows). Most of them are bundled up at http://ftp.gnome.org/pub/gnome/binaries/win64/gtk+/2.22/. Unzip.
Copy files (.a, .dll.a) in lib to C:\PF\msys64\mingw64\lib. Copy the pkgconfig folder, which contains the .pc files.
Copy files in include to C:\PF\msys64\mingw64\include.
Add C:\PF\gtk+-2.22.1\bin to the path.
(2) and (3) might be redundant. I don't know - I did them both.
At this point you can probably do "cabal install cairo". (Warning: if your end goal is something else, you may not want to "cabal install" intermediate packages, see https://wiki.haskell.org/Cabal/Survival#Issue_.232_--_Not_installing_all_the_packages_in_one_go.)
See (4) for the syntax for specifying extra-include-dirs and extra-lib-dirs (but if you copied the files above, this shouldn't be necessary).
Any time you get
Missing (or bad) header file
check to see you copied the *.h files to mingw64\include and/or add the include folder to the PATH. Use cabal install -v3 to get verbose error messages if the problem persists.
If you get something like
cairo-0.13.1.0: include-dirs: /mingw64/include/freetype2 is a relative path
which makes no sense (as there is nothing for it to be relative to). You can
make paths relative to the package database itself by using ${pkgroot}. (use
--force to override)
try --ghc-pkg-options="--force" (as mentioned at https://github.com/gtk2hs/gtk2hs/issues/139).
Get SDL. Otherwise you'll get
configure: error: *** SDL not found! Get SDL from www.libsdl.org.
If you already installed it, check it's in the path. If problem remains,
please send a mail to the address that appears in ./configure --version
indicating your platform, the version of configure script and the problem.
Failed to install SDL-0.6.5.1
Follow the instructions in (2) to get sdl/sdl2 libraries. (See instructions here Installing SDL on Windows for Haskell (GHC).)
The new version helm-0.7.1 requires sdl2, but there are other dependency issues with helm-0.7.1 as of writing. Download SDL from http://sourceforge.net/projects/msys2/files/REPOS/MINGW/x86_64/ (direct download link to newest version as of writing http://sourceforge.net/projects/msys2/files/REPOS/MINGW/x86_64/mingw-w64-x86_64-SDL-1.2.15-7-any.pkg.tar.xz.sig/download), unzip. "cabal install sdl" gives
* Missing (or bad) header file: SDL/SDL.h
* Missing C library: SDL
This problem can usually be solved by installing the system package that
provides this library (you may need the "-dev" version). If the library is
already installed but in a non-standard location then you can use the flags
--extra-include-dirs= and --extra-lib-dirs= to specify where it is.
so we specify where the dirs are (change the name depending on where you extracted sdl to)
cabal install sdl --extra-include-dirs=C:/PF/sdl/include --extra-lib-dirs=C:/PF/sdl/lib
If you got SDL2 (http://libsdl.org/download-2.0.php) (for a newer version of Helm): there is a fatal bug that hasn't been fixed in the release version. (If you don't fix it, cabal install -v3 on things which depend on it will give the error
winapifamily.h: No such file or directory
("winapifamily.h: No such file or directory" when compiling SDL in Code::Blocks) Download https://hg.libsdl.org/SDL/raw-file/e217ed463f25/include/SDL_platform.h, replace the file in the include folder and in C:/PF/msys64/mingw64/include/SDL2.
Download gtk2hs from http://code.haskell.org/gtk2hs and run the following:
cd gtk2hs/tools
cabal install
cd ../glib
cabal install
cd ../gio
cabal install
cd ../pango
cabal install --ghc-pkg-options="--force"
(Maybe you have already installed glib and gio from before? I did this step because a normal install of Pango caused the following error for me; see https://github.com/gtk2hs/gtk2hs/issues/110.)
pango-0.13.1.0: include-dirs: /mingw64/include/freetype2 is a relative path
which makes no sense (as there is nothing for it to be relative to). You can
make paths relative to the package database itself by using ${pkgroot}. (use
--force to override)
Once the Helm developers get things updated you should be able to do "cabal install helm" but right now there seem to be dependency issues. For me, cabal automatically tries to install helm-0.4 (probably because 0.4 didn't give upper bounds on dependencies, while newer versions do. You could try "cabal unpack"ing and deleting the upper bounds...). Then
cabal unpack helm-0.4
Installing gives an error because "pure" got moved to Prelude. Open helm-0.4\src\FRP\Helm\Automaton.hs and change line 17:
import Prelude hiding (id, (.), pure)
Now
cabal install
Try to compile and run a program using Helm
(This is 0.4 - look on the website for a newer sample if you tried a newer Helm)
import FRP.Helm
import qualified FRP.Helm.Window as Window
render :: (Int, Int) -> Element
render (w, h) = collage w h [filled red $ rect (fromIntegral w) (fromIntegral h)]
main :: IO ()
main = run $ fmap (fmap render) Window.dimensions
If you get an error about a missing .dll (sdl.dll), find it in a bin/ folder and add the folder to your PATH (or copy it to somewhere on your path).
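For example, from the MINGW64 shell used above (treat the directory as a placeholder; use whichever bin/ folder actually contains sdl.dll on your machine):
export PATH=/mingw64/bin:$PATH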

Porting Newlib with current autotools

I'm trying to build a toolchain for my hobby kernel, but I'm running into problems when building Newlib. Whenever I try to run autoreconf in my kernel's directory under newlib/libc/sys/ I get an error:
configure.in:5: error: support for Cygnus-style trees has been removed
Here is the content of configure.in (basically taken from the tutorial linked below):
AC_PREREQ(2.59)
AC_INIT([newlib], [NEWLIB_VERSION])
AC_CONFIG_SRCDIR([crt0.S])
AC_CONFIG_AUX_DIR(../../../..)
NEWLIB_CONFIGURE(../../..)
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
and the source for Makefile.am (again mostly from tutorial):
AUTOMAKE_OPTIONS = cygnus
INCLUDES = $(NEWLIB_CFLAGS) $(CROSS_CFLAGS) $(TARGET_CFLAGS)
AM_CCASFLAGS = $(INCLUDES)
noinst_LIBRARIES = lib.a
if MAY_SUPPLY_SYSCALLS
extra_objs = $(lpfx)syscalls.o
else
extra_objs =
endif
lib_a_SOURCES =
lib_a_LIBADD = $(extra_objs)
EXTRA_lib_a_SOURCES = syscalls.c crt0.S
lib_a_DEPENDENCIES = $(extra_objs)
lib_a_CCASFLAGS = $(AM_CCASFLAGS)
lib_a_CFLAGS = $(AM_CFLAGS)
if MAY_SUPPLY_SYSCALLS
all: crt0.o
endif
ACLOCAL_AMFLAGS = -I ../../..
CONFIG_STATUS_DEPENDENCIES = $(newlib_basedir)/configure.host
Yes, I have tried removing the AUTOMAKE_OPTIONS=cygnus.
I've Googled around and been trying to understand this, and as far as I can tell, it is because of the version of autotools I'm using. According to the tutorial I used originally (OSDev - OS Specific Toolchain), I need an older version. My problem is that I'm using Kubuntu, which uses the apt package manager, and that version is not available to fall back to even temporarily. There has to be some fix for this. Either Newlib is outdated (this release is from December of 2013...) or the developers are crazy for depending on an outdated autotools version.
The only other thing I can think of is that this is a message from the newlib configuration scheme itself, in which case I have no idea how to modify my configure.in and Makefile.am to align with the new newlib configure format. That tutorial is the only one I've found so far that doesn't use libgloss (which I'd prefer not to do), and newlib's own documentation on adding a new target is rather lacking (or I missed something).
Here is some version information:
System: Kubuntu 14.04
Automake: 1.14.1
Autoconf: 2.69
Newlib: 2.1.0
Unfortunately, I'm afraid using automake 1.12 or earlier is your only choice. Ubuntu has a separate automake1.11 package to help you there, if I'm not mistaken; compatibility between 1.12 and 1.14 is generally good, but before that it was spotty.
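If that package is available on your release, installing it is a one-liner (a sketch; package availability varies between Ubuntu versions):
sudo apt-get install automake1.11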
I am writing this answer for people struggling with the tutorial described here.
I am in the same situation you are (or were): I am building a kernel from scratch and wanted to port newlib to my toolchain. Unfortunately, I think the tutorial has become out of date, because I followed the instructions EXACTLY, even installing the correct software with the proper versions (including the correct newlib version). The accepted solution above didn't work for me, but I found another solution that might work for others:
Step 1 - get the correct software
Acquire Automake (v1.12) and Autoconf (v2.65) from here:
http://ftp.gnu.org/gnu/automake/automake-1.12.tar.gz
http://ftp.gnu.org/gnu/autoconf/autoconf-2.65.tar.gz
Step 2 - build process
Untar both of the archives:
tar xf automake-1.12.tar.gz
tar xf autoconf-2.65.tar.gz
Create a destination folder:
mkdir ~/bin
Create a build folder:
mkdir build
cd build
Configure automake first:
../automake-1.12/configure --prefix=~/bin
Make and install
make && make install
Now let's configure autoconf:
../autoconf-2.65/configure --prefix=~/bin
Then make and install:
make && make install
You should now have the proper binaries in ~/bin!
Step 3 - update PATH
To add these binaries to your path temporarily (recommended):
export PATH=~/bin:$PATH
Once you update your path, rerun autoconf and autoreconf and it should complete.
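To confirm the right tools are being picked up before retrying (a sketch; the sys directory is a placeholder, use your own kernel's target directory):
automake --version | head -n1    # should report 1.12
autoconf --version | head -n1    # should report 2.65
cd newlib/libc/sys/<yourkernel>  # hypothetical path
autoreconf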

Setuptools how to build shared library before package installation

My package has *.py files and *.c files; the *.py files use ctypes to load a shared library
built from the C source.
Now I have a problem: how do I write my setup.py?
The setup script needs to build my_c_file.c into my_c_file.so and then copy it to the Python lib path.
I want to know what the recommended way to do this is.
You should probably have a look at Building C and C++ Extensions with distutils.
If you build a setup.py file around the example below, setuptools should compile your C file into my_c_lib.so and automatically add it to your installed package (untested).
from distutils.core import setup, Extension

c_module = Extension('my_c_lib',
    sources = ['my_c_file.c'])

setup(name = 'my_package',
    version = '1.0',
    description = 'This is a package in which I compile a C library.',
    ext_modules = [c_module])
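A quick way to try it out (a sketch, assuming the example above is saved as setup.py next to my_c_file.c, and that my_c_file.c defines the module init function PyInit_my_c_lib, since distutils builds it as a Python extension module rather than a plain ctypes library):
python setup.py build_ext --inplace    # compiles my_c_file.c and drops the extension next to setup.py
python -c "import my_c_lib"            # the compiled module is then importable directly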