Uploading wheels to pypi #1
Comments
Wheels are done for Windows and Linux. Waiting on OSX due to travis-ci/travis-ci#8552.
@xoviat, you have to be careful about how you build gmp/mpir when you want to distribute. Here's how I do it: https://github.com/symengine/symengine-wheels/blob/master/config.sh#L24. Note that you have to enable …
Actually I just reused your …
Yes. You need to add …
Here is a wrapper project I made that builds wheels: https://gitlab.com/alisianoi/flint-py. The README.md there describes the build process a bit. The wheels are on PyPI: https://pypi.org/project/flint-py/. I bumped the version from …
@alisianoi Thanks for sharing! Would it make sense to integrate this with python-flint instead of having a separate project? It doesn't make much difference either way as long as it's maintained in some form.
I think it would definitely make sense to integrate this with python-flint. Setting this up to compile is not completely trivial, so it would be much better if there were precompiled wheels for Windows, OSX and manylinux. They could be built in GitHub Actions.
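For what it's worth, the usual tool for this is cibuildwheel, which can be run locally as well as in CI. A minimal sketch (the platform choice and package path here are illustrative, not the configuration this repo ended up using):

```bash
# Minimal local cibuildwheel run; in GitHub Actions the same tool is normally
# invoked via the pypa/cibuildwheel action. Requires Docker for the linux target.
python -m pip install cibuildwheel
python -m cibuildwheel --platform linux --output-dir wheelhouse .
```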
@oscarbenjamin Indeed. Unfortunately I don't have as much time as I'd like to maintain python-flint, and it'd be awesome if someone else was willing to step up. Not least considering all the new multivariate polynomial stuff (and more) in the latest Flint that hasn't been wrapped in python-flint yet.
Okay so maybe reopen this issue and I'll see what I can do. I've almost managed to build a wheel on OSX (build script is below) but I'm hitting against two problems right now:
The script I have so far is:

#!/usr/bin/env bash
set -o errexit
PYTHON=python3.8
FLINTVER=2.7.1
ARBVER=2.19.0
mkdir -p local
PREFIX=$(pwd)/local
curl -O https://www.flintlib.org/flint-$FLINTVER.tar.gz
curl -O -L https://github.com/fredrik-johansson/arb/archive/refs/tags/$ARBVER.tar.gz
mv $ARBVER.tar.gz arb-$ARBVER.tar.gz
git clone https://github.com/fredrik-johansson/python-flint
tar xf flint-$FLINTVER.tar.gz
cd flint-$FLINTVER
./configure --prefix=$PREFIX
make -j4
make install
cd ..
tar xf arb-$ARBVER.tar.gz
cd arb-$ARBVER
./configure --prefix=$PREFIX --with-flint=$PREFIX
make -j4
make install
cd ..
$PYTHON -m venv venv
source venv/bin/activate
pip install -U pip wheel
pip install numpy cython==0.27.3
# N.B. bugs in both older and newer Cython versions...
cd python-flint
pip wheel . --global-option=build_ext \
--global-option="-I$PREFIX/include" \
--global-option="-L$PREFIX/lib"
pip install . --global-option=build_ext \
--global-option="-I$PREFIX/include" \
--global-option="-L$PREFIX/lib"
cd ..
python -c 'import flint'
Okay I've got this working to make some usable OSX wheels. Changes are needed in setup.py; apply this diff:

diff --git a/setup.py b/setup.py
index 0a739c9..448466a 100644
--- a/setup.py
+++ b/setup.py
@@ -21,7 +21,7 @@ default_include_dirs += [
ext_modules = [
Extension(
- "flint", ["src/pyflint.pyx"],
+ "flint.flint", ["src/flint/pyflint.pyx"],
libraries=libraries,
library_dirs=default_lib_dirs,
include_dirs=default_include_dirs)
@@ -34,6 +34,8 @@ setup(
name='python-flint',
cmdclass={'build_ext': build_ext},
ext_modules=ext_modules,
+ packages=['flint'],
+ package_dir={'': 'src'},
description='Bindings for FLINT and Arb',
version='0.3.0',
url='https://github.com/python-flint/python-flint',

That gives a package structure where the Cython extension lives inside a flint package under src/ (i.e. src/flint/pyflint.pyx alongside a flint/__init__.py).
Once those changes are made, this script generates a relocatable OSX wheel on OSX 11.1 with either Python 3.7 or 3.8 from the python.org installers. It does not work with Python 3.9: I think that is because a newer Cython version is needed, but the newer Cython has a bug that doesn't work for … EDIT: the changes above were made in #14, so this script now works directly with master.

#!/usr/bin/env bash
set -o errexit
PYTHON=python3.8
GMPVER=6.2.1
MPFRVER=4.1.0
FLINTVER=2.7.1
ARBVER=2.19.0
PYTHONFLINTVER=0.3.0
mkdir -p local
PREFIX=$(pwd)/local
curl -O https://gmplib.org/download/gmp/gmp-$GMPVER.tar.xz
curl -O https://www.mpfr.org/mpfr-current/mpfr-$MPFRVER.tar.gz
curl -O https://www.flintlib.org/flint-$FLINTVER.tar.gz
curl -O -L https://github.com/fredrik-johansson/arb/archive/refs/tags/$ARBVER.tar.gz
mv $ARBVER.tar.gz arb-$ARBVER.tar.gz
rm -rf python-flint
git clone https://github.com/fredrik-johansson/python-flint
tar xf gmp-$GMPVER.tar.xz
cd gmp-$GMPVER
./configure --prefix=$PREFIX --enable-fat --enable-shared --host=x86_64-apple-darwin
make -j3
make install
cd ..
tar -xf mpfr-$MPFRVER.tar.gz
cd mpfr-$MPFRVER
./configure --prefix=$PREFIX --with-gmp=$PREFIX
make -j3
make install
cd ..
tar xf flint-$FLINTVER.tar.gz
cd flint-$FLINTVER
./configure --prefix=$PREFIX --with-gmp=$PREFIX --with-mpfr=$PREFIX
make -j3
make install
cd ..
tar xf arb-$ARBVER.tar.gz
cd arb-$ARBVER
./configure --prefix=$PREFIX --with-flint=$PREFIX --with-gmp=$PREFIX --with-mpfr=$PREFIX
make -j3
make install
cd ..
$PYTHON -m venv venv
source venv/bin/activate
pip install -U pip wheel delocate
pip install numpy cython==0.27.3
# N.B. bugs in both older and newer Cython versions...
cd python-flint
pip wheel . --global-option=build_ext \
--global-option="-I$PREFIX/include" \
--global-option="-L$PREFIX/lib"
delocate-wheel -w .. *.whl
cd ..
tags="cp37-cp37m cp38-cp38"
osx_from="11_0"
osx_to="10_9"
for tag in $tags; do
wheelold=python_flint-$PYTHONFLINTVER-$tag-macosx_${osx_from}_x86_64.whl
wheelnew=python_flint-$PYTHONFLINTVER-$tag-macosx_${osx_to}_x86_64.whl
if [ -f $wheelold ]; then
mv $wheelold $wheelnew
fi
if [ -f $wheelnew ]; then
wheelfinal=$wheelnew
fi
done
echo ------------------------------------------
echo
echo Built wheel: $wheelfinal
echo
echo Link dependencies:
delocate-listdeps $wheelfinal
echo
pip install $wheelfinal
echo
echo Demonstration:
echo
python -c 'import flint; print("(3/2)**2 =", flint.fmpq(3, 2)**2)'
echo
echo Done!
echo
echo ------------------------------------------
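As an aside, instead of renaming the wheel tags after the fact, the usual way to get the desired platform tag is to set the deployment target before compiling anything. This is a sketch of that alternative, not what the script above does:

```bash
# Export before the configure/make steps and before `pip wheel` so that the
# C libraries and the extension all target the same minimum OSX version, and
# the wheel is tagged macosx_10_9 directly instead of macosx_11_0.
export MACOSX_DEPLOYMENT_TARGET=10.9
```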
The changes for … It doesn't build GMP or MPFR but still works on my system because I have those installed. I guess I should get it to compile them as well, and then I could see how to get it working in GitHub Actions. Ultimately …
I've updated the script above so that it builds GMP and MPFR as well. I'm unsure about one thing, which is that delocate reports the dependencies as:

That seems to list libflint twice. I'm not sure if something is messed up there or how to fix it...
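One way to cross-check what the delocated extension actually links against is to unpack the wheel and inspect it with otool. This is a diagnostic sketch; the wheel and extension file names are assumptions based on the build above:

```bash
# Unpack the repaired wheel and list the load commands of the extension module.
# Adjust the file names to whatever delocate actually produced.
unzip -o python_flint-0.3.0-cp38-cp38-macosx_10_9_x86_64.whl -d wheel_contents
otool -L wheel_contents/flint/*.so
```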
At the time of writing, after merging #22, the situation is that wheels are built in CI for Linux and OSX using the latest releases of GMP, MPFR, Flint and Arb. Windows is the remaining problem; at the moment importing the module built there fails:

$ python -c 'import flint'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\pc\src\flint\python-flint\install_venv\lib\site-packages\flint\__init__.
py", line 1, in <module>
from ._flint import *
ImportError: DLL load failed while importing _flint: The specified module could not be found.

Although this isn't yet working I thought I'd record the steps I followed for now:
Longer explanation:
This will just install the latest versions of a bunch of packages and I don't think it's possible (or at least it's not recommended) to try to pin them.
After all of these steps there should be a wheel in the dist directory:

$ ls dist
python_flint-0.3.0-cp310-cp310-win_amd64.whl

As mentioned before, though, I found that after installing the wheel I couldn't import it because of the missing DLL error.
This is discussed in #10
For comparison, these are the files I get after running the dependency build:

$ ls .local/include
acb_calc.h acb_mat.h arb.h bool_mat.h fmpz_extras.h mpfr.h
acb_dft.h acb_modular.h arb_hypgeom.h dirichlet.h fmpzi.h partitions.h
acb_dirichlet.h acb_poly.h arb_mat.h dlog.h gmp.h
acb_elliptic.h arb_calc.h arb_poly.h double_interval.h hypgeom.h
acb.h arb_fmpz_poly.h arf.h flint mag.h
acb_hypgeom.h arb_fpwrap.h bernoulli.h fmpr.h mpf2mpfr.h
$ ls .local/lib
libarb.so libflint.so libgmp.la libgmp.so.10.4.1 libmpfr.so.6
libarb.so.2 libflint.so.17 libgmp.so libmpfr.la libmpfr.so.6.1.0
libarb.so.2.14.0 libflint.so.17.0.0 libgmp.so.10 libmpfr.so pkgconfig
$ ls .local/bin
ls: cannot access '.local/bin': No such file or directory
If you look in the build directory:

$ cd build/lib.win-amd64-cpython-310/flint/
$ ls
__init__.py _flint.cp310-win_amd64.pyd
$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => not found
libflint-17.dll => not found
libgmp-10.dll => not found
msvcrt.dll => not found
python310.dll => not found

The kernel32.dll and msvcrt.dll files are system DLLs. The others need to be copied in next to the extension module; after copying libarb-2.dll, libflint-17.dll and libgmp-10.dll:

$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libarb-2.dll
libflint-17.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libflint-17.dll
libgmp-10.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgmp-10.dll
libmpfr-6.dll => not found
libwinpthread-1.dll => not found
msvcrt.dll => not found
python310.dll => not found

Now, after also copying in libmpfr-6.dll:

$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libarb-2.dll
libflint-17.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libflint-17.dll
libgcc_s_seh-1.dll => not found
libgmp-10.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgmp-10.dll
libmpfr-6.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libmpfr-6.dll
libwinpthread-1.dll => not found
msvcrt.dll => not found
python310.dll => not found

The final two DLLs are libgcc_s_seh-1.dll and libwinpthread-1.dll, which come from the mingw-w64 toolchain; after copying those in as well:

$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libarb-2.dll
libflint-17.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libflint-17.dll
libgcc_s_seh-1.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgcc_s_seh-1.dll
libgmp-10.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgmp-10.dll
libmpfr-6.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libmpfr-6.dll
libwinpthread-1.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libwinpthread-1.dll
msvcrt.dll => not found
python310.dll => not found

And so we're good for DLLs. Now let's try importing:

$ cd ..
(install_venv)
pc@NUCPC MINGW64 /c/Users/pc/src/flint/python-flint/build/lib.win-amd64-cpython-310
$ python
Python 3.10.8 (tags/v3.10.8:aaaf517, Oct 11 2022, 16:50:30) [MSC v.1933 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import flint
>>> flint.fmpz(2)
2
>>> flint.fmpz(2) ** 2
73786976329197944836
>>> flint.fmpz(2) + 2
-2079229367761764350
>>> flint.fmpz(2) + flint.fmpz(2)
-2079229359171829756

Okay so we've managed to load the DLLs and import the module. That's step 1, but obviously we need the arithmetic to actually be correct, and the values above clearly aren't. Running the tests:

$ python test/test.py
test_fmpz...Traceback (most recent call last):
File "C:\Users\pc\src\flint\python-flint\test\test.py", line 439, in <module>
sys.stdout.write("test_fmpz..."); test_fmpz(); print("OK")
File "C:\Users\pc\src\flint\python-flint\test\test.py", line 26, in test_fmpz
assert ltype(s) + rtype(t) == s + t
AssertionError
The cibuildwheel project does not do anything to "repair" Windows wheels by bundling DLLs (the way auditwheel does for Linux wheels or delocate does for OSX wheels). There is a project called delvewheel that can do this, though:

$ pip install delvewheel
$ delvewheel show python_flint-0.3.0-cp310-cp310-win_amd64.whl --add-path .local/bin:.local/lib/
Analyzing python_flint-0.3.0-cp310-cp310-win_amd64.whl
The following DLLs will be copied into the wheel.
libflint-17.dll (C:\Users\pc\src\flint\python-flint\.local\bin\libflint-17.dll)
libgmp-10.dll (C:\Users\pc\src\flint\python-flint\.local\bin\libgmp-10.dll)
libmpfr-6.dll (C:\Users\pc\src\flint\python-flint\.local\bin\libmpfr-6.dll)
libarb-2.dll (C:\Users\pc\src\flint\python-flint\.local\lib\libarb-2.dll)
libgcc_s_seh-1.dll (C:\msys64\mingw64\bin\libgcc_s_seh-1.dll)
libwinpthread-1.dll (C:\msys64\mingw64\bin\libwinpthread-1.dll)
The following DLLs are assumed to be present in the end user's environment and will not be copied into the wheel.
kernel32.dll
msvcrt.dll
python310.dll

The repair command fails at first, though:

$ delvewheel repair python_flint-0.3.0-cp310-cp310-win_amd64.whl --add-path .local/bin:.local/lib/
repairing python_flint-0.3.0-cp310-cp310-win_amd64.whl
finding DLL dependencies
copying DLLs into python_flint.libs
mangling DLL names
Traceback (most recent call last):
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
exec(code, run_globals)
File "C:\Users\pc\src\flint\python-flint\.local\venv\Scripts\delvewheel.exe\__main__.py", line 7, in <module>
File "C:\Users\pc\src\flint\python-flint\.local\venv\lib\site-packages\delvewheel\__main__.py", line 67, in main
wr.repair(args.target, no_mangles, args.no_mangle_all, args.lib_sdir)
File "C:\Users\pc\src\flint\python-flint\.local\venv\lib\site-packages\delvewheel\_wheel_repair.py", line 590, in repair
_dll_utils.replace_needed(extension_module_path, needed, name_mangler, self._verbose)
File "C:\Users\pc\src\flint\python-flint\.local\venv\lib\site-packages\delvewheel\_dll_utils.py", line 361, in replace_needed
raise RuntimeError(
RuntimeError: Unable to rename the dependencies of _flint.cp310-win_amd64.pyd because this DLL has trailing data. If this DLL was created with MinGW, run the strip utility. Otherwise, include libgmp-10.dll;libflint-17.dll;libarb-2.dll in the --no-mangle flag. In addition, if you believe that delvewheel should avoid name-mangling a specific DLL by default, open an issue at https://github.com/adang1345/delvewheel/issues and include this error message.
The fix, as the error message suggests, is to pass --no-mangle for these DLLs:

$ delvewheel repair python_flint-0.3.0-cp310-cp310-win_amd64.whl --add-path .local/bin:.local/lib/ '--no-mangle=libflint-17.dll;libarb-2.dll;libgmp-10.dll;libmpfr-6.dll;libgcc_s_seh-1.dll'
repairing python_flint-0.3.0-cp310-cp310-win_amd64.whl
finding DLL dependencies
copying DLLs into python_flint.libs
mangling DLL names
patching flint\__init__.py
updating python_flint-0.3.0.dist-info\RECORD
repackaging wheel
fixed wheel written to C:\Users\pc\src\flint\python-flint\wheelhouse\python_flint-0.3.0-cp310-cp310-win_amd64.whl

The …
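To see what actually ended up inside the repaired wheel, listing its contents is a quick check (not from the original thread; the wheel path matches the delvewheel output above):

```bash
# List the DLLs that delvewheel bundled into the repaired wheel.
unzip -l wheelhouse/python_flint-0.3.0-cp310-cp310-win_amd64.whl | grep -i dll
```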
Here is a script that I've tested more or less end to end in a fresh msys2 install. This builds and tests a Windows wheel:

#!/bin/bash
#
# make_wheels.sh
#
# Build relocatable Windows wheel for python_flint using the mingw-w64
# toolchain in the MSYS2 environment.
#
# - First install Python
#
# https://www.python.org/ftp/python/3.10.8/python-3.10.8-amd64.exe
#
# - Then checkout the code:
#
#   $ git clone https://github.com/fredrik-johansson/python-flint
#
# - Then install msys2
#
# https://repo.msys2.org/distrib/x86_64/msys2-x86_64-20221028.exe
#
# - Then open msys2, cd into the checked out repo. Change setup.py to say
#
# libraries = ["arb", "flint", "mpfr", "gmp"]
#
# - Then run this script.
set -o errexit
PYTHONDIR=/c/Users/pc/AppData/Local/Programs/Python/Python310/
WHEELNAME=python_flint-0.3.0-cp310-cp310-win_amd64.whl
# Install the mingw-w64 toolchain
pacman -S --noconfirm mingw-w64-x86_64-gcc m4 make mingw-w64-x86_64-tools-git
# This takes ~1hr
bin/build_dependencies_unix.sh
# Add the libpython310.a file to Python installation
cd $PYTHONDIR
gendef python310.dll
dlltool --dllname python310.dll --def python310.def --output-lib libpython310.a
mv libpython310.a libs
cd -
# Make a virtual environment to install the build dependencies
python -m venv .local/venv
source .local/venv/Scripts/activate
pip install numpy cython wheel delvewheel
# Build the wheel
C_INCLUDE_PATH=.local/include/ LIBRARY_PATH=.local/lib/ python setup.py build_ext -cmingw32 -f bdist_wheel
# Make the wheel relocatable
delvewheel repair dist/python_flint-0.3.0-cp310-cp310-win_amd64.whl \
--add-path .local/bin:.local/lib/ \
'--no-mangle=libflint-17.dll;libarb-2.dll;libgmp-10.dll;libmpfr-6.dll;libgcc_s_seh-1.dll'
# Make a virtual environment to test the wheel
python -m venv test_env
source test_env/Scripts/activate
pip install wheelhouse/$WHEELNAME
python -c 'import flint; print(flint.fmpz(2) + 2)'  # 2 + 2 = 4?

Of course the output is that …
The outstanding items are:
That's now fixed by #37. Now wheels are built in CI for Windows, OSX and Linux, and all tests pass on all platforms, although some parts of the codebase are not yet covered by the tests.
I'd recommend using Cirrus CI for macOS arm64 wheels. It provides free native arm64 runners, and it has been a nice experience using Cirrus CI for SciPy and the other projects that I've seen try it. universal2 wheels are kinda pointless; thin wheels are much better (and easier) to produce.
Thanks Ralf. Do you know why the SciPy Cirrus CI setup uses homebrew to build the wheels rather than one of the Cirrus CI Python images? Does that have any effect on binary compatibility?
I don't remember why that is - it shouldn't have any effect on binary compatibility AFAIK. @andyfaff can you comment on the Homebrew usage?
I suppose the homebrew Python is only used to run cibuildwheel, which will then install its own versions of Python to do the actual wheel building, so it shouldn't affect binary compatibility. I guess the reason for using homebrew is that with Cirrus CI you have to choose which image to use, and if you want to run on OSX arm64 you do … (https://cirrus-ci.org/guide/macOS/). The Cirrus CI recommended way to have Python ready is to do … (https://cirrus-ci.org/examples/#building-pypi-packages). I don't understand exactly what all of these configuration options do. Maybe it's not possible to combine those and we're forced to choose between the two images, in which case we can either choose to have Python installed or we can choose to use arm64 but then have to install everything ourselves (except homebrew, which is preinstalled).
That's correct. Homebrew Python is only used to install cibuildwheel. When running, cibuildwheel takes control of installing all the Pythons required for each wheel. For each Python 3.x wheel you build it downloads and installs the canonical CPython interpreter, which is then used to build the wheel. Binary compatibility is then with that Python. Cibuildwheel uses a similar process for Windows. On Linux cibuildwheel uses the PyPA manylinux docker images to build wheels. For scipy we use Cirrus to build macosx_arm64 wheels and GitHub to build macosx_x86_64. The thin wheels are smaller in size. If absolutely required you might be able to direct cibuildwheel to make a fat binary on Cirrus (macOS has no issues cross compiling C code on an M1), but it would depend on how complex your build process is. We don't do this with scipy as we have to build Fortran code as well; as soon as Fortran compilers are involved things get messy.
Thanks @andyfaff. Unless I misunderstand, it looks like it's straightforward for cibuildwheel to build any of x86_64, arm64 and universal2 under OSX with either architecture as the build host. The tricky parts if not using purely native builds are: …

If there was a way to get the artefact from GitHub Actions to Cirrus then I guess we would only need Cirrus just to test the wheels under M1.
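For reference, cibuildwheel exposes the macOS architecture choice via the CIBW_ARCHS_MACOS setting, so cross-building arm64 (or universal2) wheels from an x86_64 runner is mostly a one-line change. A sketch; whether the native C dependencies cross-compile cleanly is the real question:

```bash
# Build both thin wheel flavours on an x86_64 macOS runner; the arm64 wheel
# cannot be *tested* on that runner, which is where Cirrus/M1 comes in.
export CIBW_ARCHS_MACOS="x86_64 arm64"
python -m cibuildwheel --platform macos --output-dir wheelhouse .
```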
Thanks all for your help. I've opened #42 which adds Cirrus CI config to build OSX arm64 wheels. I already tested it in my fork including downloading the wheels and testing them locally.
Cirrus is also useful for building linux_aarch64 natively.
Is there much demand for linux_aarch64 wheels? I'm aware that there are some size limits on PyPI although I'm not sure exactly what they are. Currently with linux_x86_64, win_x86_64, macos_x86_64 and macos_arm64 and 3 versions of Python we're up to 150 megabytes of wheels for the next release of python_flint. The Linux wheels are the biggest, weighing in at 30 megabytes each, so I guess aarch64 would be another 100 megabytes. Certainly for now at least I think aarch64 is out of scope for the next release. We're very close to having the platforms above sorted and then we should get a release of python_flint out ASAP. The outstanding issues btw are: …
Scipy has started making linux_aarch64 wheels. We were getting a few requests for them, I don't have the stats to hand.
I believe the default limit on PyPI is 60 MB per wheel. I have uploaded single releases to PyPI where the 25+ wheels weighed in at over 300 MB without a hitch.
My experience is that essentially all Windows computers have msvcrt.dll …
I'm not concerned that the end user's system might not have msvcrt.dll. I think it's pretty common and won't go away for a long time. My concern is about mixing C runtimes within the same process. To be clear, I am absolutely not any kind of expert on these things and I'll be very happy to accept that others know more than me. Let me explain my understanding so that I can be corrected if I've got it wrong...

Here I am building _flint.pyd, which is a DLL file, and I'm bundling it into a wheel with a bunch of other DLLs (libgmp.dll etc). The intended target for these is a user who is using the standard builds of CPython for Windows, which are built using MSVC and link against the UCRT DLLs. When the user uses this module they will be using a CPython executable plus its own DLLs that are all linked against UCRT DLLs. Currently the DLLs in the Windows wheel here are linked against msvcrt.dll, so both C runtimes end up loaded in the same process.

My understanding is that it can be possible to mix C runtimes like this if it is done very carefully, and there are various rules like not passing file handles or other things between the different parts that use different runtimes. I don't know what all the rules are for doing this and I also don't really have a good way to audit the 4 different libraries that are being bundled in order to see whether this is being done correctly. However it seems to work okay right now: the wheel works at runtime, doesn't crash, and passes all tests both locally and in CI.

I made a PR #41 which uses the msys2 UCRT toolchain to build all the dependencies. I've tested that part locally and in CI and it seems that the DLLs for GMP, Flint etc all link against UCRT as intended. However it seems like the _flint.pyd that is built by numpy.distutils using mingw64 somehow still links against msvcrt.dll. Those wheels build okay but then crash immediately at runtime. I think what that shows is that there are potential problems involved with mixing runtimes, although the way it's done currently with the Windows wheels seems to work so far. I'm not sure how exactly to tell numpy.distutils to use the UCRT toolchain for building _flint.pyd (I think it is using it...) or how to tell it that msvcrt.dll should not be linked.

One possible solution is to switch from using msys2 and the mingw64 toolchain when building _flint.pyd and use MSVC for that. I just gave that a try and it fails because MSVC can't find …
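For checking which C runtime a given DLL or .pyd actually links against, objdump from the MinGW toolchain can be used. A diagnostic sketch, not something from the thread; the file path is the one used earlier in this discussion:

```bash
# List the DLLs the extension module imports; seeing msvcrt.dll here
# (rather than ucrtbase.dll / the api-ms-win-crt-*.dll forwarders)
# means the old MSVCRT runtime is being linked.
objdump -p build/lib.win-amd64-cpython-310/flint/_flint.cp310-win_amd64.pyd | grep "DLL Name"
```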
If you're only compiling C code and not Fortran, then can't you just use the normal build chain and avoid mingw64?
To be fair I haven't actually tried to build GMP with MSVC. I was really just following the approach taken by gmpy since I know they've been working at this for longer. The problem with MSVC is compiling GMP. This is also why gmpy uses mingw64 for building the C dependencies (aleaxit/gmpy#352 (comment)).

The attraction of GMP is that it is really fast at integer arithmetic, and much of what this collection of C libraries does is arranged around leveraging that speed by using fast integer multiplication to speed up everything else. GMP gets this speed in part by having a lot of hand-written assembly code for different CPUs.

There is a fork of GMP called MPIR which was intended to have better MSVC support among other things. I think that's what the current conda packages of python-flint for Windows use. MPIR is no longer maintained though, so I think it's better to use GMP.
I've just merged #42 which builds and tests wheels for macos on arm64. That means that CI now builds and tests the following wheels:
I think that's good to go and it's time for a new release of python_flint so we can let people test these wheels out.
Hi Oscar, MPIR is still fully supported on Windows x64 (by me) and I have a complete working build chain using Visual Studio for MPIR, MPFR, MPC, FLINT2 and ARB. I support GMPY2 and I would hope to support Python Flint on Windows x64 using MPIR. If I can help in what you are doing please let me know.
Last night I pushed a 0.4.0 release of python-flint to PyPI with binary wheels for Windows x86-64, OSX x86-64 and arm64 and manylinux_2_17 x86-64. I was hoping I could close this issue but unfortunately I immediately discovered a problem with the Linux wheels when testing them locally:
I've deleted the Linux wheels from PyPI now, so … In the meantime, though, …
I have tested locally on Windows and OSX x86-64, but I don't immediately have access to OSX arm64 (i.e. Apple M1 hardware) to test there. Theoretically all these wheels were tested in CI, but as seen for the Linux wheels that is not enough to guarantee that they work outside of CI.
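For anyone wanting to reproduce the local testing, the rough procedure is just to install the published wheel into a clean environment and run the test suite from a checkout. A sketch; the version pin and paths are illustrative:

```bash
# Install the published wheel (not the sdist) and run the repo's test script.
python -m venv testenv && source testenv/bin/activate
pip install --only-binary=:all: python-flint==0.4.0
git clone https://github.com/fredrik-johansson/python-flint
python python-flint/test/test.py
```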
So the Linux wheels seem to be double badged as both manylinux_2_17_x86_64 and manylinux2014_x86_64:

$ wheel unpack python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Unpacking to: python_flint-0.4.0...OK
$ tree python_flint-0.4.0
python_flint-0.4.0
├── flint
│ ├── _flint.cpython-311-x86_64-linux-gnu.so
│ └── __init__.py
├── python_flint-0.4.0.dist-info
│ ├── LICENSE
│ ├── METADATA
│ ├── RECORD
│ ├── top_level.txt
│ └── WHEEL
└── python_flint.libs
├── libarb-4c2cc9a4.so.2.14.0
├── libflint-c1ff2639.so.17.0.0
├── libgmp-bfc87f94.so.10.5.0
└── libmpfr-90ec1309.so.6.1.0
3 directories, 11 files

It looks like auditwheel has bundled the right .so files for the immediate dependencies but has not bundled anything else. I am not sure if that should be considered correct or not. I guess we can check with auditwheel:

$ auditwheel show python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
is consistent with the following platform tag:
"manylinux_2_17_x86_64".
The wheel references external versioned symbols in these
system-provided shared libraries: libc.so.6 with versions
{'GLIBC_2.3', 'GLIBC_2.7', 'GLIBC_2.14', 'GLIBC_2.2.5'},
libpthread.so.0 with versions {'GLIBC_2.3.2', 'GLIBC_2.2.5',
'GLIBC_2.3.4'}, libm.so.6 with versions {'GLIBC_2.2.5'}
This constrains the platform tag to "manylinux_2_17_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.

In CI when building the wheels auditwheel says: …

The configuration for this is here: … I think that I have previously seen some issue with downloading a Linux wheel from CI, and that possibly at the time I "fixed" it by using a different manylinux image; at least that is the first thing I can try.
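For the record, the manylinux build image is selectable through cibuildwheel's options, so trying a different image is a one-line change. CIBW_MANYLINUX_X86_64_IMAGE is a real cibuildwheel setting, but the specific image named here is just an example of what one might try:

```bash
# Ask cibuildwheel to build inside the manylinux2014 image instead of the
# default; manylinux_2_24 or manylinux_2_28 can be substituted the same way.
export CIBW_MANYLINUX_X86_64_IMAGE=manylinux2014
python -m cibuildwheel --platform linux --output-dir wheelhouse .
```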
It seems that NumPy's wheels are also double badged and auditwheel reports a similar result:

$ auditwheel show numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
is consistent with the following platform tag:
"manylinux_2_17_x86_64".
The wheel references external versioned symbols in these
system-provided shared libraries: libm.so.6 with versions
{'GLIBC_2.2.5'}, libgcc_s.so.1 with versions {'GCC_4.2.0',
'GCC_4.3.0', 'GCC_3.0', 'GCC_3.4', 'GCC_3.3', 'GCC_4.8.0'},
libquadmath-96973f99.so.0.0.0 with versions {'QUADMATH_1.0'},
libc.so.6 with versions {'GLIBC_2.14', 'GLIBC_2.4', 'GLIBC_2.17',
'GLIBC_2.3', 'GLIBC_2.3.4', 'GLIBC_2.6', 'GLIBC_2.3.2', 'GLIBC_2.2.5',
'GLIBC_2.10', 'GLIBC_2.7'}, libpthread.so.0 with versions
{'GLIBC_2.3.2', 'GLIBC_2.3.4', 'GLIBC_2.2.5'},
libgfortran-040039e1.so.5.0.0 with versions {'GFORTRAN_8'}
This constrains the platform tag to "manylinux_2_17_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.

Also the contents of the wheel seem similar, at least for what is bundled: …
I've tried manylinux2010 (#46) and manylinux_2_24 (#47) but in all cases I get the same crash. The problem is specifically in string conversion, e.g. converting an fmpz to a decimal string, so my guess is that that codepath goes through to libm or libc somehow and then some ABI compatibility breaks. I don't know whether any of GMP, MPFR, Flint or Arb is using libm or libc in a non-standard way that manylinux/auditwheel just can't account for when attempting to produce a relocatable wheel. Converting to a hex string works for sufficiently small values but goes wrong or crashes for larger ones:

>>> import flint
>>> hex(flint.fmpz(2))
'0x2'
>>> hex(flint.fmpz(2<<2))
'0x8'
>>> hex(flint.fmpz(2<<61))
'0x0'
>>> hex(flint.fmpz(2<<60))
'0x2000000000000000'
>>> hex(flint.fmpz(2<<62))
Illegal instruction (core dumped)

Worth noting that the internal representation of an fmpz changes at 2^62 (smaller values are stored inline rather than via GMP), which matches where the hex conversion starts going wrong above. Decimal conversion with str() fails even for tiny values:

>>> import flint
>>> str(flint.fmpz(0))
'0'
>>> str(flint.fmpz(1))
Illegal instruction (core dumped)

Conversion is done using … I have a branch that already uses flint-3.0.0-alpha1 in gh-43! Downloading those wheels on Linux also fails, though not in exactly the same way. It works up to 2**62 - 1:

(Pdb) p str(flint.fmpz(2**62-1)) == str(2**62-1)
True
(Pdb) p str(flint.fmpz(2**62)) == str(2**62)
Illegal instruction (core dumped)

Even if using flint-3.0.0-alpha1 did work, that would really just be working around the ABI issue. Somehow cibuildwheel/manylinux/auditwheel are not doing their job properly because the wheel is not relocatable.
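To narrow down where the illegal instruction comes from, a backtrace from gdb on the failing call is probably the most direct next step. This is a debugging sketch, not something that was done in the thread; it assumes the broken wheel is installed in the current environment:

```bash
# Run the crashing one-liner under gdb and print a backtrace when it dies.
gdb -batch -ex run -ex bt --args python -c 'import flint; str(flint.fmpz(1))'
```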
All tests passed on macOS 13 (Ventura) with Python 3.9, 3.10, 3.11 (downloaded from Python.org) on an iMac Pro 2017. All tests passed on Windows 10 Pro with Python 3.9 and 3.11 (downloaded from Python.org) on a Surface Pro 2017 (i5-7300U). It might be worth adding the test files to the wheels inside a submodule so that one could do something like … to run all the tests without cloning the source repo.
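Something like the following is presumably what was meant; note that the flint.test module name here is purely hypothetical, just to illustrate the idea of shipping the tests inside the installed package:

```bash
# Hypothetical: if the test files were packaged as a submodule (e.g.
# flint.test), an installed wheel could be tested without a source checkout.
pip install python-flint
python -m flint.test
```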
Thanks @NathanDunfield!
Am I right in presuming that would be x86-64 (i.e. not the M1 CPU)?
Yes, that is a good idea. I've included them in the sdist but they are not in the wheels. The wheels are already quite large because of the bundled binaries so the test files would not noticeably increase the size.
Yes, it's an Intel processor (Xeon W-2150B), not Apple Silicon.
I've just pushed a 0.4.1 release to PyPI. The issue with Linux wheels remains unresolved: … I have no more time to work on this and it is unclear to me how to investigate this further in any case. Unless anyone has any bright ideas I just need to defer this problem. Instead of fixing the Linux wheels issue I have pushed updated instructions for building from sdist on Linux: … On the bright side the Windows and OSX wheels seem to work very well. Thanks to @NathanDunfield for testing on Windows and OSX x86-64 for a range of Python versions. I have now also tested on OSX arm64 and that seems to work great as well. Maybe this issue should be closed and replaced with a Linux-specific wheel issue.
Oh, also as of the 0.4.1 release the tests can be run against the installed distribution as suggested by @NathanDunfield above: …
I've opened #51 to track the problem with the Linux wheels. In the meantime there are wheels for Windows and OSX on PyPI and I have tested them on both platforms and on both architectures for OSX. I'm going to close this issue, so please use #51 to follow up on the problem with the Linux wheels. Thanks everyone above for your help!
FYI I'm working on uploading wheels to PyPI over at https://github.com/xoviat/build-python-flint. After some pain, I've managed to build python-flint on Windows and will probably have builds for OSX and manylinux as well.