Description
Particularly with the tweaks in #57, it should be pretty straightforward to build source and binary wheel distributions and upload them to PyPI and conda-forge, if there is any interest. It would make the project more accessible to newcomers (particularly students and those newer to the field), simplify package and dependency management for users, and increase visibility to reach a wider audience.
Task list of things that need to be done:
- Modernize packaging (again)
  - Fix any immediate packaging/Python compat problems
  - Move static config to pyproject.toml
  - Update manifest & pyproject.toml
  - Update and simplify remaining setup.py
  - Update README/etc. with up-to-date guidance
  - Update/drop any other legacy stuff/Python support
- Add support for publishing wheels to PyPI
  - Drop in, test and tweak cibuildwheel GHA workflow
  - Add key to repo secrets & drop in PyPI GHA workflow
- Tag a new 1.4.0 release
- Publish to conda-forge
  - Run grayskull on the PyPI package
  - Tweak/manually add anything necessary
  - Push to staged-recipes & go through the review process
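As a rough illustration of the "move static config to pyproject.toml" step, a minimal sketch (the package name, version, and dependency pins here are hypothetical placeholders, not the project's actual metadata):

```toml
[build-system]
# Build-time requirements; numpy is assumed here only because the
# project builds compiled extensions against it (an assumption).
requires = ["setuptools>=61", "wheel", "numpy"]
build-backend = "setuptools.build_meta"

[project]
name = "yourpackage"            # hypothetical placeholder
version = "1.4.0"
requires-python = ">=3.7"
dependencies = ["numpy"]        # illustrative runtime dependency
```

Anything that still needs to run logic at build time (e.g. compiling extensions) would stay in the slimmed-down setup.py.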
Mostly out of date now that cibuildwheel and grayskull exist
Doing so should be as simple as running `python setup.py sdist` on any machine, then `python setup.py bdist_wheel` on each platform and Python version you'd like to build binary wheels for, and finally `twine upload dist/*` to upload to PyPI. Binary wheels make installation easier, faster, and less error-prone, while source installation will still work on any platform that has the compilers. The only somewhat complex factor is the build platforms; building binary distributions is optional, so if any of this is too much work you can distribute only select builds, or go source-only, which is minimal effort yet brings all the benefits of pip/PyPI distribution aside from avoiding the need to build locally.
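Collected as a sketch, the commands described above would look like this (run from the project root; the `dist/` layout is the setuptools default):

```shell
# Build a source distribution (works on any machine)
python setup.py sdist

# Build a binary wheel -- repeat on each platform/Python version
# you want to ship wheels for
python setup.py bdist_wheel

# Upload everything under dist/ to PyPI
twine upload dist/*
```

As the collapsed summary notes, `python -m build` plus cibuildwheel now supersede the direct `setup.py` invocations, but the overall flow is the same.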
Alternatively (and probably preferably), you could go source-only on PyPI (or ship only a few select wheels, e.g. for Windows) and use conda-forge to automate most of the build process, like PyART does. I should be able to help get that set up if there's interest, and it shouldn't be too involved. If we go that route, most of the preceding text is unnecessary, since the whole build process is almost entirely automated: it only requires some initial configuration, conda-build/conda-smithy handles the rest, and all the builds are done automatically in the CIs. The added advantage is that the package becomes natively installable with conda without resorting to pip, and binaries are automatically compiled for all major platforms and Python versions, making installation and maintenance a breeze (rebuilds or updates are mostly automated, and usually require changing at most a line or two).
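For the conda-forge route, the initial configuration amounts to a recipe roughly like the following sketch (package name, version, hash, license, and dependencies are all hypothetical placeholders); conda-smithy then generates and maintains the CI builds from it:

```yaml
# recipe/meta.yaml -- hypothetical sketch of a conda-forge recipe
package:
  name: yourpackage        # placeholder, not the real project name
  version: "1.4.0"

source:
  url: https://pypi.io/packages/source/y/yourpackage/yourpackage-1.4.0.tar.gz
  sha256: 0000000000000000000000000000000000000000000000000000000000000000  # placeholder

build:
  number: 0
  script: "{{ PYTHON }} -m pip install . -vv"

requirements:
  build:
    - "{{ compiler('c') }}"   # assuming C extensions need a compiler
  host:
    - python
    - pip
    - numpy                   # illustrative build-time dependency
  run:
    - python
    - numpy                   # illustrative runtime dependency

about:
  license: MIT               # placeholder
```

Grayskull can generate most of this automatically from the PyPI package, leaving only minor manual tweaks before pushing to staged-recipes.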
Let me know if I can be of further help!
Even more out of date build matrix details now that cibuildwheel exists
The most critical OS is (of course) Windows, due to the difficulty and overhead of users building it themselves; you can build all the required package versions automatically with the free AppVeyor CI service, or in a few minutes on any local machine with Anaconda/Miniconda (to switch Python versions quickly) and the compilers installed; I'd be happy to help there if needed. macOS is less critical but can similarly be done locally, or with Travis now that they offer macOS builds. Linux is a bit more complicated due to the variety of distributions and glibc versions, so building locally can be rather fragile and may not work on other distros; thankfully there's the manylinux project, officially endorsed by the PSF/PyPA, which offers pre-built Docker containers that can easily be run locally or automatically via Travis to generate the required builds.
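The manylinux workflow mentioned above can be sketched roughly as follows (the image tag, Python interpreter path, and mount layout are illustrative; `auditwheel repair` retags the wheel as manylinux-compatible):

```shell
# Build a Linux wheel inside the PyPA manylinux container (illustrative)
docker run --rm -v "$PWD":/io quay.io/pypa/manylinux1_x86_64 /bin/bash -c '
  /opt/python/cp37-cp37m/bin/pip wheel /io -w /io/wheelhouse/
  auditwheel repair /io/wheelhouse/*.whl -w /io/dist/
'
```

Running this once per Python version in the container covers the whole Linux side of the matrix; cibuildwheel now wraps exactly this loop automatically.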
As for the build matrix: given that 64-bit systems have been widely available for well over 10 years (and exclusively available for close to that), you can probably stick with just x64 builds. Python 2.7, 3.5, 3.6 and 3.7 (if the package actually runs properly on it; it very likely does, but I haven't formally tested) should cover the overwhelming majority of users (in our statistics on the Spyder repo, with somewhere around 3000+ issue reports per year and growing, IIRC it's been years since we've seen a Python 3.x user with <3.5), and avoids the complexity of building on 3.4 due to the problematic VS versions and hacky workarounds necessary for x64 builds.
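Concretely, the x64-only matrix described above works out to a dozen wheel builds. A small sketch just to enumerate the combinations (the platform labels are illustrative, not any tool's actual identifiers):

```python
from itertools import product

# Platforms and Python versions from the discussion above;
# x64-only, per the recommendation.
platforms = ["windows", "macos", "manylinux1"]
pythons = ["2.7", "3.5", "3.6", "3.7"]
arch = "x86_64"

# One entry per wheel that would need to be built
matrix = [
    {"platform": plat, "python": py, "arch": arch}
    for plat, py in product(platforms, pythons)
]

print(len(matrix))  # 3 platforms x 4 Python versions = 12 wheel builds
```

A source distribution on top of these covers everyone else.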