On 5 October 2015 at 20:43, Tim <jtim.arn...@gmail.com> wrote:
>
> I have a package I want to share but have a question about packaging.
>
> Mostly the package is pure python code, but it also requires some binary
> libraries (*.so, *.dll, *.dylib). I want to bundle these libs so users don't
> have to compile. The package will run on *nix/windows/mac platforms.
>
> Currently I handle this in setup.py. In the 'build' phase, I copy the
> platform-specific libs to a subdirectory called 'libs'.
>
>     class MyBuilder(build_py):
>         def run(self):
>             # conditional logic for copying
>             # appropriate library files to 'libs'
>             # etc etc.
>             build_py.run()
>
> And that seems to work, but after reading more from the Python Packaging
> Authority, I wonder if that is the right way. Should I be using wheels
> instead? I think my brain fried a little bit while going through the doc.
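For reference, the build_py subclass sketched in the question could be filled in roughly like this (a sketch only; the 'prebuilt' source directory, the 'mypkg/libs' destination, and the helper name are hypothetical, not from the original post):

```python
import shutil
import sys
from pathlib import Path

from setuptools.command.build_py import build_py


def platform_lib_suffix():
    """Map the running platform to the shared-library extension to bundle."""
    if sys.platform.startswith("win"):
        return ".dll"
    if sys.platform == "darwin":
        return ".dylib"
    return ".so"


class MyBuilder(build_py):
    def run(self):
        # Copy only the libraries matching the current platform into 'libs',
        # then hand off to the normal build_py machinery.
        suffix = platform_lib_suffix()
        dest = Path("mypkg", "libs")
        dest.mkdir(parents=True, exist_ok=True)
        for lib in Path("prebuilt").glob("*" + suffix):
            shutil.copy2(lib, dest / lib.name)
        build_py.run(self)
```

In setup.py this would be wired in via cmdclass={"build_py": MyBuilder}.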
The idea of a wheel is that you want to distribute your code fully precompiled to end users who will be able to install it without needing any C compilers etc. Of course this requires you to supply wheels for each platform of interest. If this is what you want to do then yes, absolutely use wheels.

Note that if you have installed setuptools and wheel, and you use setuptools in your setup.py, then building a wheel is as simple as running "python setup.py bdist_wheel" (once your setup.py is complete).

If the binary libraries in question are extension modules then you should just declare them as such in your setup.py, and distutils/setuptools/wheel will take care of bundling them into the wheel. If the binary libraries are not extension modules and you are building them separately (not using distutils) then you can declare them as "data files" [1] so that they will be bundled into your wheel and installed alongside your Python code.

[1] https://packaging.python.org/en/latest/distributing/#package-data

--
Oscar
--
https://mail.python.org/mailman/listinfo/python-list