On 2019-07-27 22:04, Drew Parsons wrote:
I've uploaded python-scipy 1.2.2 to unstable.
Previously the build system ran through distutils, but dh now gives an
error on that and says pybuild should be used instead. So I
reorganised debian/rules to use dh --buildsystem=pybuild, and also
dropped the second invocation of dh from the build-arch rule.
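For reference, the core of that change is just the standard pybuild
invocation, something like this (simplified sketch; the real rules
file also handles the python2 packages and passes more options):

  %:
          dh $@ --buildsystem=pybuild --with python3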
The build completes successfully, but it takes about 3 times longer
than it did before and uses around 75% more disk space.
...
Is an increase in build resources like this known to happen when
pybuild is used instead of distutils?
This seems to be why scipy 1.2.2-1 used more build resources:
scipy modules are built in 2 steps:
1) setup.py config_fc --noarch build (with override_dh_auto_configure)
2) setup.py install ... --force --no-compile (with
override_dh_auto_install)
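In debian/rules terms the two overrides look roughly like this
(simplified to a single python version; the elided install options
are omitted here):

  override_dh_auto_configure:
          python3 setup.py config_fc --noarch build

  override_dh_auto_install:
          python3 setup.py install ... --force --no-compile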
Previously (building with distutils), both steps built via a
build/src.linux-x86_64-* builddir, so the --no-compile flag meant object
files were not compiled twice.
Now with pybuild, the config_fc step still builds in
build/src.linux-x86_64-*, but the install step is handled via
build/src.linux-amd64-*. So object files are compiled twice: first
for x86_64, then again for amd64. Likewise on arm64, the "configure"
step builds in an aarch64 builddir and the "install" step then
rebuilds via an arm64 builddir.
Looks like some discrepancy between setup.py config_fc and pybuild's
setup.py install in handling DEB_*_ARCH_CPU and DEB_*_GNU_CPU.
DEB_*_CPU is not used explicitly by scipy's debian/rules.
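For reference, the two variable families report different CPU names.
On an amd64 buildd:

  $ dpkg-architecture -qDEB_HOST_GNU_CPU
  x86_64
  $ dpkg-architecture -qDEB_HOST_ARCH_CPU
  amd64

and on arm64 GNU_CPU is aarch64 while ARCH_CPU is arm64, which
matches the pairs of builddir names above.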
Is setup.py config_fc even needed? Is it a hangover from distutils,
not needed in a pybuild build? It's not well documented and not listed
by "python3 setup.py --help-commands", though it is mentioned in
INSTALL.rst.txt (with a brief reference in scipy/special/setup.py and
scipy/integrate/setup.py). But debian/rules doesn't use it with
--fcompiler to specify the fortran compiler.
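(As far as I can tell, the usual documented use of config_fc is to
select the fortran compiler, along the lines of

  python3 setup.py config_fc --fcompiler=gnu95 build

where gnu95 means gfortran. Since we don't pass --fcompiler at all,
the config_fc call may not be doing anything useful for us.)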
Drew