>> No idea. I'm keeping an eye on upstream release but the only option
>> I can see to make it work is to depend on the LLVM version that does
>> not exist in unstable.
>
> I found some time and energy to rebase the 1092 llvm-19 compatibility
> branch against 0.44.0
> https://github.com/detrout/ll
> pybuild-autopkgtest(1) has an example of this.
sorry for the noise
>>This way it would be possible to deal with the Dependencies on our own
>>and avoid the problem of missing dependencies due to these @builddep@
>>variables.
>
> Do you mean you don't want it to use @builddeps@ or what is the
> Making sure the installed code works.
I expect the same :)
> But also you should understand that what you want to do is very different
> from running the test suite, as you explicitly don't want to install deps
> needed for running it.
I want to run the same test suite (which is most often pro
> You said "I want to check during the autopkgtest that the runtime
> dependencies are well defines in the package" and for that you shouldn't
> have any additional packages installed, but to run tests you normally need
> additional packages, usually quite a lot of them.
I was speaking about the test-specific dependencies.
>>I was speaking about the test-specific dependencies.
>>
>>not the dependencies of the packages (dependencies which are imported in the
>>upstream module code).
>
> Well, "the runtime dependencies" means the latter.
agreed
>
>>> Most of them are, considering that the build process of a pure-Pyth
> Runtime sure, and we already use that.
Do you have a pointer to this logic? It would be great to be able to update the
B-D via this mechanism like we do with cabal-debian.
We should have a pip-debian which could take care of upgrading the B-D in the
control file each time we integrate a new u
> I've dropped using @builddep@ in general within the packages I'm
> involved in, to achieve exactly what Andrey has written: I want to detect
> broken or missing dependencies, and tests broken by changed dependencies in
> other depending packages. And I don't want to get this hidden by packages
> that are w
> This sounds like a job for a custom autopkgtest, not for one that runs
> build-time tests.
In that case, what is the purpose of pybuild-autopkgtest?
We are already running tests almost automatically via pybuild during the build.
It seems that it was written to avoid code duplication between d/
Hello,
Is it possible, instead of using pybuild-plugin-autopkgtest during the build,
to call the same logic from d/t/control?
something like
Test-Command: pybuild-autopkgtest-run
This way it would be possible to deal with the Dependencies on our own
and avoid the problem of missing dependencies
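A d/tests/control stanza along these lines could express that. This is a sketch only: pybuild-autopkgtest-run is the hypothetical command proposed above, and the exact Depends line would vary per package; the point is that the test dependencies are spelled out by hand instead of being pulled in via @builddeps@:

```
Test-Command: pybuild-autopkgtest-run
Depends: @, python3-pytest
Restrictions: allow-stderr
```

With `Depends: @` plus only the test runner, autopkgtest installs just the binary packages and pytest, so a runtime dependency missing from Depends surfaces as an ImportError instead of being masked by the build dependencies.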
I solved this issue by running the test like this
HOME="$AUTOPKGTEST_TMP" WITH_QT_TEST=False SILX_TEST_LOW_MEM=False xvfb-run -a
--server-args="-screen 0 1024x768x24" $py -m pytest --pyargs silx 2>&1
thanks
Fred
Hello, I am trying to solve this issue
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1099038
My question is: why does pytest try to create a file in the module's system
library?
Thanks for your help.
Fred
> this is the same as we did for the Python 3.12 transition. Please note
> that we don't enable any of the experimental features in Python 3.12 (no
> GIL, JIT compilation), so assuming there are currently no other RC
> issues in your packages, there should be plenty of time to fix any 3.13
> related
do we know how long we will have to fix all the FTBFS and autopkgtest failures
before the freeze?
I am a bit worried about the scientific stack; will we have enough time to work
with our upstreams in order to fix all these FTBFS? In the scientific stack,
things are going slowly.
We are not 100% of o
> Do you think there is value in those that we don't have in
> autopkgtest-pkg-pybuild? in my experience conda test commands are just
> ad-hoc commands to run e.g. pytest, but my experience with them is narrow.
I do not have a lot of experience yet, but at least the test suite is most
of the
Hello,
do you know if there exists a tool which automatically produces autopkgtest
test snippets from a conda meta.yaml file?
Since plenty of upstreams are writing these kinds of scripts, it would be great
to execute these tests via autopkgtest when possible.
I have a student who is writing a to
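The conversion such a tool would do can be sketched in a few lines. The dict below stands in for a parsed conda recipe (a real tool would load meta.yaml with pyyaml); "mypkg" and its commands are made up for illustration:

```python
# Sketch of a meta.yaml "test:" section -> d/tests/control stanza conversion.
meta = {
    "test": {
        "commands": ["pytest --pyargs mypkg", "mypkg-cli --version"],
        "requires": ["pytest"],
    }
}

def to_autopkgtest(recipe):
    """Emit a d/tests/control stanza from a parsed conda recipe dict."""
    test = recipe.get("test", {})
    command = " && ".join(test.get("commands", []))
    # "@" expands to the package's own binary packages in d/tests/control;
    # mapping "requires" entries to python3-* names is a naive assumption.
    depends = ", ".join(["@"] + sorted("python3-" + r for r in test.get("requires", [])))
    return "Test-Command: %s\nDepends: %s\n" % (command, depends)

print(to_autopkgtest(meta))
```

The naive `python3-` prefix mapping would of course need the same Python-to-Debian name translation that dh_python3 already maintains.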
> Those will correspond to build dependencies in a typical simple case, as
> the upstream tests need all those optional deps, assuming you mean the
> tests dependencies (and not the runtime dependencies which should already
> be in Depends).
Yes the runtime dependencies should be listed in the pyt
> Maybe we indeed want a "minimal" autopkgtest environment, but many
> upstream tests will fail in those and I don't see an automatic way to test
> a random package in this way.
Even if not minimal, it should at least correspond to the dependencies upstream
declares. By 'declares' I am not even sure of the
My use case is to check that all the Dependencies computed by dh_python3 from
the build tools are indeed listed in the Depends of the binary package.
I am thinking about packages which provide optional dependencies via extras.
In that case the extra should be declared when calling dh_python3.
So I
I have one concern when using pybuild autopkgtest.
It installs the build dependencies by default, which is not what people are
doing when they install the packages.
It should be possible to define the autopkgtest dependencies. This way we could
catch missing dependencies in python- dependencies
> I'm not 100% sure I understand your question, but is something
> preventing you from installing the script with a
> debian/binoculars.install file?
nothing, but it seems to me (I may be wrong) that pybuild installs all files
directly in the python3- package.
Am I wrong?
I end up with the scrip
Hello,
I am modernizing the binoculars package. I switched it to pyproject.toml and
now I need to update the packaging.
I would like your advice on how to replace this d/rules:
---
export DH_VERBOSE=1
export PYBUILD_NAME=binoculars
export PYBUILD_AFTER_INSTALL=rm -rf {destdir}/usr/bin/
%:
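The catch-all target above is truncated; it usually continues with a plain dh call. A sketch of the modernized rules under that assumption (for a pyproject.toml-only build, pybuild-plugin-pyproject would also need to be in Build-Depends):

```make
#!/usr/bin/make -f
export DH_VERBOSE = 1
export PYBUILD_NAME = binoculars
# scripts are shipped separately, e.g. via debian/binoculars.install
export PYBUILD_AFTER_INSTALL = rm -rf {destdir}/usr/bin/

%:
	dh $@ --buildsystem=pybuild --with python3
```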
ok for me
- On 4 Jan 24, at 13:19, Alexandre Detiste alexandre.deti...@gmail.com
wrote:
> On Thu, 4 Jan 2024 at 07:48, Andreas Tille wrote:
>> > @Vincent: this one package "gtextfsm" is yours
>> > do you green light an upload ?
>>
>> If you ask me the package is team maintained and a
None, before_build=None, build_args=None, after_build=None,
before_install=None, install_args=None, after_install=None, before_test=None,
test_args=None, after_test=None, test_nose=False, test_nose2=False,
test_pytest=False, test_tox=False, test_custom=False,
dir='/home/picca/
Hello, I am updating the xraylarch package which contains something like this
in the setup.cfg:
```
install_requires =
asteval>=0.9.28
numpy>=1.20
scipy>=1.7
uncertainties>=3.1.4
lmfit>=1.2.1
pyshortcuts>=1.9.0
xraydb>=4.5
silx>=0.15.2
matplotlib>=3.5
sqlalch
```
Hello,
I am the maintainer of silx
I have this problem with the gui application
$ silx view
Traceback (most recent call last):
File "/usr/bin/silx", line 33, in
sys.exit(load_entry_point('silx==1.1.2', 'console_scripts', 'silx')())
The shebang distributed by upstream did not change:
#!python
but previously it was replaced by a python3 shebang.
here for the previous version
D: dh_python3 dh_python3:179: version: 5.20230109
D: dh_python3 dh_python3:180: argv: ['/usr/bin/dh_python3', '-i',
'-O--buildsystem=pybuild']
D:
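For the plain `#!python` case, the rewrite in question amounts to little more than the following. This is a sketch of what dh_python3 normally does to installed scripts, not its actual code:

```python
import re

def fix_shebang(script_text, interpreter="/usr/bin/python3"):
    """Replace an interpreter-less '#!python...' shebang with a concrete one."""
    return re.sub(r"^#!\s*python\S*", "#!" + interpreter, script_text, count=1)

print(fix_shebang("#!python\nprint('hi')\n"))
```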
Hello, I updated pymca with the new upstream version, and now the pymca package
depends on the missing python2:any package (which is not available in Debian).
This dependency was generated by dh_python3 during the arch:all build.
dh_python3 -i -O--buildsystem=pybuild
D: dh_python3 dh_python3:179:
Hello Louis
> It seems the only thing this line does is to install /usr/bin/silx. This can
> be
> done 'manually' via
> dh_install (see man dh_install).
Yes, it installs only this script for now. I can do it by hand, but in that
case I need to let Python build the script from the entry point an
Hello,
I try to update the silx package and I want to replace this call
python3 setup.py install_scripts -d debian/silx/usr/bin
with the right call without setup.py.
thanks for your help
Frederic
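One way to replace install_scripts without setup.py is to generate the wrapper by hand from the entry point. A sketch, where the entry-point spec `silx = silx.__main__:main` is an assumption for illustration, not necessarily silx's real console_scripts entry:

```python
# Template mirroring the stub that setuptools' install_scripts used to write.
STUB = """\
#!/usr/bin/python3
import sys
from {module} import {func}

if __name__ == "__main__":
    sys.exit({func}())
"""

def make_stub(entry_point):
    """Turn an entry-point spec 'name = pkg.mod:func' into (name, script body)."""
    name, _, target = entry_point.partition("=")
    module, _, func = target.strip().partition(":")
    return name.strip(), STUB.format(module=module, func=func)

name, body = make_stub("silx = silx.__main__:main")
print(name)
print(body)
```

The generated file would then be installed into debian/silx/usr/bin with the usual dh_install machinery.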
There is a fix from the upstream around enum.
https://github.com/boostorg/python/commit/a218babc8daee904a83f550fb66e5cb3f1cb3013
Fix enum_type_object type on Python 3.11
The enum_type_object type inherits from PyLong_Type which is not tracked
by the GC. Instances doesn't have to be tracked by
My pytango package has the same problem...
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1024078
I: pybuild base:240: cd /<>/.pybuild/cpython3_3.11_tango/build;
python3.11 -m pytest tests
ImportError while loading conftest
'/<>/.pybuild/cpython3_3.11_tango/build/tests/conftest.py'.
tests/c
in order to debug this, I started gdb,
set a breakpoint in init_module_scitbx_linalg_ext,
then a catch throw, and I ended up with this backtrace:
Catchpoint 2 (exception thrown), 0x770a90a1 in __cxxabiv1::__cxa_throw
(obj=0xb542e0, tinfo=0x772d8200 , dest=0x772c1290
) at
../../../..
Hello, I am trying to fix this bug
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1024859
I find the error message not very informative...
:~/$ python3.11
Python 3.11.0+ (main, Nov 4 2022, 09:23:33) [GCC 12.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
Hello, just for info.
I can confirm that the problem was in the upstream build system, which was not
compatible with setuptools > 60.
After patching the build system, I can confirm that, without any change to the
rules file, it works out of the box.
So there is no issue in pybuild :)), or a
ild:379: clean: plugin distutils failed with: exit code=1:
python3.10 setup.py clean
dh_auto_clean: error: pybuild --clean -i python{version} -p 3.10 returned exit
code 13
make: *** [debian/rules:16: clean] Error 25
dpkg-buildpackage: error: fakeroot debian/rules clean subprocess returned exit
stat
Ok, I understand better what is going on.
In pyproject.toml there are these lines:
[build-system]
requires = [
"wheel",
"setuptools<60.0.0",
"oldest-supported-numpy",
"scipy",
"sphinx",
"nbsphinx",
"silx>=0.10",
"Cython>=0.25"
]
indeed, the setuptools < 60 constraint can not
- Original message -
From: Scott Kitterman <deb...@kitterman.com>
To: debian-python@lists.debian.org
Sent: Tue, 01 Nov 2022 20:36:20 +0100 (CET)
Subject: Re: build package xrayutilities - wheel and pip with setuptools
On Tuesday, November 1, 2022 3:31:47 PM EDT PICCA Frederic-Emmanuel wrote:
> >
> I don't think it should do that, so we need to investigate. Where can I
> find
> the updated packaging?
I have not pushed the change yet; I will push once I solve this issue :).
My opinion is that I should force it via PYBUILD_SYSTEM=distutils.
Fred
>It looks to me like the current pyproject.toml file for pyfai is not
>sufficient
> to build the package, so I would be tempted to keep what you have now.
Due to the presence of this file, pybuild tries to build using the "new way"
instead of building with setup.py.
I do not know if other package a
> As far as I can see, the package doesn't ship any files in /usr/bin.
> Why do
> you need to build man pages (I'm assuming that's what that's
> for? More
> generically, what problem did that step in the process solve that's not
> solved
> now?
this is for the pyfai package which I need to
thanks for your help.
I have one more question.
I have this command from the previous build:
{interpreter} setup.py build_man
How can I translate this to the new build system?
> Hello Frederic,
Hello Carsten
> could you please provide next time direct links to the VCS/Tracker of
> your package? That saves time searching for the correct package on my
> or other people's side. Also a descriptive subject line helps me to
> decide if I want to spend time on takin
On 2022-10-30 00:39, Scott Kitterman wrote:
Adding pybuild-plugin-pyproject to build-depends should solve it. It
looks like it is trying to do the new pyproject.toml style build
without all the necessary parts in place.
the package contains a pyproject.toml file and a setup.py one.
the bu
Hello, I am trying to fix an FTBFS in the python-xrayutilities package.
When I try to build it, I get this error message. I do not understand
why I need to add a build dependency on python3-wheel. Is it something
missing in the dependencies of python3-setuptools or python3.10?
thanks for your help
F
Hello, I am packaging the latest tomopy package. This software uses ctypes to
open some extensions during the import, but I have this error message:
>>> import tomopy
Traceback (most recent call last):
File "", line 1, in
File "/usr/lib/python3/dist-packages/tomopy/__init__.py", line 63, in
from to
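The ctypes pattern described above boils down to locating a shared library and binding its symbols at import time. A sketch with libm as a stand-in for tomopy's own extension; if find_library returns None, the library is simply not on the loader path, which produces exactly this kind of import-time failure:

```python
import ctypes
import ctypes.util

# find_library searches the same places the dynamic loader does;
# "m" (libm) stands in for the package's private extension here.
path = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(path)

# Without restype/argtypes, ctypes assumes int and the result is garbage.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(9.0))
```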
For the record, I found it... upstream modifies HDF5_PLUGIN_PATH when
loading the dxtbx module.
They assume they are running under conda and override the path. All this is
useless on Debian since the plugins are properly installed system-wide.
Cheers
Fred
# Ensures that HDF5 has the conda_base
Hello Neil
> Looks like you need a -v option to see more detail.
thanks for the advice; by removing files one by one I found that the failing
behaviour is due to the import of the library itself.
The failing test PASSES on its own, but if I add an unused import dxtbx
inside, it fails.
so ther
=
platform linux -- Python 3.10.6, pytest-7.1.2, pluggy-1.0.0+repack
rootdir: /home/picca/debian/science-team/dxtbx, configfile: pytest.ini
plugins: requests-mock-1.9.3, forked-1.4.0, xdist-2.5.0, mock-3.8.2
Hello Stephano
I end up with this
#! /usr/bin/make -f
export DH_VERBOSE=1
export PYBUILD_NAME=dxtbx
export PYBUILD_SYSTEM=distutils
export PYBUILD_AFTER_CONFIGURE=cmake -DPython_EXECUTABLE=/usr/bin/{interpreter}
-S . -B {build_dir}/lib
export PYBUILD_AFTER_BUILD=make -C {build_dir}/lib
expor
> Oh. In this case setting PYTHONPATH (if it works, and I'm not 100% sure
> it
> will) sounds like a better option.
> So, the cmake build path is, as far as I know, defined in
> /usr/share/perl5/Debian/Debhelper/Buildsystem.pm to be just
> obj-$DEB_HOST_GNU_TYPE".
thanks for the info. It seems to
> /usr/share/perl5/Debian/Debhelper/Buildsystem.pm to be just
> "obj-$DEB_HOST_GNU_TYPE".
Thanks for the info, if I an not wrong during the build process we can setup a
new builddir.
So is is possible to obtain the real builddir during the build ?
> When trying to run tests you should look at how upstream intends to
> run them.
Yes, they build the module in place like this,
from https://github.com/dials/dxtbx/blob/main/.azure-pipelines/unix-build.yml
# Build dxtbx
- bash: |
set -e
. conda_base/bin/activate
set -ux
Hello Andrey
> Does the same happen when you run the test in the source tree manually?
I do not know; I am in the process of building the package in sbuild, so I am
just trying to fix the build process.
If I hard-code the lib path, the tests run but fail for other missing modules,
but this is
Hello, I am packaging a python extension which uses cmake as its build system.
here the repo
https://salsa.debian.org/science-team/dxtbx
I tried to activate the tests, but this causes this kind of trouble:
During handling of the above exception, another exception occurred:
/usr/lib/python3.10/importlib/__i
Hello, I am working on the silx package, and the upstream install_requires is
sort of wrong.
It depends on hdf5plugin, which is not necessary on Debian.
The purpose of this hdf5plugin is to register an HDF5 plugin when it is
loaded.
The application code uses a try/except in order to
> Hello,
> I’d suggest you build it from source (python.org/ftp... with the needed
> version) as an additional python version, and then create your venv using the
> 3.6.
> You can dm me if you might need more details.
It would be great to have a python-builder package which generates a pythonX
Hello, I am working on the lmfit-py package
lintian complain about this
https://salsa.debian.org/science-team/lmfit-py/-/jobs/909498
I use sphinx, so my question is: do you know how to fix this issue?
lots of packages are affected by this:
https://packages.debian.org/search?searchon=contents&key
What about helping
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=946035
python-language-server
is the only one missing now.
cheers
> If you want to embed python in an application, you need to use
> python3-embed.pc
> Or python3-config --embed
then it links the program with -lpython3.8
so what is the purpose of python3.pc ?
thanks
Fred
returned 1 exit status
so it seems that the program is not linked with the Python3 library
I use pkg-config to obtain the library
picca@2a02-8420-6c55-6500-d012-4688-0bee-a0c6:~/hkl/contrib/haskell$ pkg-config
--libs python3
picca@2a02-8420-6c55-6500-d012-4688-0bee-a0c6:~/hkl/contrib/haskell
I found this
--sourcedirectory=src
is it equivalent to -D
A subsidiary question: is it possible to run a command before all the
dh_auto_xxx steps without overriding everything?
I need to run a command which generates the setup.py file, so I need to do
override_dh_auto_:
do_something
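For the record, debhelper has had hook targets since version 12.8 that avoid overriding each dh_auto_* step individually. A sketch, where `./generate-setup-py` is a made-up name for whatever command produces the setup.py:

```make
# Runs once, before dh_auto_configure, without overriding anything.
execute_before_dh_auto_configure:
	./generate-setup-py
```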
Hello, I have a package where the setup.py is not located at the root of the
directory.
So I need to do
override_dh_auto_XXX:
dh_auto_XXX -- -d
Is there an export or something which allows specifying where the setup.py
to use is located?
thanks
Frederic
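Both spellings below point the build at a subdirectory; a sketch assuming the setup.py lives in src/ (per pybuild(1), every pybuild option can also be set as a PYBUILD_* environment variable):

```make
#!/usr/bin/make -f
# Either export the pybuild option...
export PYBUILD_DIR = src

%:
	dh $@ --buildsystem=pybuild
# ...or pass it through dh instead, e.g.:
#	dh $@ --buildsystem=pybuild --sourcedirectory=src
```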
> You should consider /usr/lib// if you want to make your
> package multiarch-safe.
And what about ?
/usr/lib//
which one is better ?
> > The issue is that the current build system do not provide rpath for
> > these libraries so I can not add one via chrpath.
> Well, ideally you need to fix the build system so that it sets the correct
> rpath directly.
I found patchelf, which allows adding an rpath :))
So I just need to set the
Hello, I am working on the dials[1] package. This scientific software
produces a bunch of python extensions via boost python, but also a bunch
of libraries, which are the common part of the python extensions.
When packaging it, I moved the common libraries under the /usr/lib//
directory. This way the
> Lintian-brush is a fine tool, but (correct me if I am wrong) it would
> generate a patch excluding badges, and patches require maintenance.
You are right
maybe we should have a dh_privacy helper for this purpose.
cheers
Fred
what about lintian-brush ?
Hello, I am working on the vitables package.
during the build I get this error message [1]
===== test session starts =====
platform linux -- Python 3.7.6, pytest-4.6.9, py-1.8.1, pluggy-0.13.0
rootdir: /builds/science-team/vitables/debian
Indeed.
Sorry for the noise.
Frederic
Hello,
I am packaging a python application, so I decided to put the module under the
private directory
/usr/share/
but this software contains a cython extension.
So in the end I have a lintian error due to a binary file under /usr/share.
What is the best solution when we need to package a so
Hello
on unstable it works but on buster (and we need to make it work for buster)
we have this message
Python 3.7.3 (default, Apr 3 2019, 05:39:12)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pkg_resources
>>> pkg_resources.get_distribu
Hello,
We are working on the next pyfai package.
This new version uses entry_points like this:
gui_requires = ['PyQt5', 'h5py', 'hdf5plugin', 'PyOpenGL']
opencl_requires = ['pyopencl']
extras_require = {
'calib2': gui_requires, # Keep compatibility
'gui': gui_requires,
Hello,
what if, in the end, upstream could take care of the Debian packaging by
adding a .salsa-ci.yml file to the upstream directory, in order to have
feedback with nice badges?
Cheers
Hello, in one of my packages (pymca), there is a syntax error like this:
byte-compiling
/builds/science-team/pymca/debian/output/pymca-5.5.2+dfsg/debian/python-pymca5/usr/lib/python2.7/dist-packages/PyMca5/Object3D/Object3DPlugins/ChimeraStack.py
to ChimeraStack.pyc
File
"/usr/lib/python2.7/di
Hello, I would like to know if there is an equivalent of cabal-debian[1],
which helps a lot with maintaining haskell packages.
It allows doing the initial packaging and also upgrading the packaging, by
updating the Build-Dependencies, etc...
Cheers
Frederic
[1] https://hackage.haskell.org/pac
Hello Sandro,
I can not find python-pyqtgraph in your list.
It seems to me that this package has reverse dependencies, but the python2
binaries were removed... but this is another problem.
Cheers
Frederic
Hello Sandro,
> I've just submitted
> https://salsa.debian.org/python-team/tools/dh-python/merge_requests/9
> to address this; not sure how quickly it will get merged & released
thanks a lot, but what about backports? On backports we still need this mapping.
Cheers
Hello,
I am preparing the new spyder package.
Since the removal of pylint3 from src:pylint,
I need to remove the Build-Depends: pylint3.
Now dh_python3 still produces a pylint3 dependency for the binary packages.
It seems that this is hard-coded here[0].
Is it a bug in dh-python?
Cheers
F
Hello,
> The Python 3 variant of the package should begin provide the
> /usr/bin/foo wrapper script interface when the Python 2 package is
> dropped.
> This really ought to be codified somewhere, and I'm not sure that the
> DPMT wiki page is visible enough or will be consulted by maintainers
> wh
Hello, it seems that the pylint package does not provide pylint3 anymore
(since 21h ;)
But the spyder package still requires pylint3 and pylint when installing
spyder or spyder3.
This is why the next taurus will FTBFS.
So is this expected and the spyder package should be fixed, or is it a bug i
it would be nice if the python2 packages could be skipped in bullseye but kept
for the backports in buster.
Is that something which could be envisioned?
Fred
Hello,
while preparing one of our packages (pyfai), we ended up with an FTBFS due to
backports.functools_lru_cache [1].
Here is the backtrace:
Traceback (most recent call last):
File "./run_tests.py", line 543, in
unittest.defaultTestLoader.loadTestsFromNames(options.test_name))
File "/usr/lib
Hello, here is a diff between the python3.6 and python3.7 modules once
converted via 2to3.
r:/tmp$ diff core*
174c174
< for (variance, tag) in zip(variances, tags))
---
> for (variance, tag) in list(zip(variances, tags)))
181c181
< for (coords, value) in zip(trans
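The `list(zip(...))` wrapping visible in the diff is 2to3 compensating for Python 3's lazy zip; a minimal illustration:

```python
# In Python 3, zip() returns a one-shot iterator, not a list.
pairs = zip([1, 2], ["a", "b"])
first = list(pairs)
second = list(pairs)  # the iterator is already exhausted

print(first)
print(second)
```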
Hello Andreas,
> Patches are welcome (I have no idea what the construct is doing neither
> how to replace it with something valid).
> Patch welcome as well - preferably as commit to Git.
done, but now we need to understand why lintian complains about a python
module in the wrong place before uploa
I think that the real problem with the current build is that the conf.py file
changes sys.path;
this is why we see this syntax error:
sphinx picks the wrong path.
I can not work on this now... I am not in front of a Debian box, nor do I
have access to one today...
Cheers
Fred
I found in the code a string with a ur'' prefix.
This is the problematic line.
I do not know if this is a valid string construction.
I also found that you need to remove the sys.path modifications from the
conf.py;
this can cause some trouble during the build.
Fred
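For reference, the `ur''` prefix was valid in Python 2 but removed in Python 3 (u'' came back in 3.3 and r'' stayed, but the combination did not), which can be checked directly:

```python
# Compiling a ur'' literal under Python 3 raises SyntaxError.
try:
    compile("s = ur'raw unicode'", "<example>", "exec")
    removed = False
except SyntaxError:
    removed = True

print(removed)
```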
I found the culprit: the conf.py file of the documentation prepends ".." to
sys.path before importing the module.
This is why it uses the wrong version of the built module.
Now during the build I have this
D: dh_python3 dh_python3:164: args: []
D: dh_python3 dh_python3:166: supported Python vers
Now the test part :))
Correlated variables. ... ok
Tests the input of correlated value. ... ok
==
ERROR: Failure: ImportError (No module named tests.support)
--
> try adding python3-setuptools to Build-Depends
ok, I removed all the black magic from debian/rules and added setuptools :)
so now, I have this error when building the documentation
PYTHONPATH=`pybuild --print build_dir --interpreter python3`
http_proxy='http://127.0.0.1:9/' sphinx-build -N
You are right, I did not notice that setuptools was not part of the build
dependencies...
Hello Andreas, it seems to me that the problem is due to the 2to3 conversion.
I looked at the first failure when you re-activated the unit tests[1].
In my opinion, the code is modified in place by 2to3,
so the source code after the configure step is already converted to
python3.
And during t
Hello Andreas,
during the tests, does it load the modules from the source files or does it
use the ones under the build directory?
Maybe there is a mismatch between the python2 code and the 2to3 code
targeting python3.
Did that help?
Fred
I think that there is a problem with cffi
pyopencl was built with
python3-cffi-backend i386 1.11.5-1 [80.2 kB]
but the backend used for the test is the current 1.11.5-3.
here the Debian changelog
python-cffi (1.11.5-3) unstable; urgency=medium
[ Ondřej Nový ]
* Use 'python3 -m sphinx
Hello,
I rebuilt pyopencl, and the problem vanished.
So what should I do now?
Ask for a binNMU, or try to understand what is going on?
thanks for your time.
Fred
picca@mordor:/tmp$ python3.7-dbg -c "import pyopencl"
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pypars
tWarning)
* ob
object :
type: tuple
refcount: 0
address : 0xb5f507a4
* op->_ob_prev->_ob_next
object : Segmentation fault
picca@mordor:~$ python3.7 -c "import pyopencl"
maybe the problem is in pyopencl.
But I find it really strange that I get the warning messages with python3.7-dbg
but not with python3.7.
Is that normal?
Ok, I could simplify the problem to a single import
picca@mordor:~$ python3.7-dbg
Python 3.7.0+ (default, Aug 31 2018, 23:21:37)
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import silx.opencl
/us
Hello,
I try to understand this [1] test failure with python3.7-dbg
so, I ran this on my unstable box and I got this error from within gdb
testTrainingSpectrumReading (specfilewrapperTest.testSpecfilewrapper) ... ok
--
Ran 43 t
Hello,
I am trying to upgrade spyder and activate the unit tests during the build.
But when pybuild runs the tests, it ends with this error message:
dh_auto_test: pybuild --test --test-pytest -i python{version} -p "3.7 3.6"
returned exit code 13
with no other information.
So my question is: how ca
> remove control file and invoke py2dsp - it will regenerate it
> That said, you probably want dch (debchange) rather than new control
> file
Thanks a lot,
It would be nice to have an equivalent of dgit-maint-xxx for maintaining
python packages.
Maybe in the policy?
This way newcomers
Hello,
once debianized, is there a command which allows updating the control file for
a new upstream version, in order to take into account the new python
dependencies?
It would simplify the maintenance of python packages a lot.
py2dsp update
like the cme command ?
Cheers
Frederic