Re: How to package a Python module with shared libraries ?
On Sunday, 25 April 2010 at 19:37 +0100, Floris Bruynooghe wrote:
> You could try setting RUNPATH/RPATH into the extension module. If
> it's using distutils you can use --rpath to setup.py build_ext, see
> the --help of the build_ext sub-command.

Unfortunately that won’t work, since the rpath used by dlopen() is the
one provided by the binary, not the one for the module being loaded.
This is a known issue with interpreters, and no one has been able to
provide a decent solution.

A possible way would be to make a wrapper Python extension that would
parse the module for its NEEDED entries and load them manually, using
the given rpath, before loading the actual module.

That, in turn, is not possible, since there is no shared version of
libbfd, which would be required to do this.

Cheers,
-- 
 .''`.        Josselin Mouette
: :' :
`. `'   “A handshake with whitnesses is the same
  `-     as a signed contact.” -- Jörg Schilling
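A minimal sketch of that manual-preload idea in pure Python with ctypes,
rather than as a C wrapper extension; the directory, library names and
module name below are illustrative assumptions, and the libraries would
have to be listed deepest dependency first:

# Sketch: preload the private libraries by absolute path, then import the
# extension module.  The directory, library names and module name are
# illustrative assumptions, not Scilab's real list; order the list so that
# the deepest dependencies come first.
import ctypes
import os

SCILAB_LIBDIR = "/usr/lib/scilab"
NEEDED = ["libscilab.so"]      # would normally come from the module's DT_NEEDED

_handles = []
for name in NEEDED:
    # RTLD_GLOBAL makes the symbols visible to the extension module
    # imported afterwards, which expects them to be already resolved.
    _handles.append(ctypes.CDLL(os.path.join(SCILAB_LIBDIR, name),
                                mode=ctypes.RTLD_GLOBAL))

# import scilab  # hypothetical extension module, imported only now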
Re: How to package a Python module with shared libraries ?
On Mon, Apr 26, 2010 at 09:44:52AM +0200, Josselin Mouette wrote:
> On Sunday, 25 April 2010 at 19:37 +0100, Floris Bruynooghe wrote:
> > You could try setting RUNPATH/RPATH into the extension module. If
> > it's using distutils you can use --rpath to setup.py build_ext, see
> > the --help of the build_ext sub-command.
>
> Unfortunately that won’t work, since the rpath used by dlopen() is the
> one provided by the binary, not the one for the module being loaded.

You are correct; I drew the wrong conclusions from some earlier
experience, but after checking the dlopen manual I see my memory must
have become confused over time.

> This is a known issue with interpreters, and no one has been able to
> provide a decent solution.
>
> A possible way would be to make a wrapper Python extension that would
> parse the module for its NEEDED entries and load them manually, using
> the given rpath, before loading the actual module.
>
> That, in turn, is not possible, since there is no shared version of
> libbfd, which would be required to do this.

Is it not possible to do this by just including elf.h and using the
structures defined there to find the relevant NEEDED and RPATH/RUNPATH
tags in the dynamic section?  It seems like a substantial amount of
code would be required, though.

Regards
Floris

-- 
Debian GNU/Linux -- The Power of Freedom
www.debian.org | www.gnu.org | www.kernel.org
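Sketched in Python rather than C for brevity, but following the same
layout elf.h defines, reading those tags could look roughly like this;
it assumes a 64-bit little-endian ELF and does no error checking:

# Minimal sketch: read DT_NEEDED and DT_RPATH/DT_RUNPATH straight from the
# dynamic section, mirroring the elf.h structures.  Assumes a 64-bit
# little-endian ELF; no malformed-input handling.
import struct
import sys

PT_LOAD, PT_DYNAMIC = 1, 2
DT_NULL, DT_NEEDED, DT_STRTAB, DT_RPATH, DT_RUNPATH = 0, 1, 5, 15, 29

def dynamic_entries(path):
    with open(path, "rb") as f:
        data = f.read()
    # Elf64_Ehdr: program header table offset, entry size and count
    e_phoff, = struct.unpack_from("<Q", data, 32)
    e_phentsize, e_phnum = struct.unpack_from("<HH", data, 54)
    loads, dyn = [], None
    for i in range(e_phnum):
        p_type, _fl, p_offset, p_vaddr, _pa, p_filesz, _msz, _al = \
            struct.unpack_from("<IIQQQQQQ", data, e_phoff + i * e_phentsize)
        if p_type == PT_LOAD:
            loads.append((p_vaddr, p_offset, p_filesz))
        elif p_type == PT_DYNAMIC:
            dyn = (p_offset, p_filesz)
    # Walk the Elf64_Dyn array
    needed_offs, strtab_vaddr, rpath_off = [], None, None
    off, end = dyn[0], dyn[0] + dyn[1]
    while off < end:
        d_tag, d_val = struct.unpack_from("<qQ", data, off)
        off += 16
        if d_tag == DT_NULL:
            break
        if d_tag == DT_NEEDED:
            needed_offs.append(d_val)
        elif d_tag == DT_STRTAB:
            strtab_vaddr = d_val
        elif d_tag in (DT_RPATH, DT_RUNPATH):
            rpath_off = d_val
    # Translate the string table virtual address into a file offset
    strtab = next(o + (strtab_vaddr - v) for v, o, sz in loads
                  if v <= strtab_vaddr < v + sz)
    def cstr(o):
        return data[strtab + o:data.index(b"\0", strtab + o)].decode()
    needed = [cstr(o) for o in needed_offs]
    rpath = cstr(rpath_off) if rpath_off is not None else None
    return needed, rpath

if __name__ == "__main__":
    print(dynamic_entries(sys.argv[1]))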
Re: How to package a Python module with shared libraries ?
On Monday, 26 April 2010 at 00:14 +0200, Jakub Wilk wrote:
> * Sylvestre Ledru, 2010-04-25, 23:33:
> >> > * All Scilab libraries are stored in /usr/lib/scilab/
> >> >
> >> > * cdll.LoadLibrary("/usr/lib/scilab/libscilab.so") will fail,
> >> > since /usr/lib/scilab is not in the search path for libraries
> [...]
> >> > * I don't want to load all the Scilab libraries explicitly (there
> >> > are quite a few)
> >> >
> >> > * Calling Python the following way makes things work:
> >> > LD_LIBRARY_PATH=/usr/lib/scilab python
> >> > but I want to avoid requiring the user to do such manipulations
> >>
> >> You could try setting RUNPATH/RPATH into the extension module. If
> >> it's using distutils you can use --rpath to setup.py build_ext, see
> >> the --help of the build_ext sub-command.
> >
> > I forgot to say that I also tried with the rpath, but Python's
> > library loading does not seem to follow it.
> >
> > Since I am also the packager of Scilab, I could also move the
> > libraries into /usr/lib, but since Scilab has many libraries, I was
> > hoping to keep them separate.  If there is no other way, though, I
> > could do that...
>
> Or you could just try building the libraries in /usr/lib/scilab with
> rpath.  Then cdll.LoadLibrary("/usr/lib/scilab/libscilab.so") should
> work as intended.

Indeed.  I rebuilt Scilab with the rpath and it works.  Thanks for the
tip!

I wonder what the best solution is:
* move all the content of /usr/lib/scilab/ to /usr/lib/ (more than 70
  shared libraries).  But I would have to do that upstream as well,
  just for this Python module.
* add the rpath to the libraries in /usr/lib/scilab/

(Sorry if I am straying beyond the scope of Python here.)

Sylvestre
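For reference, roughly what the working setup looks like from the Python
side once the libraries themselves carry the rpath; the paths below are
illustrative:

# Once libscilab.so itself carries RPATH/RUNPATH = /usr/lib/scilab (e.g. by
# linking it with -Wl,-rpath,/usr/lib/scilab), loading it by absolute path
# is enough: its NEEDED siblings are found through the library's own rpath,
# without any LD_LIBRARY_PATH tricks.
import ctypes

scilab = ctypes.CDLL("/usr/lib/scilab/libscilab.so", mode=ctypes.RTLD_GLOBAL)

# The embedded rpath can be checked from a shell with:
#   readelf -d /usr/lib/scilab/libscilab.so | grep -Ei 'rpath|runpath'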
Re: How to package a Python module with shared libraries ?
On Monday, 26 April 2010 at 13:26 +0200, Sylvestre Ledru wrote:
> Indeed.  I rebuilt Scilab with the rpath and it works.  Thanks for the
> tip!
>
> I wonder what the best solution is:
> * move all the content of /usr/lib/scilab/ to /usr/lib/ (more than 70
>   shared libraries).  But I would have to do that upstream as well,
>   just for this Python module.

This is feasible only if upstream commits to ABI stability for these 70
libraries, or if you introduce Debian-specific versioning.

> * add the rpath to the libraries in /usr/lib/scilab/

This is usually the correct solution in such cases.  But it will
prevent the Python module from loading, given how the toolchain
currently works.

Cheers,
-- 
 .''`.        Josselin Mouette
: :' :
`. `'   “A handshake with whitnesses is the same
  `-     as a signed contact.” -- Jörg Schilling
Re: How to package a Python module with shared libraries ?
On Monday, 26 April 2010 at 11:04 +0100, Floris Bruynooghe wrote:
> On Mon, Apr 26, 2010 at 09:44:52AM +0200, Josselin Mouette wrote:
> > Unfortunately that won’t work, since the rpath used by dlopen() is the
> > one provided by the binary, not the one for the module being loaded.
>
> You are correct; I drew the wrong conclusions from some earlier
> experience, but after checking the dlopen manual I see my memory must
> have become confused over time.

You have a good excuse, since this behavior is completely
counter-intuitive.

> > A possible way would be to make a wrapper Python extension that would
> > parse the module for its NEEDED entries and load them manually, using
> > the given rpath, before loading the actual module.
> >
> > That, in turn, is not possible, since there is no shared version of
> > libbfd, which would be required to do this.
>
> Is it not possible to do this by just including elf.h and using the
> structures defined there to find the relevant NEEDED and RPATH/RUNPATH
> tags in the dynamic section?  It seems like a substantial amount of
> code would be required, though.

Yes, but it might be worth writing a generic Python extension to do
that, given that we already have several packages affected.

-- 
 .''`.        Josselin Mouette
: :' :
`. `'   “A handshake with whitnesses is the same
  `-     as a signed contact.” -- Jörg Schilling
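Such a generic helper could even stay in pure Python; a rough sketch
that leans on readelf output instead of parsing the ELF structures
directly (the helper name and the example usage are hypothetical):

# Rough sketch of a generic "preload the NEEDED libraries from the rpath"
# helper.  It parses `readelf -d` output rather than the raw ELF structures;
# the helper name and the example module path are hypothetical.
import ctypes
import importlib
import os
import re
import subprocess

def import_with_rpath(module_name, ext_path):
    out = subprocess.check_output(["readelf", "-d", ext_path],
                                  universal_newlines=True)
    needed = re.findall(r"\(NEEDED\).*\[(.+?)\]", out)
    rpaths = re.findall(r"\((?:RPATH|RUNPATH)\).*\[(.+?)\]", out)
    search = rpaths[0].split(":") if rpaths else []
    handles = []
    for name in needed:
        for d in search:
            path = os.path.join(d, name)
            if os.path.exists(path):
                # RTLD_GLOBAL so the extension module finds these symbols
                handles.append(ctypes.CDLL(path, mode=ctypes.RTLD_GLOBAL))
                break
        # Libraries not found under the rpath are left to the normal
        # dynamic linker search when the module itself is imported.
    module = importlib.import_module(module_name)
    module._preloaded_handles = handles    # keep the handles alive
    return module

# Hypothetical usage:
# scilab = import_with_rpath("scilab",
#                            "/usr/lib/python2.6/dist-packages/scilab.so")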
Re: Is it worth back porting PEP 3147 to Python < 3.2?
Hi Barry,

Nice to see someone from the core Python team taking part in
distribution development.

On Thu, Apr 22, 2010 at 01:52:11PM -0400, Barry Warsaw wrote:
> How much of the transition testing is automated?  It would be very
> interesting, for example, to have a test framework that could run any
> combination of Python packages against various versions of Python, and
> get a report on the success or failure of it.  This may not be a
> project for the distros of course - I think upstream Python would be
> very interested in something like this.  For example, a tool that
> grabbed packages from the Cheeseshop and tested them against different
> versions would be cool.  If snakebite.org ever gets off the ground,
> that might be the best place to put something like this together
> (though we'd care less about OSes that aren't Debian and Ubuntu).

Unfortunately, Logilab does not have much manpower to offer to set this
up at the moment, but would something like http://apycot.hg-scm.org/
fit your description of a test framework?

We also have it running at logilab.org and cubicweb.org, of course:
http://www.logilab.org/view?rql=testconfig&vid=summary
http://www.cubicweb.org/view?rql=testconfig&vid=summary

As you can see from these second and third links, the tests include
lintian and piuparts checks.

Is it something like this that you had in mind?

-- 
Nicolas Chauvat

logilab.fr - scientific computing and knowledge management services
Re: Is it worth back porting PEP 3147 to Python < 3.2?
On Apr 26, 2010, at 06:35 PM, Nicolas Chauvat wrote:

>Nice to see someone from the core Python team taking part in
>distribution development.

Well, it's official now - I've joined the Ubuntu platform team at
Canonical, so I'm very keen on helping to solve problems with Python on
Debian and Ubuntu.

>On Thu, Apr 22, 2010 at 01:52:11PM -0400, Barry Warsaw wrote:
>> How much of the transition testing is automated?  It would be very
>> interesting, for example, to have a test framework that could run any
>> combination of Python packages against various versions of Python, and
>> get a report on the success or failure of it.  This may not be a
>> project for the distros of course - I think upstream Python would be
>> very interested in something like this.  For example, a tool that
>> grabbed packages from the Cheeseshop and tested them against different
>> versions would be cool.  If snakebite.org ever gets off the ground,
>> that might be the best place to put something like this together
>> (though we'd care less about OSes that aren't Debian and Ubuntu).
>
>Unfortunately, Logilab does not have much manpower to offer to set this
>up at the moment, but would something like http://apycot.hg-scm.org/
>fit your description of a test framework?

That's for continuous integration of Mercurial, right?

>We also have it running at logilab.org and cubicweb.org, of course:
>http://www.logilab.org/view?rql=testconfig&vid=summary
>http://www.cubicweb.org/view?rql=testconfig&vid=summary
>
>As you can see from these second and third links, the tests include
>lintian and piuparts checks.
>
>Is it something like this that you had in mind?

Yes.  What are you using to drive this?  I'm not really up on CI tools,
but Hudson has been getting a lot of buzz.

http://hudson-ci.org/

What I like about your display is that a failure in one area does not
necessarily mean a failure elsewhere.  That way you can better see the
overall health of the package.

What I have in mind is defining a set of best practices, embodied as
much as possible in tools and libraries, that provide carrots to Python
developers, so that if they adhere to these best practices, they can
get lots of benefits, such as nearly automatic and effortless packaging
in Debian and Ubuntu.  It's things like 'python setup.py test' just
working, and it has an impact on PyPI, documentation, release
management, etc.

These best practices can be opinionated and simple.  If they cover only
80% of Python packages, that's fine.  Developers would never be forced
to adhere to them, but it would be to their advantage to do so.

-Barry
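As one concrete shape that convention can take (an assumed example, not
a Debian requirement), a setuptools-based setup.py along these lines is
enough for 'python setup.py test' to just work; the project and test
suite names are made up:

# Minimal sketch of the "python setup.py test just works" convention with
# setuptools; the package and test-suite names are hypothetical.
from setuptools import setup, find_packages

setup(
    name="examplepkg",              # hypothetical project name
    version="0.1",
    packages=find_packages(),
    # lets `python setup.py test` discover and run the unittest suite
    test_suite="examplepkg.tests",
)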