On Thu, 2013-01-24 at 14:02:20 -0700, Adam Conrad wrote:
> So, regarding doko's specific issue in this bug, I have just landed
> a fix to ld.so that will work around his specific issue.  That said,
> this is still, IMO, a very real implementation bug in dpkg-shlibdeps
> and, by extension, debhelper.  Or the other way around.  Take your
> pick.  But the fix clearly needs to happen in both spots.
> 
> If you assume all builds are cross (which, of course, they aren't,
> but it's easier to wrap your head around the issue if you do), then
> you're creating a situation where we do:
> 
> LD_LIBRARY_PATH=/lib/host_arch build_arch-perl [...]
> 
> Because ld.so quite helpfully stuffs LD_LIBRARY_PATH first on the
> search path (as it's meant to do), any case where it doesn't skip
> broken/unwanted/unknown libraries will cause perl to fail to load,
> thus causing dpkg-shlibdeps to never actually run and do what you
> asked it to do.
> 
> The obviously correct way to do this is:
> 
> dh_shlibdeps -l/lib/foo
>  dpkg-shlibdeps -l/lib/foo
>   export LD_LIBRARY_PATH=/lib/foo
>    do shlibdeps stuff
> 
> So that you end up with your build_host version of perl (being
> invoked by both debhelper and dpkg-shlibdeps) being called with
> the default search paths, and then internal bits being called
> differently.
> 
> Even this could, of course, have consequences if dpkg-shlibdeps
> internally forks other build_host binaries (I haven't looked), so
> it may need slightly closer examination to only call specific bits
> with LD_LIBRARY_PATH, rather than a blanket export.
> 
> Does all this make sense?
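The interaction described above can be sketched in plain shell: a blanket
export of LD_LIBRARY_PATH is inherited by every child process (including the
perl interpreter itself), while a per-command assignment scopes it to the one
process that needs it. The /lib/host_arch and /lib/foo paths here are just
illustrative values, not taken from the bug:

```shell
#!/bin/sh
# Illustration only: /lib/host_arch and /lib/foo are made-up directories.
unset LD_LIBRARY_PATH

# Blanket export: every child process, including the perl that runs
# dpkg-shlibdeps, inherits the variable and searches that directory first.
export LD_LIBRARY_PATH=/lib/host_arch
sh -c 'echo "blanket: child sees ${LD_LIBRARY_PATH:-unset}"'
unset LD_LIBRARY_PATH

# Scoped assignment: only this single command sees the variable, so the
# interpreter itself still starts with the default search paths.
LD_LIBRARY_PATH=/lib/foo sh -c 'echo "scoped: child sees ${LD_LIBRARY_PATH:-unset}"'
echo "after: parent sees ${LD_LIBRARY_PATH:-unset}"
```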

Indeed, that's what I thought too when I first read Matthias' report:
using LD_LIBRARY_PATH is really not a good idea given its interactions
with the interpreter and any forked process. I'm fixing this now, to be
included in 1.17.x, to either use a different environment variable or,
better yet, add a new option, and probably to start issuing warnings
when the variable is set.
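As a minimal sketch of such a warning (in illustrative shell rather than
dpkg's actual Perl code; the function name and message wording are
assumptions, not the real implementation):

```shell
#!/bin/sh
# Hypothetical helper: warn when an inherited LD_LIBRARY_PATH would also
# affect the interpreter and every forked process, not just the intended
# library lookups.  Not dpkg's actual code.
warn_if_ld_library_path_set() {
    if [ -n "${LD_LIBRARY_PATH:-}" ]; then
        echo "warning: LD_LIBRARY_PATH is set to '$LD_LIBRARY_PATH';" \
             "it affects the interpreter and all forked processes" >&2
        return 1
    fi
    return 0
}

# Demonstration with a made-up value, in a subshell so the parent
# environment stays clean:
( LD_LIBRARY_PATH=/lib/host_arch; warn_if_ld_library_path_set ) \
    || echo "a warning was emitted"
```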

Thanks,
Guillem


