On 08/02/14 07:17, Gustav Fransson Nyvell wrote:
On 08/02/14 13:13, Gustav Fransson Nyvell wrote:
On 08/02/14 12:54, Marc Espie wrote:
On Sat, Aug 02, 2014 at 12:26:06PM +0200, Gustav Fransson Nyvell wrote:
Hi there,
I wanted to run something by you about package management. I wonder if this has been shouted at already. I remember from SunOS that packages are installed in a different manner than, let's say, Red Hat and of course OpenBSD: they are installed in the form /pkgs/PROGRAM/VERSION, for example /pkgs/gimp/1.0. GoboLinux does this too. I think this has some advantages over installing /usr/local/bin/gimp1.1 and /usr/local/bin/gimp2.0. What do you think? What has been said about it before?
Ready to be shouted at;
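(Purely to illustrate the layout I mean, here is a rough Python sketch that walks a hypothetical /pkgs tree and prints the versions present for each program. The /pkgs root and the paths under it are made up for the example.)

#!/usr/bin/env python3
# Illustrative sketch only: enumerate a hypothetical /pkgs/PROGRAM/VERSION
# tree, e.g. /pkgs/gimp/1.0 and /pkgs/gimp/2.0 sitting side by side.
import os

PKGS_ROOT = "/pkgs"   # hypothetical root directory, not an existing OpenBSD path

def installed_versions(root=PKGS_ROOT):
    versions = {}
    if not os.path.isdir(root):
        return versions
    for prog in sorted(os.listdir(root)):
        prog_dir = os.path.join(root, prog)
        if os.path.isdir(prog_dir):
            versions[prog] = sorted(os.listdir(prog_dir))
    return versions

if __name__ == "__main__":
    for prog, vers in installed_versions().items():
        print(prog + ": " + ", ".join(vers))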
This puts more strain on the file system, actually, which is probably the main reason we don't do it. Also, there is generally a lot of churning to do to make each package self-contained.

As far as policy goes, having stuff set up like that looks more flexible, but it is a fallacy. Instead of having the distribution solve issues concerning incompatible versions and updates, the toll falls on the individual sysadmin, who has to make sure the things they have work together. It can lead to security nightmares, because it's "so simple" to have the newer version alongside the old version that the sticking points of updating take much longer to resolve.

It's a bit like having mitigation measures that you can turn on and off... if it's possible to turn them off, there's not enough incentive to actually fix issues.

Likewise for packages. By making it somewhat LESS convenient to install several versions of the same piece of software, we make it more important to do timely updates.

Also, we don't have the manpower to properly manage lots of distinct versions of the same software, so this kind of setup would be detrimental to actually testing stuff.
I guess there could be both. But I think that if there's a security issue with one version of a piece of software, then there are quite possibly multiple ways of limiting the impact of that issue. Disallowing multiple versions just to force people to upgrade is not really a good reason, as I see it. Old software will always have more holes, because it's older and has been looked at longer, but it has qualities too, like speed. GIMP 1.0 is amazing on a Lenovo X41 from 2005, but probably has bugs. Of course none of these systems will stop someone who wants to run version x of a piece of software. Maybe something entirely different is needed? Okay, maybe I should complain about the status quo instead... the thing is, when packages install into /var, /usr, /etc and /opt, they're so spread out that it's hard to know what is what. This might be because I'm new, and scripts can find orphan files in these structures, but you need the scripts for that. Having everything in /pkgs/PKG/VER would not cause this splatter. Programs without dependees (i.e. non-libraries, non-utility programs) could fit into this structure without any extra filesystem magic. Well, the grass is always greener.
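(To make the orphan-file point concrete, here is a rough sketch of the kind of script I mean: it compares what is on disk under /usr/local with what the package database reports as owned by packages. It assumes the usual "name-version comment" output of a plain pkg_info and the per-package file listing of pkg_info -L, and it only keeps lines that look like absolute paths; it's a sketch, not something to run as-is.)

#!/usr/bin/env python3
# Rough sketch: report files under /usr/local that no installed package claims.
# Assumes OpenBSD's pkg_info(1); output parsing is deliberately simplistic.
import os
import subprocess

def packaged_files():
    owned = set()
    # Plain pkg_info prints one installed package per line: "name-version comment".
    pkgs = subprocess.run(["pkg_info"], capture_output=True, text=True)
    for line in pkgs.stdout.splitlines():
        if not line.strip():
            continue
        pkg = line.split()[0]
        # pkg_info -L lists the files belonging to that package, one per line,
        # plus some header lines; keep only absolute paths.
        listing = subprocess.run(["pkg_info", "-L", pkg],
                                 capture_output=True, text=True)
        for entry in listing.stdout.splitlines():
            if entry.startswith("/"):
                owned.add(entry.strip())
    return owned

def files_on_disk(root="/usr/local"):
    found = set()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            found.add(os.path.join(dirpath, name))
    return found

if __name__ == "__main__":
    for orphan in sorted(files_on_disk() - packaged_files()):
        print(orphan)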
BTW, you create multiple versions by your mere existence. There are lots of old versions lying around, but they can't be installed together right now.
No, I have multiple versions by experience and usage of a package.
If I want multiple versions of something, I'll take the older version of xxxx and modify it to call the older version of its libraries, etc. I have done this on one occasion. It wasn't fun, but it was doable, for the period I wanted both around.
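(Roughly the idea, as a made-up Python wrapper: keep the old binary and its old libraries in a private directory and point LD_LIBRARY_PATH at them before exec'ing the program. The paths here are invented purely for illustration.)

#!/usr/bin/env python3
# Hypothetical wrapper: run an old binary against an old, privately kept copy
# of its shared libraries, without touching the system-wide ones.
import os
import sys

OLD_LIBS = "/usr/local/oldlibs/gimp-1.0"   # hypothetical stash of the old .so files
OLD_BIN = "/usr/local/oldbin/gimp"         # hypothetical old binary

env = dict(os.environ)
env["LD_LIBRARY_PATH"] = OLD_LIBS + ":" + env.get("LD_LIBRARY_PATH", "")
os.execve(OLD_BIN, [OLD_BIN] + sys.argv[1:], env)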
OpenBSD's philosophy of packages bound together with a specific version of the OS is entirely reasonable. You don't want to have multiple versions of the same thing running, or at least you shouldn't. If you do, virtualizing might be a saner way to go.

This is all open source, and you have the freedom to change or mangle things as you wish..
--STeve Andre'