On 7/22/2020 9:59 PM, Michael De Roover wrote:
On 7/23/20 6:28 AM, Ted Mittelstaedt wrote:
>> Linux is 10 times worse because they aren't even including the C
>> compiler or development tools anymore.
> Every distribution I've laid my hands on so far ships GCC packages,
> and most development packages are affixed with either -dev or -devel.
>> But many "sysadmins" out there think they are Unix admins yet are
>> afraid to compile programs. They will go to the FreeBSD port or the
>> precompiled Linux apt-get stuff. The reason is that more and more
>> non-technical people are getting their hands on this stuff.
> I don't disagree with this, but I also think there's more to it than
> that. Personally, I avoid compiling from source when I can get away
> with it, not because I can't run make, but simply because binary
> packages are convenient. Having a package manager take care of updates
> across the whole system is convenient. Having distribution maintainers
> who say "okay, we are going stable (or bleeding edge, or whatever) with
> the whole project" is useful, because they can spend the time looking
> at the upstream projects and choosing the software versions that best
> suit that goal. And when there are billions of machines running very
> similar architectures, there is an argument to be made that making
> every single one of them compile everything from source is rather
> pointless. Why should every machine in existence be tasked with
> CPU-intensive compilation workloads when a handful of dedicated build
> servers can do exactly that, and a million times better?
Well, for starters, there is no way for ME to validate that the compiled
software you built for me isn't busy running your Doom network server
behind my back. (Do people even still run Doom servers?)
You are making a desktop argument. That is, the argument goes: Those Who
Know Better Will Do It For You.
Also, over the years I have had at least five Open Source programs that
I found Really Useful whose authors decided they wanted to "go
commercial", or who had other religious conversions that made them
decide to go on a rampage and issue takedown notices everywhere they
could find their source. One of those, for example, was when
Nasty-Company-Who-Shall-Not-Be-Graced-With-A-Mention decided to start
charging for software that created .gif files, and the graphics
community went on a ballistic rampage and destroyed every scrap of .gif
code it could find so as to force users to migrate to .png. I did not
wish to migrate to .png, so I was very glad that I had saved all the old
code, safe from the fires of the religious zealots.
Lastly, the way I look at it is that when I field a new server, if it
cannot recompile its OS, kernel, make world, and all of its applications
from source, then it's a piece of excrement that I do not want in
service.
It is also a fact that I have had pre-production servers blow up on
"make world". In a few cases this was bad RAM; in one case the server
was returned to the manufacturer under warranty. These are machines
that did not display any issues before the OS load. Do not ask me why
it was possible to install all the binaries for the OS and have it boot
with no problems, yet then blow chunks/blue screen/abend/take a dive
into the toilet/whatever your preferred term for crashing and burning
is.
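For anyone who hasn't done the "make world" exercise above, the rough
shape of the FreeBSD rebuild-from-source cycle is sketched below. This
is an illustrative outline, not a substitute for the Handbook's steps
for your particular release, and KERNCONF=GENERIC is a stand-in for
whatever kernel configuration you actually run:

```shell
# Sketch of a FreeBSD source rebuild, assuming sources in /usr/src.
# Consult the FreeBSD Handbook for the authoritative procedure.
cd /usr/src

# Build the entire userland, then the kernel; -j spreads the work
# across cores. A flaky machine (bad RAM, marginal CPU) will often
# fall over here with signal 11s even though it installed and booted
# a binary release without complaint.
make -j4 buildworld
make -j4 buildkernel KERNCONF=GENERIC   # KERNCONF: your kernel config

# Install and boot the new kernel before installing the new world,
# so a bad kernel can still be backed out.
make installkernel
shutdown -r now

# After rebooting (ideally into single-user mode):
etcupdate -p        # merge /etc changes needed before installworld
make installworld
etcupdate -B        # merge the remaining configuration changes
shutdown -r now
```

Hours of sustained compiler load make this double as a reasonable
burn-in test, which is exactly the failure mode described above.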
I don't generally run FreeBSD or Linux as a desktop OS, BTW, so that
does affect my view of things.
So yes, there is definitely an argument in favor of compiling the
stuff, at least on a server.
Ted
_______________________________________________
Please visit https://lists.isc.org/mailman/listinfo/bind-users to unsubscribe
from this list
ISC funds the development of this software with paid support subscriptions.
Contact us at https://www.isc.org/contact/ for more information.
bind-users mailing list
bind-users@lists.isc.org
https://lists.isc.org/mailman/listinfo/bind-users