2010/1/26 pancake <panc...@youterm.com>:
> Anselm R Garbe wrote:
>>
>> What about the good old way of providing one master makefile for each
>> platform instead of these scripts that are doomed to fail anyway
>> sooner or later?
>>
>
> It's not only about platform. For small projects I find a single makefile
> fine, but for big ones you need to separate the configuration and build
> steps, because you need to find out which libraries, include files,
> programs, etc. are present on the system. In some situations it is enough
> to create a .mk file with that information, but you will probably also
> need to export some of it into a .h file.
>
> Sometimes the configuration step is not as simple as "run this and tell
> me whether it works". That makes a shell script much more manageable than
> a makefile, because configuration is not make's job, and you end up
> forking from make, which is inefficient and ugly to maintain.
>
> As for having a different mk file per platform: I always find it annoying
> to maintain N files that do the same thing. It is also a mess for
> packaging, because there is no standard for it, so automating the build
> or the packaging becomes painful.
>
> You get the package and have to spend time identifying which makefile to
> use; then, if you get compilation errors, you have to guess which
> dependencies are missing, hunt through INSTALL or README files to see
> what's missing, or try to fix the program once you realize it is not a
> dependency problem. So this approach makes the build process simpler, but
> more error-prone and more annoying for packagers and users.
>
> The only good thing about autotools is that they are the standard, and
> that has greatly simplified development, packaging, compilation and
> deployment: make dist, make mrproper, automatic detection of file
> dependencies, dependency checks, and so on.
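A minimal sketch of the separate configure step described above: probe the system once, then record the results in config.mk (for make) and config.h (for C code). All file, macro, and library names here are illustrative, not taken from any particular project.

```shell
#!/bin/sh
# Sketch of a lightweight configure step: detect what the system
# provides and write the results for later consumption by make and cc.

CC=${CC:-cc}

check_header() {
    # Succeed if a trivial program including header $1 compiles.
    printf '#include <%s>\nint main(void){return 0;}\n' "$1" > conftest.c
    "$CC" -c conftest.c -o conftest.o 2>/dev/null
    status=$?
    rm -f conftest.c conftest.o
    return $status
}

# Start with empty output files, then append one line per probe result.
: > config.mk
: > config.h

if check_header zlib.h; then
    echo 'LIBS += -lz'         >> config.mk
    echo '#define HAVE_ZLIB 1' >> config.h
fi

echo "wrote config.mk and config.h"
```

The Makefile then just does `include config.mk`, and the C sources include config.h, so the probing logic never has to live inside make itself.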
>
> For suckless projects I see no sense in using such a monster, but for
> big projects it is often a must, because you end up with conditional
> dependencies, recursive checks to ensure the consistency of a program,
> and so on.
>
> If you have a build farm or any mass-compilation environment, you expect
> all the packages to build and react in the same way. But that is not
> true. There are some basics of software packaging that not everybody
> understands or knows.
>
> Things like sandboxed installation (make DESTDIR=/foo), out-of-tree
> builds (VPATH), detection of compiler optimization flags, make dist,
> etc. are things that many makefile-only projects fail to do. I'm not
> saying that all packages must have them, but they standardize the build
> and install process and simplify development and maintenance.
>
> I wrote 'acr' because I was looking for something ./configure-compatible
> but lightweight and simpler to maintain than the m4 approach. It works
> quite well for me, but I try to use it only for projects that really
> need to split the build into two steps. For building on Plan 9 I just
> distribute a separate mkfile which doesn't depend on the configure
> stage.
>
> But Plan 9 is IMHO too different a platform to try to support from the
> acr side, because the makefiles would have to be completely different
> too.
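The sandboxed-install convention mentioned above can be sketched as a DESTDIR-aware install target. This is a hedged example, not from any project in the thread; PREFIX, the program name, and paths are assumptions.

```make
PREFIX = /usr/local

install: prog
	mkdir -p $(DESTDIR)$(PREFIX)/bin
	cp -f prog $(DESTDIR)$(PREFIX)/bin/prog

# A packager can stage the install into a sandbox without touching
# the live system, then archive the staged tree:
#   make DESTDIR=/tmp/pkgroot install
```

Because DESTDIR is empty by default, ordinary users get a normal install, while build farms and package builders get a relocatable staging area for free.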
Well, I've heard these reasons before and I don't buy them. Toolchains
like the BSD ones prove pretty well that the "everything is a Makefile"
approach is the most portable and sustainable one. Running a configure
script from 10 years ago will fail immediately. I know that your problem
vector is different, but I think reinventing square wheels like autoconf
is not helping us any further. And I really believe that sticking to mk
or make files even in large projects saves you a lot of headaches in the
long term (think years ahead, 10 years or so).

Cheers,
Anselm
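The makefile-only approach argued for here can be sketched as one portable Makefile plus a tiny per-platform config.mk, in the style suckless projects use. File names, sources, and flags below are illustrative assumptions.

```make
# Shared, portable build rules; platform specifics live in config.mk
# (ship e.g. config.mk.linux, config.mk.bsd and copy one into place).
include config.mk

SRC = main.c util.c
OBJ = $(SRC:.c=.o)

prog: $(OBJ)
	$(CC) -o $@ $(OBJ) $(LDFLAGS)

.c.o:
	$(CC) $(CFLAGS) -c $<

clean:
	rm -f prog $(OBJ)

# A config.mk for one platform might contain only:
#   CC = cc
#   CFLAGS = -O2 -Wall
#   LDFLAGS =
```

Because the per-platform file is a handful of variable assignments rather than generated logic, it is still readable and fixable a decade later, which is the point being made above.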