> I've never understood why someone would use $(shell ...) in a recipe...
> I mean, the recipe will be run in the shell!!
I remember we once had a library where the command line to the archiver was too long (about a quarter megabyte, IIRC). We worked round this by having a temporary scratch dir, hard-linking every .o file into it, then running the archiver in that directory, so as to trim all the paths off the .o files and get the command line short enough.

We *would* have populated the scratch dir by generating rules with define and eval and having those do the job, but some project managers weren't happy to let build machines take any software changes, not even the make upgrade needed to get define/eval working properly, so we hacked it by having the archiver command start by populating the scratch dir. Of course, it couldn't do that by running a single command (the command line would have been too long ...), so we did it with a $(shell ...) per object file via $(foreach ...). It was kinda ugly, but it worked ;^>

It looked something like this (breaking lines a bit more, for mail's sake):

$(GENDIR)/libhuge.a: libhuge-objtmpdir $(object)
	$(foreach O,$(?:libhuge-objtmpdir=),$(shell \
	    ln -f $O $(@D)/objtmp/)) \
	cd $(@D)/objtmp; \
	$(AR) $(ARFLAGS) ../$(@F) $(notdir \
	    $(?:libhuge-objtmpdir=)) \
	|| failed=yes; \
	cd ..; rm -fr objtmp; [ -z "$$failed" ]

libhuge-objtmpdir: $(GENDIR)/objtmp/.exists

(Separate infrastructure for auto-creating directories looked after the .exists target, creating its directory.)

I think we could, with hindsight, have used the foreach to generate one command per object file, all separated with semicolons, so that no single command was too big for the shell. But we didn't know whether the shell's problem was with the command line as a whole (with all its semicolons in it) being too long, or only with an individual command (between semicolons) being too long, and we didn't bother to experiment once one of my colleagues had offered a working solution. (It replaced various hideous things I'd come up with, splitting $(object) using $(wordlist ...), IIRC.)
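With hindsight's benefit, that semicolon-per-object variant might have looked roughly like this — a sketch only, untested, reusing the names from the rule above ($(GENDIR), $(object), libhuge-objtmpdir), and assuming the shell's limit really was per-command rather than on the whole line:

```make
# Sketch, not shipped code: $(foreach ...) expands at read time into one
# "ln -f ... ;" per object file, so the recipe is one long shell line but
# no individual command within it is oversized.
$(GENDIR)/libhuge.a: libhuge-objtmpdir $(object)
	$(foreach O,$(?:libhuge-objtmpdir=),ln -f $O $(@D)/objtmp/;) \
	cd $(@D)/objtmp; \
	$(AR) $(ARFLAGS) ../$(@F) $(notdir $(?:libhuge-objtmpdir=)) \
	|| failed=yes; \
	cd ..; rm -fr objtmp; [ -z "$$failed" ]
```

The appeal over the $(shell ...) version is that the linking happens in the recipe's shell, where make expects side effects, rather than at makefile-expansion time.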
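For what it's worth, the usual escape hatch for over-long command lines nowadays is xargs, which packs a long argument list into several invocations, each below the kernel's ARG_MAX limit. A minimal sketch — echo stands in for the real archiver, and the .o names are made up:

```shell
#!/bin/sh
# ARG_MAX is the OS limit on the combined size of a command's argument
# list and environment; exceeding it gets execve() rejected with E2BIG
# ("Argument list too long").
getconf ARG_MAX

# xargs reads names from stdin and runs the command repeatedly, packing
# as many arguments per invocation as fit. -n 2 forces tiny batches here
# purely so the splitting is visible.
printf '%s\n' a.o b.o c.o d.o | xargs -n 2 echo ar rcs libhuge.a
```

This works for archiving because ar adds or replaces members in place, so splitting the member list across several ar runs on the same archive still yields the right result.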
Among those hideous attempts, I had tried using make's infrastructure for libblah.a(each.o), but the time cost grew quadratically with the number of object files - which was prohibitive.

Using $(shell ...) meant that the hard-linking didn't contribute any bytes to the command line, which we liked. Aside from that, I mostly eliminated my colleagues' uses of $(shell ...) from our makefiles - but they kept finding reasons to add more :-(

> However, it seems to be a popular thing to do based on questions to the
> mailing lists and StackOverflow, etc. My suspicion is that it's done by
> people who are not fully grasping how make works.

I think, in most cases, yes. Or, to look at it another way: make is an incidental tool they use, not something they see as a core part of their job, so they haven't invested much time and effort in learning how to make the most of it. They know enough to get by, and have a few hammers with which to interpret everything as a limited number of varieties of nail.

One of those hammers is $(shell ...), which creates all sorts of problems - e.g. when their $(shell ...) commands change facts on disk, invalidating make's cache of information about the file system without make knowing to update it. Naturally, they then blame make for what goes wrong when their hammering breaks everything.

Some of them then go on to write replacements for make that solve the particular problem they hadn't read enough of the make manual to solve well in make, and trumpet their replacement as a vast improvement on make ... there are now dozens of these available. All but one of those I've looked into are manifestly less powerful than make (and I'm still waiting for my former colleagues to get the exception released to open source).

	Eddy.
_______________________________________________
Bug-make mailing list
Bug-make@gnu.org
https://lists.gnu.org/mailman/listinfo/bug-make