Eli:
>> cc fred.c -c -o fred.o
>> cc bob.c -c -o bob.o
>> error on line 20 -XXXXX
>> error on line 30 -dddd
>> error on line 330 -dddd
>> makefile:342: recipe for target 'fred.o' failed
>> makefile:350: recipe for target 'bob.o' failed
> You need to look in both anyway.

That is true of the very specific example Tim had given, in response to a particular objection which his example did genuinely address. Think for a moment longer and you shall see that there are other cases where your answer won't apply. The interleaved output from make -j n is often much harder to read. (When the output isn't even line-interleaved, but has some command's output starting part way through a line of the output of another command, life gets even harder.)

Even in the specific example: in which file should I be looking at line 20, at line 30, or at line 330? The answer "look in every failing file for every failing line number" scales quadratically, which is not good. Even when I know which line a message relates to, it can be hard to work out how that message relates to that line (especially if there's a preprocessor involved and the "undefined symbol xxyyz" relates to a line in which xxyyz never appears, but only arises after expansion of a mess of macros). It gets nasty fast to have to look in several files, at line 30 of each, and work out which of them is the one -dddd relates to; and it gets easy to end up mistaking which one it relates to and "fixing" the wrong one.

In a make with -j 12ish compiling many files, the starts and ends of compiles interleave with one another and with the output of running commands. Some of that output is errors, which (if the commands being run are well-behaved - not always true) shall mean the affected files are named by make in the output when they fail. Some output is warnings that may actually be important, but the command succeeds: there's no marker indicating the end of a file that succeeds, so there are many files that warning might have come from; most likely one of those that started recently, but it's not always easy to tell which. When hundreds of files are compiling - why else use -j 12ish? - it can get hard to keep track of which are currently active at any given line of output, even when every file has both start and end markers for its output, e.g. if some files take much longer than others. The output from make -j several is hard to make sense of.

Your shiny new feature really does sound (I haven't had occasion to try it) like a vast improvement. That *is* a strong case for turning it on by default. Naturally, before doing so, it's important to do ample testing: and one way to do that is to initially release make with it off by default but available, so that lots of folk can try it and give you feed-back. All the same, once it's been well tested, if no serious problems arise (and aren't fixed by your collective ingenuity), this is a good thing to turn on by default.

	Philip

>>> "Doctor, my hammer has a head so large that I always hit my thumb"
>>> "Throw out that hammer and get a non-broken one"
>>>
>>> The GNU tool working around the brokenness of some non-free software?
>>> Some would call that collaboration in retaining your chains.

This problem doesn't arise exclusively with non-free software, so this is an emotive argument, an appeal to bad example. Programmers, generally, are slap-dash about the things they throw together to build the software they're so proud of having (in their own opinions) written so nicely and well. They throw together shell scripts to wrap the compiler with some other commands that they also want to run on the same files, or that they need run before handing off their output to the compiler (e.g. they invented their own way of doing localisation, because it's so much more fun to reinvent the wheel than to learn to use gettext): and they're profoundly negligent of error-reporting "because it's always worked for me", so anyone else inheriting their code is stuck with the results.
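A minimal sketch of the kind of error-reporting negligence just described (the wrapper is invented for illustration, with "false" standing in for a failing compile step): a wrapper's exit status is that of its last command, so a successful cleanup step masks the failed compile and make carries on as if all were well.

```shell
# A careless compiler wrapper (invented example): the function's exit
# status is that of its LAST command, so the cleanup's success masks
# the failing compile step, and make never learns anything went wrong.
bad_wrap() {
    false                        # stands in for a failing compile
    rm -f /tmp/does-not-exist.i  # cleanup succeeds, masking the failure
}
bad_wrap
bad_status=$?
echo "careless wrapper exit status: $bad_status"   # prints 0

# The careful version runs in a subshell with "set -e", so the first
# failing command aborts the wrapper and its status reaches make.
good_wrap() {
    ( set -e
      false
      rm -f /tmp/does-not-exist.i )
}
good_wrap
good_status=$?
echo "careful wrapper exit status: $good_status"   # prints 1
```

(One caveat on the careful version: "set -e" is silently disabled when the wrapper is itself called in a condition, e.g. on the left of "||", which is yet another way such scripts end up lying to make.)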
They write contorted shell-scripts inline in the make-file as the command for a rule, using the "[ condition ] && action" short-hand for "if [ condition ]; then action; fi" and tacking on "|| true" to avoid having make report the command as failed when [ condition ] was false; so the rule never gets reported as failing, even when it does, but "that's OK, because the error messages tell you what failed".

The hideous monstrosities of make files that get generated by "helper" scripts that "work round the deficiencies of make" (almost invariably, in fact, they work round the author's poor knowledge of how to use make well - but those who inherit the code are stuck with the results) end up doing perverse things that most on this list would never dream of.

I'd love to answer that with "so replace the bad scripts and hideously wrapped (and warped) build system with something well written", but the poor bug-fixer who's inherited the code from the oh-so-clever author is stuck with the need to get things fixed today, which doesn't leave any time for re-writing the tangled mess that's the only known way to build the product (and that "works", albeit only on that one machine in the corner, that we'd have thrown away years ago but for it being the one we have to run delivery builds on). So fix today's bug first, file a bug report about the work needed to convert the build system to something that isn't so hideously broken, and pray that management doesn't respond to the latter by hiring a consultant who'll introduce a proprietary IDE, bill for six months' work converting the mess to that, and leave you with a worse mess than you started with - one that's now one huge step further from being releasable as free software (thanks to that proprietary part of the build system), even if you do manage to talk those with pointy hair into considering the option.
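The "|| true" trap is easy to reproduce in plain shell: "|| true" doesn't only cover the condition being false, it also swallows a genuine failure of the action. A sketch (function names invented, with "false" as the failing action):

```shell
# "[ condition ] && action || true": if the action itself fails,
# "|| true" still forces overall success, so make never sees the
# rule fail.
broken_rule() {
    [ -n "$1" ] && false || true   # "false" stands in for a failing action
}
broken_rule yes
broken_status=$?
echo "broken rule exit status: $broken_status"   # prints 0: make reports success

# The long form only tolerates the condition being false; a failing
# action still propagates its non-zero status to make.
fixed_rule() {
    if [ -n "$1" ]; then false; fi
}
fixed_rule yes
fixed_status=$?
echo "fixed rule exit status: $fixed_status"     # prints 1: make reports failure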
When your short-handed project uses a component from an upstream project, there are strong reasons for not modifying the upstream any more than you absolutely must - you don't want to maintain a big patch-set relative to them, to apply to each future release, and you can't rely on them to always accept the patches you send them. So, even if you could vastly improve their hideous build infrastructure, you don't want to, unless you're really really sure they won't reject your better build system because "it wasn't invented here", or because some crucial ego in the upstream team likes it the way it is.

I caricature the horrors I have seen in a long career moving from job to job (and yes, proprietary does all of this *much* worse than free, but free still has its horrors) - yet I do not actually exaggerate as much as you might think. If the inheritor of a mess knows about the shiny new feature in make that'll reduce the pain, so much the better. Otherwise, at present, the maintainer who wants to solve the slowness by using -j some is all too often blocked by those who find the output unreadable; fixing that by default would remove one more hurdle to doing the job better with make (and thereby remove one more excuse for those wedded to bad software to use for converting the "hideously slow and clumsy make system" to some fancy horror generated by, and locked into, a shiny IDE).

Even in the free world - where big projects end up with nice build infrastructure, because they have the volunteers to sort it out - little projects get stuck with what their originator cobbled together. All of which is a case for making life better for those who, in each development team, are pulling in the direction of using make well instead of switching to something shinier but less flexible and powerful. Parallel-sync-output is one way to make their lives better and make it easier for them to win over their peers.
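Assuming the feature under discussion is what became GNU make's "--output-sync" ("-O") option, the difference is easy to show with a toy Makefile. A sketch - the target names, messages and /tmp path are invented, and it needs a make new enough to understand the option:

```shell
# Build a tiny Makefile whose two recipes each emit several lines
# slowly, so that under plain `make -j2` their output interleaves.
# (printf is used so the recipe lines reliably start with a tab.)
{
  printf '.PHONY: all fred bob\n'
  printf 'all: fred bob\n'
  printf 'fred:\n\t@for i in 1 2 3; do echo "fred: line $$i"; sleep 0.1; done\n'
  printf 'bob:\n\t@for i in 1 2 3; do echo "bob: line $$i"; sleep 0.1; done\n'
} > /tmp/sync-demo.mk

# Interleaved output - fred's and bob's lines mixed unpredictably:
make -f /tmp/sync-demo.mk -j2

# With output synchronisation, each recipe's output is collected and
# printed as one contiguous block when that recipe finishes:
make -f /tmp/sync-demo.mk -j2 --output-sync=target
```

With "--output-sync=target", each target's three lines always appear together, so an error can be matched to its file at a glance.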
Once we know how well it works in practice - and, in particular, have addressed any performance issues it incurs - I strongly encourage making it the default!

	Eddy.

_______________________________________________
Bug-make mailing list
Bug-make@gnu.org
https://lists.gnu.org/mailman/listinfo/bug-make