On Fri, 2009-10-09 at 16:56 +0300, Andrei Kopats wrote:
> Hello.
>
> Today I had a problem with GNU Make: its -j option controls the
> maximum number of jobs. The user can set a specific count, or make
> it unlimited. With a specific count the user has to choose a number,
> but the user doesn't always know the best one. With an unlimited
> count, the system can freeze. Today I built a project on a system
> with 2 GB of RAM, and my computer froze because the unlimited job
> count ate all the memory. As a result, the build took 4 times longer
> than a single-job build would have. It would be nice if make
> supported some intelligent mechanism to detect the optimal job count
> for the system, based on the number of CPU cores and the available
> memory.
There's no way for GNU make to know this. One target could just do a
"mkdir", while another target could link a huge program. Or, one target
could perform some operation with high network latency (and thus a lot
of downtime where the system could be doing other things) while another
might be doing something extremely CPU-intensive. Etc. Any heuristic is
simply going to be wrong, and I'm not interested in going that way.

Make already allows you to limit the number of jobs based on the system
load, with the -l option. A common idiom is "-j -l 2.00" or the like:
you allow an unlimited number of jobs, but no new job is started while
the system load exceeds 2.00 (in this case), until it falls below that
threshold again.
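For example (a sketch; the 2.00 threshold is arbitrary, and nproc is a
GNU coreutils tool, not part of make itself):

    # Unlimited parallel jobs, but don't start a new one while the
    # load average is above 2.00:
    make -j -l 2.00

    # Alternatively, cap parallelism at the number of CPU cores:
    make -j"$(nproc)"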