Hi, I used a script from Tup's test suite to generate projects with 1k, 10k and 100k C files, to compare the performance of Ninja vs. GNU Make on Linux.
There isn't much difference in full builds from a clean checkout. For no-op builds, Make still takes under 1 second with 10k source files. At 100k source files, though, Make's performance degrades sharply: 73s vs. Ninja's 1.5s.

Pretty much all of Make's time goes into re-reading the 100k ".d" dependency files corresponding to the 100k source files. Ninja handles dependency files specially: as soon as the compiler generates one, Ninja reads it and saves the information to a single binary file in the build root, which speeds up subsequent start-ups.

Another interesting difference I noticed is that Ninja tries hard to build things in the order they're listed in the build file (dependencies permitting). Make does try to build depth-first, but it appears to push a target to the end of its queue whenever it has to build that target's prerequisites first, so the leaf targets end up being built at the very end: all the linking steps wait until all the object files have been compiled, instead of each executable being linked as soon as its own dependencies are ready. This looks like an optimisation opportunity for Make to reduce bottlenecks during parallel builds.

More details here: http://david.rothlis.net/ninja-benchmark/

Cheers,
Dave.

_______________________________________________
Help-make mailing list
Help-make@gnu.org
https://lists.gnu.org/mailman/listinfo/help-make
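P.S. For anyone curious where Make's 73s goes: here's a minimal sketch of the standard auto-dependency Makefile pattern (the actual benchmark Makefiles are generated by the Tup test-suite script, so names here are illustrative). The `-include` at the bottom is the expensive part — Make parses every .d file as a makefile on every invocation, before it can do anything else:

```make
# Compile each .c file, asking gcc to emit a .d file
# (listing the headers the source includes) as a side effect.
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c
	$(CC) -MMD -MP -c -o $@ $<

# On every run, Make opens and parses all of these .d files --
# with 100k sources, that's 100k extra files read at start-up
# even when nothing needs rebuilding.
-include $(SRCS:.c=.d)
```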
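And the corresponding Ninja rule (again a sketch, with illustrative file names). With `deps = gcc`, Ninja parses the .d file as soon as the compile command finishes, records the header dependencies in its binary `.ninja_deps` log, and deletes the .d file — so a no-op build consults one binary file instead of 100k small text files:

```ninja
rule cc
  command = gcc -MMD -MF $out.d -c $in -o $out
  depfile = $out.d
  deps = gcc

build foo.o: cc foo.c
```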