Follow-up Comment #1, bug #58735 (project make):

I'm not sure I understand the issue being reported; an example would help.
Make works by first reading the initial makefile(s) and all (existing) included files. During this step it doesn't complain about missing makefiles. Then, after that is all completed, make tries to rebuild all of the makefiles (existing or not). If any fail to rebuild (and were not included with "-include" / "sinclude"), then make fails. If all were rebuilt, make starts over from scratch and re-reads everything.

It is true that currently (due to an implementation detail) they are built in reverse order, as if you'd typed "make inc3.mk inc2.mk inc1.mk Makefile". If you have correct dependencies for your included makefiles this shouldn't matter (see the sketch at the end of this comment), but that could easily be changed so they are built in the same order they were encountered. Of course, if you enable parallelism you really have to have correct dependencies anyway, or you can't be sure what order things will actually run in. Is the reverse order the issue you're concerned with?

I'm not sure what you meant by the "wrong include concept" reported in 1998. If you mean that when make encounters a missing included file it should stop, try to build that makefile immediately, and then continue reading, well, that's one way it could work, but it's not how it works. There are some advantages to that approach, but there are also advantages to the way make works today. I haven't seen anything showing it would be worth the huge backward-compatibility break to change the way make works in this respect.
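For what it's worth, here is a minimal sketch of the kind of setup I mean; the file names (inc1.mk, inc2.mk, inc3.mk) and the variables are invented for illustration and aren't taken from your report:

# On a clean directory, make reads this makefile, sees that the included
# files don't exist yet, finds rules that can build them, builds them,
# then restarts and re-reads everything now that they exist.

include inc1.mk inc2.mk inc3.mk

all:
	@echo "$(VAR1) $(VAR2) $(VAR3)"

# Explicit prerequisites between the included makefiles: even though make
# currently remakes them in reverse order, inc3.mk can't be built before
# inc2.mk, and inc2.mk can't be built before inc1.mk.
# (Recipe lines must begin with a TAB.)
inc1.mk:
	echo 'VAR1 := one' > $@

inc2.mk: inc1.mk
	echo 'VAR2 := two' > $@

inc3.mk: inc2.mk
	echo 'VAR3 := three' > $@

Running "make" in an otherwise empty directory should then do roughly this:

  $ make
  echo 'VAR1 := one' > inc1.mk
  echo 'VAR2 := two' > inc2.mk
  echo 'VAR3 := three' > inc3.mk
  one two three

With the prerequisites in place the rebuild order make happens to choose doesn't matter; in a real setup where, say, generating inc2.mk needs values from inc1.mk, leaving those prerequisites out is what would bite you, with or without -j.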