https://gcc.gnu.org/bugzilla/show_bug.cgi?id=118223

Thutt <th...@harp-project.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
         Resolution|WONTFIX                     |---
             Status|RESOLVED                    |UNCONFIRMED

--- Comment #7 from Thutt <th...@harp-project.com> ---
> I don't see this as a big issue.  Note I originally misunderstood
> the issue you were talking about. You are saying if you delete a
> header file that was used originally and now you need to rebuild
> everything.

I think you've misunderstood the impact of this issue, and the
benefit of updating GCC to be friendlier to the developers who can,
and do, encounter the Make error that this ticket is intended to
eliminate.  Allow me to provide some emphasis:

o Few developers understand the build process.

  Expecting developers to understand every build tool's quirky
  behavior is a tall order: a vanishingly small percentage know
  anything about the build software itself, and even fewer know
  anything about how the build process is actually implemented using
  that software.

  When a developer encounters an error produced by Make, it is 100%
  believed to be caused by the build system: a build failure.  In
  the case described here, the error produced by Make:

   make: *** No rule to make target 'hello.h', needed by 'hello.o'.
   make: Target 'hello' not remade because of errors.

  is actually caused by GCC's simplistic auto-dependency generation
  model.  Given the paucity of information in Make's message, the
  only rational course of action is a full clean build -- even if
  every use of 'hello.h' has been fully purged from the source tree,
  the build cannot be (easily!) repaired.

  If GCC were updated just a tad, the build process would be
  self-healing and this class of error would be forever banished
  (see the sketch after this list).

  This would be a net win for developers:

   + A clean build would not be needed in these cases.

   + Make would not show indecipherable messages to people who know
     neither Make nor the build process.

o Large software projects are an amalgam of build tools.

  Every one of the large projects I currently work on takes at least
  an hour to perform a full build; 50% of them take SEVERAL hours.
  Performing a full build is a major productivity hit for the day.

  Different parts of the software are built with different tools:
  Make, Maven, Gradle, Go, CMake, SCons, Bazel, and more.  There is
  a top-level coordinating process that puts everything together,
  barely (see above how few people know how build systems work).

  Inducing completely unrelated software to be fully rebuilt because
  a header file for C code has been deleted is a major productivity
  killer.  GCC can be a better citizen and eliminate these
  unnecessary rebuilds.

o Not all headers are system-global.

  Some headers are private to the sub-system which uses them, while
  others are public and can be used by everyone.

  As above, deleting a non-public header should not force everyone
  else to rebuild all of their software -- but that is what the
  status quo requires, though it can be fixed.
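
For concreteness, here is a minimal sketch of the mechanism, using
the 'hello' names from the error above and assuming the fix looks
like the phony-target output GCC already produces under its opt-in
-MP option.  Today, a dependency file generated as a side effect of
compilation records each header as a plain prerequisite:

    # hello.d, as written by 'gcc -MMD -c hello.c'
    hello.o: hello.c hello.h

Once hello.h is deleted, that stale rule is exactly what makes Make
fail with "No rule to make target 'hello.h'".  A self-healing
dependency file also emits an empty rule per header, so a deleted
header merely forces hello.o to be rebuilt -- which regenerates
hello.d without the stale entry:

    # hello.d, as written by 'gcc -MMD -MP -c hello.c'
    hello.o: hello.c hello.h
    hello.h: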


> Large projects have handled this always by just make clean and
> rebuild. Incremental builds should only be used for small
> developmental changes. You should always do a full rebuild just in
> case there is some stale changes left behind in a large project.

That is just another telling of the "no true Scotsman" fallacy.

A build system does NOT distinguish between a "full" build and an
"incremental" build.  A build system simply determines which
artifacts are out-of-date with respect to their inputs, and rebuilds
those pieces.  If invoking the build system with no artifacts
present and with all artifacts present produces different results,
then the build system is non-deterministic and flawed.

It is possible to have a build process that is correct in Make
(NetApp did it, at great expense).  Generally, however, "good
enough" is where things stand.  With the proposed change, the bar
for "good enough" is automatically raised, and every use of
auto-dependency generation with GCC will benefit from fewer broken
incremental builds.
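
To make the "good enough" pattern concrete, here is a minimal GNU
Make sketch of auto-dependency generation as it is commonly wired up
today.  The file and target names are hypothetical, and it leans on
GCC's existing -MMD/-MP options plus GNU Make's built-in compile and
link rules:

    # Makefile: minimal auto-dependency sketch (hypothetical names).
    # One invocation of 'make' covers both the clean-tree case and
    # the incremental case; there is no separate "full build" mode.
    CFLAGS += -MMD -MP      # write hello.d as a compilation side
                            # effect, with a phony rule per header

    hello: hello.o          # linked by Make's built-in '%: %.o' rule

    # '-include' ignores a missing hello.d on the first, clean
    # build; on later builds it supplies the recorded header
    # dependencies.
    -include hello.d

Without -MP, this very Makefile shows the "No rule to make target"
failure whenever a recorded header disappears; with it, the build
heals itself on the next invocation.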

(As an aside, newer build systems, such as Bazel (not a fan), force
much more work upfront on the developer so that these incremental
build issues do NOT happen.)

> GCC for an example always requires you do a full bootstrap to make
> sure nothing gets miscompiled. I am not seeing why this is a huge
> problem in general.

If something is miscompiled in an incremental build, then the build
process is non-deterministic and flawed; it should be fixed.

I would prefer not to use a flawed build process as a defense of why
an improvement to auto-dependency generation should not be
undertaken.  Do you agree?

You've rejected this suggestion and closed this ticket because the
status quo is satisfactory to you.

It would be much better if you could discuss the pros & cons of the
suggestion rather than the feels.  Is there something about the
suggestion that does NOT work, or that fails to improve
auto-dependency generation?  To that end, please allow me the leeway
to reopen.
