automatic dependency tracking

2002-01-18 Thread Tom Lord


Is there anything known to be non-portable about this technique for
generating a list of all included files?


cc [-I options] -E file.c \
| sed -e '{
/^#[[:space:]]\{1,\}[[:digit:]]\{1,\}[[:space:]]\{1,\}"/!d
s/^[^"]*"//
s/"[^"]*$//
}' \
| sort -u
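
For illustration, the output can be folded into a make-style dependency
fragment along these lines (a rough sketch; the foo.c/foo.d names are
placeholders, and this is not what depcomp itself does):

# Emit one "foo.o: header" line per included file; make merges
# prerequisites given on separate lines for the same target.
cc -I. -E foo.c \
| sed -e '{
/^#[[:space:]]\{1,\}[[:digit:]]\{1,\}[[:space:]]\{1,\}"/!d
s/^[^"]*"//
s/"[^"]*$//
}' \
| sort -u \
| sed -e 's|^|foo.o: |' > foo.d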


I noticed that depcomp uses something similar for some compilers, but
not others.

-t




proposal to fork the build-tools projects

2002-10-13 Thread Tom Lord


* Maintaining the build tools (autoconf etc) is currently too hard.

The maintainers have to struggle to write portable shell code; they
have to constantly avoid the temptation to introduce new tool
dependencies in the wrong place; they can't even rely on GNU Make.

Those constraints really matter to a small number of packages (the
"bootstrap packages") but could be reasonably relaxed for other
packages.


* Fork the build utils projects.

Let's (more formally) identify a _small, finite_ set of GNU packages
that should be maintained in such a way that they can be built in many
environments (or for native GNU systems), even if no other GNU tools
are already installed.  Call this set the "bootstrap packages".

Let's then formally identify a _subset_ of the bootstrap packages,
which are those GNU tools that other (non-bootstrap) GNU packages are
allowed to depend on for the build process.  For example, GNU Make
would probably be in this subset, as would a GNU shell.  Call this set
the "build environment packages".

Let's then fork the build tools projects, permanently, creating a
bootstrap-utils package.  The bootstrap-utils package will _only_
support the bootstrap packages -- its maintainers can remove, and
refuse to add, features needed only by other packages.  I would expect
these forks to be very stable, tending mostly to be simplified over
time.

Then let's have a second permanent fork of the build tools projects
to provide build-tools to all non-bootstrap packages.   This fork of
the build-utils can assume that all of the "build environment
packages" have been previously installed.

Setting up two permanent forks of the build tools separates the
detailed, delicate work of preserving the critical bootstrapping paths
from the potentially radical, sweeping work that could benefit all the
non-bootstrapping packages.  It's two very different kinds of work
with incompatible goals -- so it should be two different projects.

-t





Re: proposal to fork the build-tools projects

2002-10-13 Thread Tom Lord



   > I guess this discussion is void - it is not about
   > "forking". It is about creating your own set of buildtools
   > and trying to reach a level of maturity that we see in the
   > autotools today.

Yes and no.

Yes -- I have the nascent "package-framework" which has some parts I
think could be useful for the non-bootstrapping fork of auto*.

But no - it's not just about that.  It's also about trying to
restructure the "project topology" of the community to reduce the
number of constraints on people working on the auto* tools so that,
whatever technology they choose, they can do fancier, cleaner stuff
more easily.

-t







Re: proposal to fork the build-tools projects

2002-10-13 Thread Tom Lord



I personally pretty much agree with the larger picture you propose --
but we can separate concerns.

Forking the build tools projects is orthogonal to any particular
direction to take the non-bootstrapping fork.

-t



   From: Bruce Korb <[EMAIL PROTECTED]>
   Date: Sun, 13 Oct 2002 09:57:23 -0700

   Tom Lord wrote:
   > 
   > * Maintaining the build tools (autoconf etc) is currently too hard.
   > 
   > The maintainers have to struggle to write portable shell code;
   > 
   > Those constraints really matter to a small number of packages (the
   > "bootstrap packages") but could be reasonably relaxed for other
   > packages.

   I'm not alone?

   > Let's then formally identify a _subset_ of the bootstrap packages,
   > which are those GNU tools that other (non-bootstrap) GNU packages are
   > allowed to depend on for the build process.  For example, GNU Make
   > would probably be in this subset, as would a GNU shell.  Call this set
   > the "build environment packages".

   To this should be added a "backfill" library.
   Programmers will know that ``#include <backfill.h>''
   will ensure that all the standard POSIX-isms will be defined
   and ``-lbackfill'' will ensure that many of the commonly
   omitted functions will be added as well.  Get rid of all
   that configury testing cruft.  It'll just "be there".








Re: proposal to fork the build-tools projects

2002-10-13 Thread Tom Lord



Ok, those would be your goals for the non-bootstrapping fork.

Aside from obvious goals (like, "cleaner, simpler, cooler")
my vague idea is:

I want to make it much easier to audit installations and to automate
package dependency handling.  I want to make it easier to construct
multiple user environments (different mixes of package versions) on a
single system (as for multiple users, development environments, or
test environments).  I'd like to use the build tools to solve the
problem described as "RPM hell" (s/RPM//).  Higher-level-still shared
library support seems to be needed.

And, of course, I'd like the auditing features to tie into distributed
revision control :-)

-t






Re: proposal to fork the build-tools projects

2002-10-13 Thread Tom Lord



   > Uncompressed sources of ash with function support are smaller than
   > that.


:-)

I've found that a few standard POSIX tools, plus GNU Make, are a nice
combination (cf. the `package-framework' via anonymous ftp from
regexps.com -- not that the framework is anywhere near done).

Recent work on GNU make also looks nifty.


-t






Re: proposal to fork the build-tools projects

2002-10-13 Thread Tom Lord



   > Tom seems resistant to a backfill library of headers and
   > library functions

Uh...actually, that's part of the intention of `libhackerlab' -- also 
at the regexps.com ftp.

-t





Re: proposal to fork the build-tools projects

2002-10-15 Thread Tom Lord



   >> Let's (more formally) identify a _small, finite_ set of GNU
   >> packages that should be maintained in such a way that they
   >> can be built in many environments (or for native GNU
   >> systems), even if no other GNU tools are already installed.
   >> Call this set the "bootstrap packages".

   > So you are proposing to trade in end user convenience
   > (package builds on any system "out of the box") for autotools
   > maintainer convenience (maintainers can assume a fixed
   > environment).  

No.

A de facto set of bootstrap packages already exists.  autoconf was
first built for those packages, and it was used to make them
extraordinarily portable (to all unixen, VMS, and several systems
you've all but forgotten about).

Those packages don't have many external dependencies or dependencies
among themselves.   To avoid adding any new dependencies, the auto*
tools are maintained with some pretty severe constraints that impede
adding features to them in reasonable ways, at a reasonable pace.

Outside of the bootstrap tools, many (most?) packages do have external
dependencies.   Adding a very small number of _new_ dependencies
(e.g. GNU Make) to the list is not harming "end user convenience".

So the trade-off is autotools maintainer _inconvenience_ for an
easier-to-extend auto* collection.  (Maintainers give up the
inconvenience of the bootstrap packages' least-common-denominator
constraints, and everyone gets auto* tools that are a little closer to
being simpler and more featureful.)


> In fact, I don't even believe the premise that the Autotools
> are particularly hard to maintain to the point that it
> hinders progress.

Perhaps not to simply _maintain_ -- after all, that's what the
bootstrap fork would have to do.

But to _extend_ with new features or simpler approaches: that's where,
for example, depending on GNU Make can make a world of difference.


-t






Re: proposal to fork the build-tools projects

2002-10-15 Thread Tom Lord


Perhaps.  What I specifically remember is conditional code for VMS in
some of the earlier packages.

The choice of shell features permitted in auto* scripts, as someone
else recently pointed out, is a clear example of the original
bootstrap considerations impacting more packages than there is any
good reason for.

-t



   From: "Thomas E. Dickey" <[EMAIL PROTECTED]>

   On Tue, 15 Oct 2002, Tom Lord wrote:

   > A de facto set of bootstrap packages already exists.  autoconf was
   > first built for those packages, and it was used to make them
   > extraordinarily portable (to all unixen, VMS, and several systems
   > you've all but forgotten about).

   I've never seen the port to VMS for autoconf (but have seen mention of
   attempts to do so).  Unless it's been done very recently, you're probably
   referring to a port that was apparently abandoned years ago.

   -- 
   T.E.Dickey <[EMAIL PROTECTED]>
   http://invisible-island.net
   ftp://invisible-island.net









Re: proposal to fork the build-tools projects

2002-10-15 Thread Tom Lord



Here's a quote from "another list" that illustrates a problem with the
auto* approach to release mgt:

   > I'm looking at trying to get autoconf to detect the right version of 
   > BDB (need to export some SVN_FS_GOT_DB_MAJOR variants), and getting 
   > the checks just right probably exceeds my amount of free time that I 
   > can dedicate to this.

The graph of interproject dependencies, including version-specific
dependencies, is very complex -- bordering on intractable.   It
impedes people from using and contributing to projects.  It raises the
cost of building distributions.

The de facto bootstrap tools don't have that problem -- they don't
depend on much at all.   Supporting their builds is an entirely
different problem from supporting the builds of everything else.
One fork for those bootstrap projects -- another for everything else.


-t





Re: proposal to fork the build-tools projects

2002-10-15 Thread Tom Lord



   > Also, the fork is not really the main issue.  The main issue is what
   > to fork _to_.  

I confess, it was an indirect attempt, mostly hoping to build on:

> nobody's really happy with the current status

which I pretty much agree with.

   > You like GNU Make, others like Perl, still others might
   > prefer Cook, Guile, etc.

For the record -- it isn't GNU Make in particular that I'm after.
It's features.  GNU Make seems to me to be expedient.

Where to fork _to_?  Hmm.  Oh hell, here we go: [enclosed]

-t



This agenda puts a lot of demand on configure/build tools:



 The Process is the Solution:
 Envisioning a Better Free Software Industry

Copyright (C) Thomas Lord, Fremont CA
  v1.4, 10-Oct-2002

   [EMAIL PROTECTED], [EMAIL PROTECTED],
  510-825-7915, 510-657-4988

You may freely redistribute verbatim copies of this
document, but changing it is not permitted.  Please
send comments and suggested revisions to the author.
If you would like to distribute a modified version, 
please contact the author to ask for permission.



 INTRODUCTION: THE "AHA!" EXPERIENCE

  My objective with this document is to produce an "Aha!" experience
  in you: a shared understanding of some software engineering issues
  in the free software world, and the business issues that relate to
  them.  In this document, I lay out a practical program for reforming
  the engineering practices in the free software world, explain
  informally why that program makes sense, and point to the business
  opportunities this activity can create.
  
  Rationalizing free software engineering is wise: it alleviates some
  serious RISKS that are accumulating in the free software world, and it
  can lead to products of far greater quality.  It has the chance to
  be lucrative: the engineering practices I advocate directly attack
  the now famous "IT ROI" problem in a focused way -- they suggest a
  new approach to serving customers effectively by moving engineering
  attention and effort closer to their individualized needs.

  Here's the form of this document: First, I will lay out six
  technical goals, as bullet points.  Second, I will explain those
  goals in more detail: elaborating on what they mean, and why they
  are good goals.  Each subsection in the second part contains a
  definition for the bullet point, and a rationale.  Third, I will
  give my recommendations for next steps.

  By the end, if you have the "Aha!" experience, you should have in
  mind the beginnings of a picture of a reformed Free Software/"Open
  Source"/unix industry.  You'll be able to start thinking about how to
  make it so.  You'll be able to begin to articulate why it is such a
  good idea.  If you're anything like me, you'll find it at least a
  little bit exciting, invigorating, and inspiring.




  

  SIX GOALS FOR OUR INDUSTRY



1) Build a public testing infrastructure.

2) Build a public release engineering infrastructure.

3) Make standardized, efficient, cooperative forking the default
   behavior for all free software providers.

4) Design large customer installations as locally deployed development
   sites. 

5) Organize businesses and business units that join moderately sized,
   regionally identified sets of individual consumer and small
   business customers into tractable markets for support, treating the
   support service providers in those markets as "large customer
   installations".

6) Simplify GNU/Linux distributions;  reduce the core code base size; 
   reduce the out of control package dependencies; focus on essential
   functionality, quality, and tractable extensibility.






ELABORATIONS ON THE SIX GOALS




1) Build a public testing infrastructure.

   A public testing infrastructure consists of:

1) Software standards for configure/build/test tools

2) Internet protocol standards for scheduling tests,
   delivering source code to test servers, and retrieving
   results.

3) Public servers, implementing those protocols.

4) Implementation of these standards for critical projects.


   RATIONALE

   Developers and vendors alike need to be able to identify
   configurations of their packages for regular automated testing 
   on a variety of platforms.   Developers especially: the cost of
   quality rises the farther downstream you try to implement it.

   Few if any independent developers can afford to build their own
   non-trivial test infrastructure; at the same time, an infrastructure
   large enough to serve many developers is quite affordable to large
   companies.

   This is a business opportunity: to build and maintain that
   infrastructure.  It makes sense for the big vendors to pay for this
   infrastructure, and for a [...]




Re: proposal to fork the build-tools projects

2002-10-21 Thread Tom Lord


   > It could be that we should tell people to use Bash to build
   > GNU packages if their native shells have trouble handling the
   > job.  That would be a smaller change and perhaps worth doing.


How is `bash' built?


>> You need to be able to compile the bootstrap packages in minimal
>> environments, in order to get a very basic GNU environment.

> I don't think we should do this at all.  The smallest version of the
> GNU system need not be "minimal", and making it so would be extra
> work, so we should not.

Well, then I think you agree with me and you should conclude that
forking as I've suggested is the right thing to do.

GNU ALREADY has this property that you dislike wasting effort on, and
maintaining it does result in wasted effort, it appears to me.  My
proposal to fork the build tools is a way to eliminate the wasted
effort without perturbing the useful structure it currently preserves.

The core *utils package and the compiler are already maintained with
relatively strict portability requirements (for example, that they can
be compiled with a variety of compilers, shells, and implementations
of make).   The maintainers have been doing this for years, and it's
very valuable.   That's _not_ wasted work -- it's how people are able
to migrate to the wider world of GNU software.

However, those portability requirements from the core packages turn
into requirements for the build tools (the auto* tools especially).
So, for example, (I gather from this list), there are parts of
autoconf that can't get away from m4 and that can't use shell
functions.  There are parts of automake that can't assume the features
of (extremely portable) GNU make.  _That_ is the source of the extra
work that can be eliminated.

It turns into extra work because of all the other apps people are
working on that put demands on the build tools.   Any effort put into 
making the build tools work better for those apps has to pay an
"effort tax" by designing a solution that doesn't disturb the
portability of *utils.

The way to _save_ work is to stabilize a fork of the auto* tools for
the *utils and compiler, and create an easier-to-change fork for
everything else.  The easier-to-change fork can, at least, assume it's
running with a good shell and the *utils.

-t





Re: proposal to fork the build-tools projects

2002-10-24 Thread Tom Lord


   Long-time automake readers already know I'm strongly against
   this sort of structuring.  This yields Makefiles which are
   fragile and undependable.  For instance, if you create a
   temporary file with a "source-like" name in the source tree,
   then the build fails.


Conversely, using the opposite approach, if you add a source file, and
fail to correctly update the makefile, the build fails.  Big whoop.
Just as "fragile and undependable" either way.  This is a purely
rhetorical line of analysis that admits no objective decision making.
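
For concreteness, the two styles being argued over amount to something
like this in GNU make syntax (purely illustrative, not taken from any
particular Makefile):

# Explicit list: add or remove a file and you must remember to update
# this line, or the build breaks.
libfoo_SOURCES := alpha.c beta.c gamma.c

# Globbing: anything matching the pattern is picked up automatically --
# including stray "source-like" temporary files.
libfoo_SOURCES := $(wildcard *.c)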




   There was a brief time where I vacillated, if you read the
   archives fully.

It's a fuzzy area.  There's apparently no a priori line of reasoning
that clearly decides the issue.  Agreed?

My experience with a globbing based system goes back about 15 years.
15 years ago, I learned this approach from a prolific community of
hackers who had used it happily for about 5 years previous.  It's just
more practical, in our experience, even if theoretically weird; sort
of like even temperament.

But it doesn't matter -- I don't intend to decide this one minor issue
in this particular thread.



   All implementations of make, including GNU make, are missing features
   that are helpful when scaling up to larger builds.  They are also
   missing features which help dependability and reproducibility of
   builds.  In some cases, like using timestamps instead of signatures,
   this change is impossible to implement in make -- switching to
   signatures would break every Makefile that uses a stamp file.

   For this and other reasons I think that make must go.

For the purposes of the fork-the-build-tools discussion, I'm not
disagreeing.   Perhaps make should go, perhaps not.  If it stays, I'd
sure like some cleanups.

Mostly what I want is to reach a state where, instead of writing
makefiles, I just declare a directory's "type" from the construction
perspective.  As in: these are modules for libhackerlab;  or, unless
otherwise specified, these are `main' modules of $x/bin programs.
With `make', the ideal is a series of `include' directives with,
perhaps, one or two variable declarations preceding.
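
Under such a scheme, a per-directory Makefile might be nothing more
than the following -- a hypothetical sketch; the variable and
include-file names here are illustrative, not taken from
package-framework:

# Declare what this directory is, then pull in the shared rules that
# know how to build, install, and clean a directory of that type.
this_directory_type := library
this_library        := libhackerlab
include $(top_srcdir)/build-rules/$(this_directory_type).mk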

Explicit file lists vs. globbing?  Eh...that's a comparatively minor
issue.

Getting rid of make -- or making a better make -- or making a trivial
make with a better front end -- any of these sound like reasonable
ways to proceed.   They are certainly design space directions to give
serious consideration to.   I hope we don't decide them by trying to
out-butch one another on mailing lists though  --  I hope we do a
proper design space search and look for consensus over the
_implications_ of these questions for the final design, rather than
over bullshit arguments on a mailing list.  You know, almost as if it
were an R&D project.

But just to pick up on what you seem to be implying when you say "make
must go":

Bootstrapping is an issue.  Even if we were to fork the build tools
and say, of one fork, that certain prereqs are assumable -- we still
should worry about bootstrapping.  It is Bad if you need a live
culture of particular software to bootstrap that same software.

One way to keep bootstrapping under control is to aim for tiny
languages that are less than fully general.  `make' is one example
(though, who knows, it may be formally Turing complete).  I'd be
reluctant to leap to using Perl, Python, Ruby, or Scheme, for example,
unless the replacement for Makefiles did _not_ include arbitrary code
in any of those languages.

And, quite seriously: I think pedagogy has to be a consideration.
Implementing build tools offers novices an opportunity for a serious
hardcore tour of the system calls, file system issues, and more.  It
would be nice to hit an abstraction for build tools that doesn't
require a novice to "first step: implement python" if attempting a
fresh implementation.   It's funny, but unsurprising, how the
bootstrapping issue and the pedagogy issue mirror one another.

The `GNU make' maintainer seems to be actively making progressive
changes to GNU make.  If you have some complaint about make, working
with that project seems like an avenue worth exploring.

Timestamps vs. signatures?  Well, sure, timestamps are an unreliable
optimization that programmers have to understand thoroughly in order
to be able to use well.   Signatures _might_ be an improvement, or
they might just drag out edit/compile/debug cycles too long.  But
regardless, there are bigger fish to fry (auditing, automated
uninstall, dependency mgt, ...).
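
As a purely hypothetical illustration of the signature idea, stripped
down to plain sh (the file names and the use of cksum are mine, not
taken from any existing tool):

# Rebuild foo.o only when foo.c's recorded checksum has changed,
# rather than comparing timestamps.
new=`cksum < foo.c`
old=`cat foo.c.sig 2>/dev/null`
if test "$new" != "$old"; then
    cc -c foo.c &&
    printf '%s\n' "$new" > foo.c.sig
fi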

-t





Re: proposal to fork the build-tools projects

2002-10-24 Thread Tom Lord


   > Why not provide both mechanisms, and let people use what they
   > want?

`package-framework' (ftp.regexps.com) actually does this, by the way.

I emphasize the globbing option because it's what I use most often,
and what I think is usually best.

-t








Re: proposal to fork the build-tools projects

2002-10-22 Thread Tom Lord



   > I just don't see it.  Unfortunately I can't afford to spend
   > time thinking about it.  There is too much else that I have
   > to do.

The GNU coding standards, and by extension the configure/build tools,
have a very large role in keeping GNU systems coherent and keeping it
practical for individuals to work with their source.  They also have a
sinister role (when they work poorly) of enabling vendor lock-in among
free software vendors.  If the effort to implement good standards
fails, the outcome is commercialized "free software" systems that
exhibit vendor lock-in and for which individual users are not, as a
practical matter, enabled to make any significant use of the source
themselves.  In other words, it is free software with few or none of
the benefits of sharing source.


   > If Autoconf developers start saying they think this will make
   > their work easier, that is something I will listen to.

But exactly what is "their work" in this case?

In the most negative light (i.e., painting the worst interpretation --
not making an accusation), the FSF looks like a pass-through, vetting
whatever the maintainers want to do (for whatever reason), asserting
the outcome as a GNU standard.  This is especially problematic when
the maintainers are working on infrastructure that's critical to their
employers; considering a proposal that is contradictory to their
employers' business plans and proprietary software; and, it is not
far-fetched to infer, gaining some standing with their employer by
virtue of their maintainership.  Is objective evaluation possible
under those circumstances?  Is it likely to occur?  How surprising
will it be if, when decision makers have those kinds of constraints,
they make decisions that impede free software progress but favor the
progress of companies making money from free software?  It doesn't
even require explicit or self-conscious malevolence for this kind of
decision making to raise conflicts of interest -- it's a structural
problem, arising out of an unbalanced weighting of some valid
perspectives to the exclusion of other valid perspectives.

-t





Re: proposal to fork the build-tools projects

2002-10-26 Thread Tom Lord

   Globbing can inadvertently lead to unwanted files being
   compiled/distributed/deleted/whatever.  If you accidentally
   delete a source file, make won't complain because it won't
   know.

I've played around a bit with an approach to globbing that solves (in
some sense) both problems you've cited.

Builds work by globbing.  Thus, while I'm engaged in heavy program
editing (e.g. directory restructuring, code factoring), I just use a
nice directory editor and change the source tree freely, not having to
worry about updating makefiles.

At the same time, I can periodically take an "inventory" of the tree
and compare that to a "canonical inventory" -- which tells me what
renames have taken place and what files have been added and deleted.

With that approach, I'm _still_ maintaining an explicit list of files,
but with a few advantages (a rough sketch follows the list):

1) It's a separate task from Makefile maintenance: it doesn't
   slow down the edit/compile/debug cycle.

2) There is automation to assist building and checking the 
   inventory list.
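
Roughly, the inventory step might look like this -- a hypothetical
sketch in plain sh; the file names and the find pattern are mine, not
package-framework's:

# Take a fresh inventory of the tree and compare it to the blessed one;
# the diff shows additions, deletions, and (as paired add/delete lines)
# renames.
find . -name '*.[ch]' -print | sort > Inventory.new
diff Inventory Inventory.new
# Once the changes look right, bless the new inventory:
mv Inventory.new Inventory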



-t