Re: missing fi for compilation to .obj
> "Eric" == Eric Blake <[EMAIL PROTECTED]> writes: Eric> I don't have the automake sources in front of me, but the file to Eric> patch gets installed as /usr/share/automake/am/depend2.am. Eric> 2002-11-14 Eric Blake <[EMAIL PROTECTED]> Eric> * am/depend2.am: Add missing fi in c.obj rule. Looks good. I'm checking it in on the trunk and the 1.7 branch. Tom
automake buglet
I'm using 1.7.6a.

My Makefile.am has:

    TEXINFO_TEX = ../gcc/doc/include/texinfo.tex

My configure.in has:

    AC_CONFIG_AUX_DIR(..)

I expected TEXINFO_TEX to override AC_CONFIG_AUX_DIR, but it doesn't:

    fleche. automake
    Makefile.am:61: required file `../texinfo.tex' not found

Tom
Re: question about automake build
> "Alexandre" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: >> Take a look at the appended `make' output. Why are we building in >> `tests' twice? Alexandre> There are two different tests/ directories on HEAD... Duh, I can't read. Sorry about that. Tom
question about automake build
I haven't looked at this yet.  Take a look at the appended `make'
output.  Why are we building in `tests' twice?

Tom

Making all in .
make[1]: Entering directory `/home/tromey/gnu/Auto/automake/build'
make[1]: Nothing to be done for `all-am'.
make[1]: Leaving directory `/home/tromey/gnu/Auto/automake/build'
Making all in m4
make[1]: Entering directory `/home/tromey/gnu/Auto/automake/build/m4'
make[1]: Nothing to be done for `all'.
make[1]: Leaving directory `/home/tromey/gnu/Auto/automake/build/m4'
Making all in lib
make[1]: Entering directory `/home/tromey/gnu/Auto/automake/build/lib'
Making all in Automake
make[2]: Entering directory `/home/tromey/gnu/Auto/automake/build/lib/Automake'
Making all in tests
make[3]: Entering directory `/home/tromey/gnu/Auto/automake/build/lib/Automake/tests'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory `/home/tromey/gnu/Auto/automake/build/lib/Automake/tests'
make[3]: Entering directory `/home/tromey/gnu/Auto/automake/build/lib/Automake'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/tromey/gnu/Auto/automake/build/lib/Automake'
make[2]: Leaving directory `/home/tromey/gnu/Auto/automake/build/lib/Automake'
Making all in am
make[2]: Entering directory `/home/tromey/gnu/Auto/automake/build/lib/am'
make[2]: Nothing to be done for `all'.
make[2]: Leaving directory `/home/tromey/gnu/Auto/automake/build/lib/am'
make[2]: Entering directory `/home/tromey/gnu/Auto/automake/build/lib'
make[2]: Nothing to be done for `all-am'.
make[2]: Leaving directory `/home/tromey/gnu/Auto/automake/build/lib'
make[1]: Leaving directory `/home/tromey/gnu/Auto/automake/build/lib'
Making all in tests
make[1]: Entering directory `/home/tromey/gnu/Auto/automake/build/tests'
make[1]: Nothing to be done for `all'.
make[1]: Leaving directory `/home/tromey/gnu/Auto/automake/build/tests'
precompiled header suggestion
Recently gcc added precompiled header support.  This is mostly useful
for C++, but C might benefit in some cases too.  To use it, you make a
special `.gch' file by compiling a bunch of .h files.  Then you tell
gcc to use it when compiling.

Automake could usefully automate this.

First, when building the .gch file we could do automatic dependency
tracking (the process of building this file should support the normal
-M flags).

Also, if a .gch file is listed in _SOURCES, it would be cool to build
this file before trying to build any of the objects associated with
the _SOURCES variable.  (This could be generalized to all .h files,
perhaps, to let us reduce the scope of BUILT_SOURCES.)  This could be
implemented by adding a new dependency for each .o file.

There would also have to be a way to disable .gch support for non-gcc
compilers.

Tom
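A rough hand-written sketch of the workflow being proposed, before any
automake support exists (file names and flags are illustrative only):

# Precompile a master header; gcc then prefers all.h.gch over all.h.
all.h.gch: all.h
	$(CC) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) \
	  -x c-header all.h -o all.h.gch

# Force the precompiled header to exist before the objects that use it.
foo.o bar.o: all.h.gch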
Re: precompiled header suggestion
> "Rob" == Robert Collins <[EMAIL PROTECTED]> writes: >> Recently gcc added precompiled header support. This is mostly useful >> for C++, but C might benefit in some cases too. Rob> Are you planning on doing this, or just sketching the design and hoping Rob> for volunteer contributions? I'm hoping someone else will do it :-) Rob> What might be a useful starting point is some manual test cases or Rob> sample rules, to aim for. No problem. libstdc++ is already using it. I've appended some snippets from their Makefile.am. We could probably already get most of this by abusing _PROGRAMS. That's ugly though. I've also appended the section of the gcc manual explaining precompiled headers. Tom pch_input = ${host_builddir}/stdc++.h pch_output_builddir = ${host_builddir}/stdc++.h.gch pch_source = ${glibcxx_srcdir}/include/stdc++.h PCHFLAGS=-Winvalid-pch -Wno-deprecated -x c++-header $(CXXFLAGS) if GLIBCXX_BUILD_PCH pch_build = ${pch_input} pch_install = install-pch else pch_build = pch_install = endif # Build a precompiled C++ include, stdc++.h.gch. ${pch_input}: ${allstamped} ${host_builddir}/c++config.h ${pch_source} touch ${pch_input}; \ if [ ! -d "${pch_output_builddir}" ]; then \ mkdir -p ${pch_output_builddir}; \ fi; \ $(CXX) $(PCHFLAGS) $(AM_CPPFLAGS) ${pch_source} -O0 -g -o ${pch_output_builddir}/O0g; \ $(CXX) $(PCHFLAGS) $(AM_CPPFLAGS) ${pch_source} -O2 -g -o ${pch_output_builddir}/O2g; @node Precompiled Headers @section Using Precompiled Headers @cindex precompiled headers @cindex speed of compilation Often large projects have many header files that are included in every source file. The time the compiler takes to process these header files over and over again can account for nearly all of the time required to build the project. To make builds faster, GCC allows users to `precompile' a header file; then, if builds can use the precompiled header file they will be much faster. To create a precompiled header file, simply compile it as you would any other file, if necessary using the @option{-x} option to make the driver treat it as a C or C++ header file. You will probably want to use a tool like @command{make} to keep the precompiled header up-to-date when the headers it contains change. A precompiled header file will be searched for when @code{#include} is seen in the compilation. As it searches for the included file (@pxref{Search Path,,Search Path,cpp.info,The C Preprocessor}) the compiler looks for a precompiled header in each directory just before it looks for the include file in that directory. The name searched for is the name specified in the @code{#include} with @samp{.gch} appended. If the precompiled header file can't be used, it is ignored. For instance, if you have @code{#include "all.h"}, and you have @file{all.h.gch} in the same directory as @file{all.h}, then the precompiled header file will be used if possible, and the original header will be used otherwise. Alternatively, you might decide to put the precompiled header file in a directory and use @option{-I} to ensure that directory is searched before (or instead of) the directory containing the original header. Then, if you want to check that the precompiled header file is always used, you can put a file of the same name as the original header in this directory containing an @code{#error} command. This also works with @option{-include}. 
So yet another way to use precompiled headers, good for projects not
designed with precompiled header files in mind, is to simply take most
of the header files used by a project, include them from another header
file, precompile that header file, and @option{-include} the precompiled
header.  If the header files have guards against multiple inclusion,
they will be skipped because they've already been included (in the
precompiled header).

If you need to precompile the same header file for different languages,
targets, or compiler options, you can instead make a @emph{directory}
named like @file{all.h.gch}, and put each precompiled header in the
directory.  (It doesn't matter what you call the files in the directory,
every precompiled header in the directory will be considered.)  The
first precompiled header encountered in the directory that is valid for
this compilation will be used; they're searched in no particular order.

There are many other possibilities, limited only by your imagination,
good sense, and the constraints of your build system.

A precompiled header file can be used only when these conditions apply:

@itemize
@item
Only one precompiled header can be used in a particular compilation.

@item
A precompiled header can't be used once the first C token is seen.  You
can have preprocessor directives before a precompiled header; you can
even include a precompiled header from inside another header, so long
as there are no C tokens before the @code{#include}.

@item
The precompiled header file must be produced for t
Re: How one could integrate Automake in an IDE ?
> "Alain" == Alain Magloire <[EMAIL PROTECTED]> writes: Alain> I'm curious on how the autoXXX tools like automake etc .. can Alain> be integrated nicely part of an IDE. So far what I've seen Alain> is not suitable enough ... Alain> If you know of a good integration, please send the URL. The only integration I'm aware of at all is with KDevelop. I've still never tried that :-( Alain> I'm looking at the Multipage Editor design, when one tab Alain> control(page) shows the raw source and the others shows a Alain> different "view" of the source that can be edited by newbies Alain> easily, of course with round-trip(i.e. the raw source Alain> Makefile.am reflects the other views vice-versa). Yeah, that sounds pretty reasonable, if difficult. It probably makes sense to concentrate on a simple-but-usable subset of automake, at least at first. Otherwise, the problem you'll encounter is that a Makefile.am is pretty complicated to interpret. For instance, there are runtime conditionals and configure substitutions. Another thing to think about is whether to support having a Makefile.am in each directory. It might be simpler just to have a single master Makefile.am at the top level. There are a lot of other things a useful auto*/eclipse integration could do. We could talk here or on the CDT list, as you (and others) prefer... Tom
Re: precompiled header suggestion
> "adl" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: adl> This sounds tricky. Adding such a file as a dependency of each .o file adl> means that _all_ of them will be updated whenever the .ghc changes. Good point. There are other possible approaches, though. For instance, for a given program, we could generate: am--program: $(program_BUILT_SOURCES) $(MAKE) program ... which is sort of like the BUILT_SOURCES implementation, but more targeted. That's sort of backward, since "make program" will no longer work as expected. But you get the idea... I suppose this is sort of secondary, and the main thing is just to have some automation for building the .gch file at all. adl> Putting the .ghc in BUILT_SOURCES automatically will not work if adl> the .ghc includes another BUILT_SOURCES indirectly (direct adl> inclusion is ok because we can issue the dependency ourself). adl> Maybe we can live with such a limitation? Yeah, there could be some problems here. But the user can always add an explicit dependency (just adding the .h to the gch file's _SOURCES would suffice). adl> Also I presume some libraries will also want to install such adl> files? Can they be installed? (Is this what install-pch is adl> about in your libstdc++ quote?) If so, such installation also adl> needs to be conditional. Honestly, I don't know too much about this. I'd suggest we leave open the possibility that they can be installed, though. adl> Maybe it would be simpler to introduce a new primary Yeah. Another idea would be to recognize `*.gch' automatically in a _SOURCES variable corresponding to a program or library. Tom
Re: [SUGGESTION] Having 'make check' use AM_CPPFLAGS
> "Stephen" == Stephen Torri <[EMAIL PROTECTED]> writes: Stephen> TESTS = test_Foo Stephen> test_Foo_SOURCES = test_Foo.cpp As you discovered, you have to list test_Foo in a _PROGRAMS variable. I suggest check_PROGRAMS, as this is what `check' is made for. An entry in TESTS doesn't suffice; these aren't assumed to necessarily be compiled/linked programs. I suppose it almost works due to implicit rules in your `make'. Tom
Re: [PATCH] ylwrap
> "Didier" == dc <[EMAIL PROTECTED]> writes: Didier> I've made a patch several months ago concerning ylwrap, and Didier> posted it on http://savannah.gnu.org/patch/?group=automake , Didier> but it seems that it wasn't included yet. Since there wasn't Didier> any response so far, I joined the list to ask about its Didier> validity. I think this is obviously correct. Alexandre, is ylwrap still maintained in the automake repository? Tom
Re: [PATCH] ylwrap
Tom> Alexandre, is ylwrap still maintained in the automake repository?

adl> Yes.  Do you think we should mention Automake in the headings of
adl> all similar auxiliary files?

Sure, but it doesn't matter much to me.  A note in HACKING would
suffice as well.

Tom
Re: .Po / .Plo Question
> "Asim" == Asim Suter <[EMAIL PROTECTED]> writes: Asim> 1) Which tool/script/program generates .Po/.Plo files ? And at what Asim> stage ? They are initially created, as empty files, by configure when building the various Makefiles. Then, they are updated as a side effect of compilation. Asim> 2) While rebuilding I see sometimes .Po files get "bad" in that Asim> they get malformed somehow for which gmake croaks and bails out. Asim> For instance: Asim>a) .deps/libcryptotool_a-sig_verify.Po:1: *** target pattern contains Asim> no `%'. Stop. Asim>b) Missing separator in file.. I haven't seen this before. What platform are you on? What version of make are you using? What compiler? Could you send the contents of that file? Tom
Re: defining xxx_PROGRAMS conditionally?
> "Harlan" == Harlan Stenn <[EMAIL PROTECTED]> writes: Harlan> And where is CVS automake these days? Is it still on Harlan> sourceware.cygnus.com? That machine was renamed to "sources.redhat.com" long ago. But yes, that is where it is hosted. Tom
Re: how to add dependencies to an auto-generated rule?
> "Jeff" == Jeff Rizzo <[EMAIL PROTECTED]> writes: Jeff> Ideally, I'd like to add a dependency on the file VERSION for the rule Jeff> for $(srcdir)/autoconf.h.in ... is there any way to do this? Doesn't it work to just write the dependency in Makefile.am? $(srcdir)/autoconf.h.in: VERSION Maybe my memory is slipping though. Perhaps automake gets confused about this and doesn't write out its own rule. Tom
Re: Adding a manpage to a autoconf/automake project (fwd)
> "Frank" == Frank Aune <[EMAIL PROTECTED]> writes: Frank> In my ROOT/Makefile.am I got so far: Frank> AUTOMAKE_OPTIONS = foreign 1.4 Frank> SUBDIRS = src Frank> I think I should then add in my ROOT/Makefile.am Frank> man_MANS = manpagename.8 Frank> where manpagename.8 resides in ROOT/man/ Perhaps I even Frank> have to add: Frank> SUBDIRS = src man Frank> to ROOT/Makefile.am? It depends on how you want to do things. You could write: man_MANS = man/manpagename.8 in your top-level Makefile.am. Or you could add a Makefile.am to the man/ directory, then update SUBDIRS and AC_CONFIG_OUTPUT in the top level. It's up to you. Tom
Re: Makefile dependency
> "Ralf" == Ralf Corsepius <[EMAIL PROTECTED]> writes: Ralf> => automake-1.7's AM_MAINTAINER_MODE deactivates regeneration of Ralf> Makefile's. Ralf> I am inclined to interpret this as a bug and/or regression from earlier Ralf> versions of automake. I agree. The rule for maintainer mode was that it would deactivate rules for which you needed special maintainer tools. Rebuilding Makefile doesn't fit this category -- all you need is config.status, which you've got. Tom
Re: AM_INIT_AUTOMAKE comes from an older version of automake
> ">" == Piyush Kumar Garg <[EMAIL PROTECTED]> writes: >> configure.in:12: old Automake version. You should recreate aclocal.m4 >> configure.in:12: with aclocal and run automake again. >> I am using RHL8.0. I also tried upgrading automake to 1.7.9 and >> autoconf to 2.57. It doesn't work. It will be helpful, if someone can >> provide pointers on this. Try running "aclocal". Tom
Re: Newbie Request for Help (make dist problem)
> "Scott" == J Scott Amort <[EMAIL PROTECTED]> writes: Scott> - include Scott> - src Scott>- subdir1 Scott>- subdir2 Scott> - extra Scott> - build Scott>- src Scott> The configure.ac, Makefile.am, etc. files are located in the Scott> src subdirectory of the build directory at the bottom (nothing Scott> platform specific is therefore in the top level directory). Automake assumes that your top-level directory will have a Makefile.am... Scott> However, the problem lies Scott> when trying to make dist. It copies all of the include, src and extra Scott> directories into the build directory and only makes a tar.gz of the Scott> files that were originally in the build...src directory (i.e. configure, Scott> configure.ac, etc.). None of the source files make it into the Scott> archive. Why is this happening? This is fallout from that assumption. Scott> Additionally, it seems I would also Scott> need to put the header files into the distribution, so I added: Scott> EXTRA_DIST = \ Scott>$(ISRC)/header1.h \ Scott>$(SSRC)/header2.h Yeah, you can do this. It's a bit more idiomatic to list header files in a _SOURCES somewhere. They are ignored for purposes of compilation, but are distributed. Tom
Re: Non-recursive make & maintenance issue
> "Jirka" == Jirka Hanika <[EMAIL PROTECTED]> writes: Jirka> My view is that these (and other) problems disappear if you use a Jirka> per-directory Makefile.am; but I also see the benefits (esp. compilation Jirka> speed) of a non-recursive Makefile. So the solution could be to support Jirka> generating a single Makefile from multiple Makefile.am's in Jirka> subdirectories. (Just kidding. But interested in seeing the reasons Jirka> why this is nearly impossible.) It isn't impossible. I once wrote up some ideas along these lines: http://sources.redhat.com/ml/automake/2001-07/msg00248.html Obviously I never got around to implementing this :-) Tom
Re: SUBDIRs and slashes
> "adl" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: "Marty" == Marty Leisner <[EMAIL PROTECTED]> writes: adl> [...] Marty> common/Makefile.am:1: directory should not contain `/' Marty> Just wondering for some thoughts on this matter...is Marty> there any reason to insist on single level source Marty> directories in recursive make... adl> I can't think of any. Back in the day there were two problems. First, automake's "dist" rule was not robust if there was a gap in the hierarchy. Second, autoconf wouldn't properly create intermediate directories in this same situation. I don't know whether either of these is still true today. The first one in particular has probably changed, I think I'm remembering our first "dist" implementation :-) Tom
Re: Non-recursive make & maintenance issue
> "Bob" == Bob Friesenhahn <[EMAIL PROTECTED]> writes: Bob> In other words, dealing with junk like Bob> apps_build_postgres_src_build_postgres_SOURCES Bob> is very tiring and failure prone. Is there a reason why it can't Bob> simply be Bob> apps/build-postgres/src/build-postgres_SOURCES ? Yeah, that does seem easier. One problem is that you might refer to these variables elsewhere in your Makefile, and it isn't clear that automake can reliably rewrite all uses. The initial reason for canonicalizing macro names was simply that automake was pretty dumb, and passed through its input directly to its output. Rewriting wasn't really possible. Tom
Re: [MAD SCIENCE EXPERIMENT]: Replace some libtool functionality with handcoded C
> "Alexandre" == Alexandre Oliva <[EMAIL PROTECTED]> writes: Alexandre> the *_OBJECT definitions assume the absence of shell-active Alexandre> characters in filenames, which is probably a safe Alexandre> assumption for Makefiles. It isn't unreasonable for a Java .class file's name to contain "$". libgcj only uses these for headers, but some other project might compile from .class to .o -- if they use an inner class, pow. Tom
Re: Emulating GNU Make conditionals, or: Is there a nice way to automatically set CFLAGS when make is run?
> "Dalibor" == Dalibor Topic <[EMAIL PROTECTED]> writes: Dalibor> They use make -DCHECK=1 to enable adding of special debuggin flags, Dalibor> for example, and make -DPROF=1 to add another set of flags to enable a Dalibor> build fro profiling. You can always add your own targets: debugging: $(MAKE) CFLAGS='-g ...' Then "make mostlyclean debugging" should work ok. This isn't completely robust in all situations -- if something in CFLAGS changes a decision that configure makes, then you must reconfigure. However, the above would work fine most of the time. Maybe I'm misunderstanding what you want? If you've got several common ways to build something, I suggest either building outside the source tree (so you can easily have multiple builds with different options -- this is what I do) or using ccache. Tom
automake -vs- huge projects
Tom Fitzsimmons (CCd) has been working on upgrading libgcj to use
newer auto* tools.  This has gone swimmingly, except one problem with
automake.

A little background.  libgcj is pretty big.  It has 2,243 ".java"
files at the moment.  Previously it has been using its own slightly
hacked automake 1.4.  It used to use its own "%" rules to handle
compiling .java (since 1.4 couldn't do this).  It is part of GCC,
which recently decided as a project that requiring GNU make is ok.

We have to use subdir-objects, both because nobody wants 2000 .o files
in "." and because we have unavoidable basename clashes between .java
files.  Also, we use a single top-level Makefile.am, as it is way more
convenient.

The problem is, automake generates an explicit rule for each
compilation.  Our resulting Makefile.in is nearly 9 megabytes.  This
is really much too large -- compare to 200K with automake 1.4.

One idea we had for a fix is to introduce a new "gnu-make" option that
would allow automake to generate code relying on GNU make.  Then we
could replace all those rules with a single "%.o: %.java".  I thought
this probably wouldn't be a popular idea, but figured I'd send it past
you just in case.  We definitely don't want to have a local fork of
automake any more -- it is too much of a pain, and GCC as a whole is
trying to update and standardize on one set of tools for the whole
tree.

Any other ideas for how to fix this problem?

In absence of a real fix, another option for us is to just hand-write
our Makefile.  I'd really rather not do that, either.  Though I
suppose if we start from Automake's output it won't be too awful -- we
can still preserve dependency tracking and the like.  The long term
maintenance on this is likely to be hard though.

Tom
Re: automake -vs- huge projects
> "adl" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: adl> Couldn't we use the (existing) .java.o: inference rule in this adl> case? Actually, is there a difference between `%.o: %.java' and adl> `.java.o:' beside portability? -- I'm not asking about the adl> general % construction, just about this precise case where both adl> sides are expected to be in the same directory. My recollection is that we tried this and ran into real problems with it. I don't remember what they are any more. There's an old thread on this: http://sources.redhat.com/ml/automake/1999-04/threads.html#00033 At this point I still believed that these suffix rules would work. I couldn't find the point where we changed things. adl> So the simplest part of the fix would be to disable the output adl> of explicit rule for subdirectory sources without per-target adl> flags when subdir-objects is used. Yeah. That would help us a lot, but... adl> The other side of the coin is that dependency tracking will not adl> work anymore, because the dependency stuff for subdir/X.o should adl> go into subdir/.deps/X.Po but the default suffix rule will put adl> it in ./.deps/subdir/X.Po. I don't see an easy way to fix this, adl> unless we add some clumsy shell computation in the suffix rules adl> (while this can probably be folded into depcomp when it is used, adl> it does not seem to fit well in fastdep rules). Hmm, maybe this is the issue from way back. We definitely want to keep dependency tracking. This is pretty important. fastdep would be great, since we know we'll always be using gcc. If somehow dropping fastdep would get the Makefile.in to a reasonable size, though, I'd be in favor of that. Alternatively running sed or whatever once or twice before the compilation isn't going to hurt as much as having a 9M Makefile.in. So we could just do the rewrite in the suffix rule and pay the cost. adl> Note that this issue is unrelated to the %.o:%.java vs. .java.o: adl> choice. Not completely, since GNU make might give us handy macros that would let us do this transformation in-line. Tom
Re: automake -vs- huge projects
> "adl" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: adl> Furthermore, generally it does not work to compile both the .o adl> and .lo objects of a source file (in the last example Automake adl> is expected to warn that these files are being built both with adl> and without Libtool), so it sounds safe to remove the rules adl> which are not used. This would be a great change, but unfortunately it only gets us down to 3M, which is still about 6x too large. Tom
Re: Usage
> "John" == John Darrington <[EMAIL PROTECTED]> writes: John> One particular problem is the way in which they modify each other's John> input files. After a while, your Makefile.am looks like this: John> SUBDIRS= intl m4 intl m4 intl m4 intl m4 intl m4 intl m4 John> intl m4 intl m4 Report this as a bug against whatever tool (presumably gettext) is modifying Makefile.am. Tom
Re: SUBDIR_OBJECTS option
> "John" == jling <[EMAIL PROTECTED]> writes: John> I read in one thread the mention of a SUBDIR_OBJECTS option in John> automake. Supposedly this would prevent intermediate object files from John> ending up in the directory of the Makefile (I'm trying to use a non- John> recursive Makefile.am). John> Where and how is this option to be specified? Put `subdir-objects' into the AUTOMAKE_OPTIONS variable in Makefile.am. See the manual for this and other information... Tom
Re: automake -vs- huge projects
>>>>> "adl" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: adl> I've found this: adl> 1999-11-22 Tom Tromey <[EMAIL PROTECTED]> adl> * automake.in (handle_single_transform_list): Generate explicit adl> rule for subdir objects. Fixes new addition to subobj.test. I looked into this a bit today. One nice thing about having a patch list is that it records the rationale for changes somewhere... back in those days that sort of information was just lost. Sigh. I suppose the best we can hope for is to try out your change on a different platforms and in different scenarios and hope for the best... Tom
Re: automake -vs- huge projects (1st patch)
> "Tom" == Thomas Fitzsimmons <[EMAIL PROTECTED]> writes: [ suggestions ] Tom> Anyway, this patch brings us closer to using automake-1.8 for libgcj. Tom> Thanks! I think all the patches are in now. Could you try CVS automake and see how big the resulting Makefile.in is? Tom
Re: .cpp to .c
> "Bob" == Bob Lockie <[EMAIL PROTECTED]> writes: Bob> I have: Bob> arson_SOURCES = arson.cpp Bob> in Makefile.am Bob> and this is changed in Makefile.in Bob> arson_SOURCES = arson.c Bob> Any idea why my .cpp is changed to .c? No, that shouldn't happen. Do you have a small test case? Tom
Re: automake -vs- huge projects
> "Bob" == Bob Friesenhahn <[EMAIL PROTECTED]> writes: >> $(CC) -c -o $@ `$(CYGPATH_W) $<` Bob> An simple (but ugly) approach would be to define $(CYGPATH_W) to Bob> 'echo' when not compiling under Cygwin. We already have much worse hacks. But ideally we'd be reducing the amount of weird code we generate, trying to streamline compilations as much as possible (hence the fast-dep stuff). Bob> It seems to me that perhaps 'CYGPATH_W' is misnamed or the use is Bob> outdated. It's entirely possible. This use probably dates to the original Cygwin port of automake and most likely has been carried forward without examination or modification since then. Tom
Re: Principles of a developing from a dist or install?
> "John" == jling <[EMAIL PROTECTED]> writes: John> Is there any sense in me having the user install the package (i.e. do John> a 'make install') and then have them develop off of the code in the John> install directory? ... assuming I have the source code and headers John> copied over during the install process. I can't say whether or not it makes sense. It isn't the standard thing, though, that's for sure. The typical approach is that "make install" simply installs the output of the build process, the idea being that the user can save some space by then deleting the source and build trees, if he so desires. Users who wish to hack on the package usually use the unpacked tree, both because it is already built (and therefore a simple change might involve less recompilation) and because it means these users and the maintainers will have a similar build structure and directory layout (so patches will apply more easily). John> Nevertheless what is the best way or ideal way to proceed? The general idea is to have one single structure used by everybody. This means less testing, less trouble applying patches, etc. People can generally navigate any sensible source layout, no matter how deep; I wouldn't normally consider that a big hurdle. Tom
Re: automake -vs- huge projects
> "Paul" == Paul D Smith <[EMAIL PROTECTED]> writes: Paul> If you're willing to require GNU make then I'm quite confidant Paul> you could write automake as nothing more than a suite of GNU Paul> make macros and functions. Yeah. One idea for the post-auto* world is "let's beef up GNU make and use it as the basis for the next-generation system". This approach has the benefits that most build scripts are already written for make and that GNU make already does a lot of what is needed. The competing idea (which seems to have more mindshare -- not necessarily making it more viable) is to write something brand new, typically in your-favorite-scripting-language, thus avoiding the less pleasant aspects of make (syntax, primarily, but also the flat namespace). From time to time I work on a tool following this approach, but there are several of these out there. We did consider just writing our own Makefile.in for libgcj using GNU make features. It seemed easier to try to fix the size problem though. Tom
Re: utility programs used during build
> "Warren" == Warren Turkal <[EMAIL PROTECTED]> writes: Warren> Is there any analysis on what it would take to create utility Warren> programs that are only used during build in a crosscompiled Warren> environment in automake? Warren> I and working on the libX11 for Freedesktop.org and it builds Warren> a file and uses it during installation, but does not install Warren> it. I am under the impression that automake does not support Warren> this right now. What would be needed to add support for this Warren> feature. I think your plan as posted on the patch list is a good start. I always liked the gcc-style names like `CC_FOR_BUILD', `CFLAGS_FOR_BUILD'; they always seemed the easiest to understand. (Other folks have suggested BUILD_CC, BUILD_CFLAGS, etc. Which, really, aren't that much harder.) Having a `build_' prefix for primaries makes sense to me. For autoconf it would be nice if you could tell it "now I want to do checks on the host compiler" and have it change the definition of CC (and other flags), have a new build-specific config.h, etc. The required changes might be extensive. I agree with Alexandre that we don't have to support target, just host and build. Well, really it might be nice to clean up target library support, but I wouldn't recommend it unless you have a real need; it is pretty messy. Tom
Re: make distcheck problem
> "Lars" == Lars Hecking <[EMAIL PROTECTED]> writes: Lars> if BUILD_SRC_BEOS_SUBDIR Lars> d_beos = beos Lars> endif Lars> SUBDIRS = $(d_beos) Lars> If I run make distcheck in the top level directory, it bombs out at Lars> one point because the beos subdir doesn't exist. Is this a bug in Lars> automake? Is there any way to work around it? I am not running on Use `DIST_SUBDIRS'. Tom
Re: utility programs used during build
> "Ralf" == Ralf Corsepius <[EMAIL PROTECTED]> writes: >> > If you want a clean way, you'd have to split buildtools and >> > host-packages into separate (sub) packages and write a costomized >> > toplevel configure-script to parse and set the configure options for >> > build- and host- compile packages. Ralf> This is the current nominal working principle, as it is applied by Ralf> packages which actually support cross-compilation (gcc, newlib, Ralf> binutils, gdb etc.). Hmm, I think we're mixing scenarios. In gcc, for instance, ordinarily target libraries are put in their own directories with their own configuration. And there is also a surrounding layer of hackery to deal with multilibs. But that isn't what Warren is talking about. He's talking about a situation where you want to build your package for a different host, but first build some helper programs on the build machine to create other parts of your program. E.g., in gcc there are the gen* family of programs, like genattrtab. These are just incorporated in the gcc source directory along with files that will be compiled for the host machine, not the build machine. My opinion on this is that total separation is easier to implement, but not really cleaner. "Clean" depends on the needs of the package at hand, sometimes you'd really rather just lump all the sources together. Alexandre's simple solution of overriding _CC and the like is nice. I think at least one part of this must be handled automatically, and that is the selection of EXEEXT, which can differ between build and host. And really my preference would be to have it all done automatically, since that is easier for the user and less error-prone... still, it looks like the same internal mechanisms are necessary to support build compilers and per-target compilers. Anyway, it looks like there's a big job ahead for Warren :-). Tom
Re: release schedule for 1.9? (Was: Re: automake -vs- huge projects (1st patch))
> "adl" == Alexandre Duret-Lutz <[EMAIL PROTECTED]> writes: adl> Also, since we have switched to API-numbering, bumping that adl> version number has a cost. For instance Debian distributes adl> automake1.4, automake1.6, automake1.7, and automake1.8. If we adl> add another API, it'd better be worth it. Yeah. It turns out that Red Hat still ships 1.5 because some packages still depend on it. Sigh. Obviously this doesn't scale -- at some point it has to be so painless to upgrade that someone like Debian or Red Hat can simply ditch 1.x and convert everything to 1.x+1 all at once. adl> Maybe we could release an "official snapshot" of HEAD? This may adl> also help to better test these uncertain subdir-suffix-rules. adl> Would that be enough? It might. Unfortunately I don't think we can unilaterally make a decision like this. We'll have to involve the other gcc maintainers. I think the ideal for gcc is to have the entire tree requiring a single released version of automake. But, we'll never know if we can do it until we try :-) Tom
Re: pathnames containing spaces
> "Russ" == Russ Allbery <[EMAIL PROTECTED]> writes: Russ> make uses a space as a separator, and getting it to accept spaces in file Russ> names is extremely difficult or impossible depending on the version of Russ> make that you're using. Yeah, and the problem is made worse because quoting for make isn't the same as quoting for the shell. You have to double quote things, sometimes in creative ways -- and automake likes to re-use user input, so in some situations doing this is probably just impossible. The problems aren't just with spaces, dollar signs and colons also cause fits. Tom
Re: FEATURE REQUEST: Uninstall script should be created by AutoMake.
> "Hans" == Hans Deragon <[EMAIL PROTECTED]> writes: Hans>Automake should create a script that simply contains all the "rm" Hans>commands and have it installed with the other binaries. You could write a program to do this, if you wanted to experiment with it. You would run `make -n uninstall' and set the variables NORMAL_UNINSTALL, PRE_UNINSTALL, and POST_UNINSTALL to magic strings so that you can determine command boundaries. This would work for a lot of packages, but not all of them, since most custom uninstall targets (if they exist, an already doubtful proposition) probably don't use these correctly. See the GNU Coding Standards for more info. Tom
Re: Support for precompiled C++ headers
> "Roberto" == Roberto Bagnara <[EMAIL PROTECTED]> writes: Roberto> Can anyone point me to a C++ project that is working with Roberto> precompiled headers and that is doing it with the currently Roberto> available versions of automake and autoconf? >From the gcjx project on sourceforge: BUILT_SOURCES = [...] typedefs.hh.gch headers = [...] ## FIXME: need a better way to convince libtool to let us do this. typedefs.hh.gch: $(headers) -rm -f typedefs.hh.gch $(CXXCOMPILE) -fPIC -DPIC -x c++-header -o typedefs.hh.gch $(srcdir)/typedefs.hh As you can see I had to hack around libtool. I didn't try a more direct approach, perhaps it would have worked. Also, I always build this project with --disable-static. Tom
Re: AM_CPPFLAGS not applied for CHECK programs?
> "Bob" == Bob Friesenhahn <[EMAIL PROTECTED]> writes: Bob> As a follow-up to this posting, I see that when Automake generates a Bob> specific rule for a target built in a subdirectory, it forgets to Bob> include $(AM_CPPFLAGS). This is a serious error. This is documented in the 'Program and Library Variables' node: In compilations with per-target flags, the ordinary `AM_' form of the flags variable is _not_ automatically included in the compilation (however, the user form of the variable _is_ included). So for instance, if you want the hypothetical `maude' compilations to also use the value of `AM_CFLAGS', you would need to write: maude_CFLAGS = ... your flags ... $(AM_CFLAGS) As I recall we made this choice so that people could set a global default and then override it for specific programs. Tom
Re: non-recursive automake advice?
> "tom" == tom fogal <[EMAIL PROTECTED]> writes: tom> Basically I'd like each module to build their own libtool convenience tom> library, and then have /src/Makefile.am link all of those modules' tom> convenience libraries into one that is the union of all of them. Do you really want each separate convenience library, or is that just inertia? I ask because if you don't want separate ones, the task becomes even simpler. tom> Without recursive make this is a little strange, but I've devised a tom> scheme to make each module 'feel' like it has its own, local tom> Makefile.am, by playing with includes. tom> libAll_la_LIBADD = \ tom> $(srcdir)/models/libModels.la \ tom> $(srcdir)/share/libShare.la \ tom> $(srcdir)/input/libInput.la \ tom> $(srcdir)/libCur.la You don't want $(srcdir) here. The .la files are in the build tree, not the source tree. Just write: libAll_la_LIBADD = \ models/libModels.la \ share/libShare.la \ input/libInput.la \ libCur.la tom> Unfortunately in the 'Inc.am' files I need to remember to qualify every tom> filename with not just '$(srcdir)', but tom> '$(srcdir)/modules_directory_name/'. This is only a minor annoyance tom> and definitely worth the trouble, but perhaps I am missing something? Once upon a time I had a plan to introduce a new 'import' statement, that would work like 'include' but magically rewrite things like this as needed. That would make it really simple to do the kind of thing you're trying to do. It was fairly complex, though, and in the end I lost interest... tom> noinst_LTLIBRARIES += $(srcdir)/models/libModels.la (Likewise no srcdir here) Tom
Re: non-recursive automake advice?
> "Bob" == Bob Friesenhahn <[EMAIL PROTECTED]> writes: Bob> Note that the messages appear to indicate that Automake does recurse Bob> once regardless. Some features require a $(MAKE) invocation in the same directory. Offhand I forget what. As I recall, removing this would be tricky. Tom
Re: Portable prefix pattern rules
> "Brian" == Brian <[EMAIL PROTECTED]> writes: Brian> The following doesn't seem to work: Brian> SUFFIXES = .moc.cpp I have never tried it but it is somewhat hard to imagine some versions of make accepting a suffix with two '.'s in it. Brian> The only other alternative I see is to enumerate a rule Brian> containing the actual file names for every single .h to Brian> .moc.cpp conversion, of which there are hundreds. Yeah, this is ugly but it works. IMO, and this is most likely a controversial opinion, it would be reasonable for automake to have an option allowing it to generate (and recognize in Makefile.am) code specific to GNU make. Tom
Re: Portable prefix pattern rules
> "Brian" == Brian <[EMAIL PROTECTED]> writes: Brian> If the autotools were to recognize these pattern rules, scan Brian> the source and automatically generate portable rules for me, I Brian> would be a very happy customer indeed :) Sorry, I thought that was what we were talking about. In terms of just using it, yeah, this doesn't work atm. Having automake recognize GNU make-style '%' rules isn't totally out of the question, IMO. It might be tricky to make this totally reliable, I'm not sure. (E.g., if you considered something like extending the built-in dependency tracking code to support user rules...) Alternatively, code to directly support moc would also be fine. Tom
Re: why does make install depend upon all?
> "Harald" == Harald Dunkel <[EMAIL PROTECTED]> writes: Harald> Please see subject. Of course I would agree that this Harald> dependency is usually a good thing, but sometimes it might Harald> be helpfull to do a 'make install' for another prefix e.g. Harald> in your stow directory without verifying all the dependencies Harald> again. This is traditional; there is no hard requirement for it in the spec, it is just the "way things were done", at least in the GNU world as I understood it back then. Tom
Re: compile not copied? Why?
> "Harald" == Harald Dunkel <[EMAIL PROTECTED]> writes: Harald> What is the criteria for copying the compile script into Harald> the source directory tree? I have some *.cc code, it is Harald> mentioned in my Makefile.am file, configure detects that Harald> the compile script must be used, too, but Automake doesn't Harald> provide it. Did you try 'automake -a'? Tom
Re: what happens to EXTRA_DIST during distcheck?
> "Ed" == Ed Hartnett <[EMAIL PROTECTED]> writes: Ed> In my top level makefile I have an EXTRA_DIST: Ed> # These files get added to the distribution. Ed> EXTRA_DIST = README COPYRIGHT RELEASE_NOTES Ed> But looking at the _build directory created during make distcheck, I Ed> do not see any of these files: They don't get copied to the build directory. They are put in the final .tar.gz that is the distributable result. Tom
Re: Built sources always regenerated
> "Braden" == Braden McDaniel <[EMAIL PROTECTED]> writes: Braden> Forget about BUILT_SOURCES and *_DEPENDENCIES. The sources I'm building Braden> get #include'd by browser.cpp. As such, checking of browser.cpp's Braden> dependencies should cause them to get (re)generated, right? Braden> But it doesn't. If I remove BUILT_SOURCES, the files don't get Braden> generated before browser.cpp gets compiled. Why not? Automake dependency tracking information is computed as a side effect of compilation. So, the first time things are compiled, it has no way of knowing about dependencies on generated files. BUILT_SOURCES is a hack to get around this. It basically inserts a 'make $(BUILT_SOURCES)' before targets like 'all'. (It would be nice to have per-{executable,library} BUILT_SOURCES...) I didn't look into this too deeply but I would guess that it is more make-related than automake-related. You can add rules to the Makefile.am to help with debugging, eg: hack: $(MAKE) $(BUILT_SOURCES) Then in theory 'make hack' should reproduce the problem you're seeing. If that doesn't happen then something weird and perhaps automake-related is going on. If it does happen, you can try plain old Makefile debugging with 'make -d'. Tom
Re: spaces in package version
> ">" == H Nanosecond writes: >> I encountered pavuk-0.9pl24 a url download utility, >> it has a space in the version name in the Init: >> AM_INIT_AUTOMAKE($PACKAGE, "$VERSION $host_alias") >> and this made make distcheck choke from not quoting in shell commands. >> So my question is what is the ruling on spaces in version names? >> Allowed or disallowed? It never occurred to me that somebody would do that. I would say that this is de facto disallowed. It seems to me that the only reasons to support it would be pedantic ones. In practice you can just use a "-" or something else. Tom
Re: pushing autoconf a bit too much?
> "Mark" == Mark Galassi <[EMAIL PROTECTED]> writes: Mark> lib_LTLIBRARIES = libgenie.la @SWIG_LIBRARIES@ Mark> EXTRA_LTLIBRARIES = libge.la libgeperl.la libgeguile.la This is right. Mark> /bin/sh ../libtool --mode=install /usr/bin/install -c libge.la /tmp/junk/lib/libge.la Automake uses a special libtool invocation to install libtool libraries. This insulates automake from details of libtool's implementation. Mark> The *unconditional* library, which is libgenie.la, installed just fine Mark> (it had a .libs/lai file), but the *conditional* libraries in Mark> @SWIG_LIBRARIES@ (in this case libge.la and libgeperl.la) do not Mark> install, I presume because there is no .libs/libge.lai and Mark> .libs/libgeperl.lai. I believe you have to somehow notify libtool that you plan to install these libraries. I think adding a -rpath to the link line will do it. When it can do so, automake discovers this information statically -- but in an EXTRA_ case it cannot. In your situation I recommend using automake conditionals and not @SWIG_LIBRARIES@. This will simplify your life. if SWIG lib_LTLIBRARIES = libgenie.la libge.la libgeperl.la libgeguile.la else lib_LTLIBRARIES = libgenie.la endif Tom
Re: pushing autoconf a bit too much?
>> if SWIG
>> lib_LTLIBRARIES = libgenie.la libge.la libgeperl.la libgeguile.la
>> else
>> lib_LTLIBRARIES = libgenie.la
>> endif

Mark> I was trying that, and I get these two messages, the first seems
Mark> incorrect and the second is very obscure:

Mark> [rosalia@odie genie]$ automake -a
Mark> automake: libgenie/Makefile.am: `libgenie.la' is already going to be installed in `lib'
Mark> libgenie/Makefile.am: libgenie_la_OBJECTS should not be defined
Mark> [rosalia@odie genie]$

Sorry about that.  These are bugs in 1.4.  They're fixed in cvs, I
think.  Try this instead:

if SWIG
extra = libge.la libgeperl.la libgeguile.la
endif
lib_LTLIBRARIES = libgenie.la $(extra)

T
Re: Problem with defining a new automake target (perhaps a general make question)
> "Andrew" == Andrew S Townley <[EMAIL PROTECTED]> writes: Andrew> Hi. I have been trying to add support for our ESQL/C Andrew> preprocessor to an Automake makefile, but I'm having trouble Andrew> with a couple of different things. Andrew> The extra sed command is there so that the dependencies will Andrew> actually be set to the .ec file instead of the generated .c Andrew> file and the generated file is removed at the end. This works Andrew> great as long as the esql command completes successfully. If Andrew> the esql command fails, the generated file is not removed and Andrew> the next time the compile takes place, it doesn't reprocess Andrew> the .ec file due to the implicit rules. You're saying that if the esql command fails, it still leaves the .c file, and so the .c.o rule is run the next time? Oops. Programs like this should always remove their temporary files. You could write a shell script to wrap esql and remove the file. Or you could do it in the Makefile rule, but that would be more ugly: esql ... ; status=$$?; rm -f temp-file > /dev/null 2>&1; exit $$status Andrew> I guess my questions boil down to: 1) is this the right way to Andrew> integrate a new target into the automake dependency rules, and Andrew> 2) without adding suffix rules for .ec -> .c -> .o, is there a Andrew> better way to do it? Your approach looks fine to me, except for the temp file thing. Andrew> With regards to the 1st question, I am wondering if the Andrew> solution is linux-specific and therefore won't work on other Andrew> platforms with non GNU C compilers. It is gcc-specific, not Linux-specific. The -Wp stuff is a gcc thing. You can just eliminate that if you don't mind getting rid of dependency tracking. There's no real way to have both without hacking automake itself :-( (At least, not in 1.4.) Tom
Re: Complete Newbie desperately seeking help...
> "Mike" == Mike Reilly <[EMAIL PROTECTED]> writes: Mike> As root, I ran "tar -zxvf automake-1.4.tar.gz". I Mike> then went into the directory that was created, ran Mike> ./configure, make and then make install. Mike> When I checked the version, however, I found that I Mike> still had automake 1.3 installed. By default configure will put automake in /usr/local. You probably don't have /usr/local/bin in your path. Or, if you do, you probably didn't rehash ("hash -r" for bash) Tom
Re: suggestion for y.tab.c vs y_tab.c
> "Mark" == Mark E <[EMAIL PROTECTED]> writes: Mark> Could an equivalent change be considered to automake.in so Mark> DJGPP's bison will work? While not the best solution, it's the Mark> best one I can think of until there is suitable macro in Mark> Autoconf to deal with this. I'm not familiar with the language Mark> used in automake.in but from what I see I could probably make a Mark> patch if need be. Thanks. As you say, I'd really like to see a nice autoconf macro to do this. However, in the meanwhile this change is ok with me. I haven't been hacking automake much lately :-( It would be best if you sent in a patch. Tom
Re: DEFS vs. INCLUDES
> "Linus" == Linus Nordberg <[EMAIL PROTECTED]> writes: Linus> What's the reason behind putting all `-I' stuff in DEFS? If you mean in Makefile.in, it is laziness. Linux> I would like to pass a few -D to m4, but don't want the `-I':s. Use INCLUDES to hard-code your own options. You can (and should) use AM_CLFAGS on newere versions. Linus> DEFS = @DEFS@ -DMY_DEF Don't redefine DEFS. Linus> The background is that we want to preprocess some assembly code Linus> with m4 before assembling it. It needs -DPIC for .lo. Is Linus> there a way of telling libtool how to do that? I doubt that there is a way to tell libtool how to do this. Tom
Re: DEFS vs. INCLUDES
Linus> hmpf! :)

[ ... ]

Linus> (AM_CLFAGS, hmmm... American C language fags? :-)

It is early here.  My fingers aren't awake yet.

Linus> I don't want to pass any `-I' to m4 since some m4's don't like that,
Linus> so using INCLUDES is no good.  How should AM_CFLAGS be used?

Ah.  Don't use DEFS or AM_CFLAGS for m4.  Introduce a new macro and
use it in your rule which runs m4, e.g. AM_M4FLAGS.

Tom
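A sketch of what that might look like in Makefile.am (the .s4 suffix,
the rule, and the flags are all hypothetical; an M4 variable is
assumed to be substituted by configure):

AM_M4FLAGS = -DMY_DEF -DPIC

SUFFIXES = .s4

# Preprocess m4 assembly sources (.s4) into plain assembly before the
# normal .s handling takes over.
.s4.s:
	$(M4) $(AM_M4FLAGS) $< > $@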
Re: tar command executed during make dist
> "Merijn" == mdejonge <[EMAIL PROTECTED]> writes: Merijn> I'm new to this mailing list so I don't know whether to Merijn> following has been discussed before (At least I couldn't Merijn> find anything on the mailing list). It has been, but it may have been quite a while ago. Merijn> During a "make dist" the command "$(TAR) chozf" is executed. Merijn> Unfortunately, the "z" option (which causes tar to execute Merijn> gzip) is GNU tar specific while the command "tar" is not. This issue is completely resolved in the next automake. Meanwhile, you can write your own dist target: my-dist: distdir ... do whatever ... Tom
Re: tar command executed during make dist
> "Lars" == Lars J Aas <[EMAIL PROTECTED]> writes: Lars> Any point in using the f flag at all in such cases? Lars> => $(TAR) cho $(distdir) | $(GZIP) -c > $(distdir).tar.gz I thought some versions of tar defaulted to something like /dev/rmt8 (or some other equally obscure device). It's hard to remember. Anyway, there's no harm in using it. T
Re: Single line dependency lists become too long
> "Merijn" == Merijn de Jonge <[EMAIL PROTECTED]> writes: Merijn> In the Makefile.am I have listed these dependencies on Merijn> multiple lines, Automake generates a Makefile.in which Merijn> contains all 356 file names on a single line. This line is too Merijn> long for configure to instantiate and the result is a Merijn> truncated line of dependencies. This is a bug in automake 1.4. It is fixed in the cvs automake. Tom
Re: what goes in a distribution.
> "Todd" == Todd Dukes <[EMAIL PROTECTED]> writes: Todd> I was disapointed that pkgdata_DATA files are not included Todd> in a distribution without also being listed in EXTRA_DIST. Todd> Is this a bug? The section in the manual says EXTRA_DIST is Todd> for files that aren't installed by the regular rules, but it Todd> never says what the regular rules are. It isn't a bug. The regular rule is that most things aren't included in the package by default. I think only _HEADERS might be. In the next version you'll be able to control this more easily: dist_pkgdata_DATA = ... nodist_pkgdata_DATA = ... This is already implemented in the cvs verson. Tom
Re: Single line dependency lists become too long
Merijn> This line is too long for configure to instantiate and the
Merijn> result is a truncated line of dependencies.

Akim> What happens?  Do you know where this limitation comes from?

Typically sed chokes when building Makefile from Makefile.in.  I
forgot about this when making some changes in Automake's front end.
They have since been fixed.

Tom
Re: Automake security problem
Jim> Here's an untested patch.

I'll look at this soon.

Jim> BTW, Tom, what about that last patch I sent in (testing for close
Jim> failure)?

Yeah :-(  I haven't had much automake hacking time for a while
(again).  I guess I'd like to get more people checkin rights to make
up for my failings.  Jim, if you're interested, tell me, and I'll send
you the information.

Tom
Re: REPLACE_GNU_GETOPT
> "Hal" == Duston, Hal <[EMAIL PROTECTED]> writes: Hal> I am setting my package up with automake/autoconf, and want to Hal> provide gnu getopt if it is not available. I noticed Hal> AC_REPLACE_GNU_GETOPT in the info file, but it doesn't seem to be Hal> completely implemented. It is in automake, and the info file, Hal> but nowhere else I could see. Is this functionality going away? Hal> How can I properly do what I want to do? I don't know perl, so I Hal> am somewhat limited in my ability to read the source. This stuff you found is sort of a relic from the old days, when automake was more or less used only by the Gnits people. AC_REPLACE_GNU_GETOPT was never in automake, as I recall. You just had to know where to get it :-( Perhaps Jim has a copy. Jim? Tom
RE: REPLACE_GNU_GETOPT
Hal> That would be something I would install on my development system?
Hal> Would I need to put it somewhere special so it doesn't go away if
Hal> I upgrade automake?  I would really rather do it the "proper" way
Hal> if true.  Or is getting a copy the "proper" way?  Still trying to
Hal> get my brain wrapped around this stuff.

Probably the best way would be to get a copy and put it into your
acinclude.m4.  I doubt this macro changes with any frequency.

Tom
Re: numeral for first character--
> "Olly" == Olly Betts <[EMAIL PROTECTED]> writes: >> localhost:~/src/4dim$ automake >> Makefile.am:2: bad macro name '4dim_SOURCES' Olly> I hit this problem a few months ago. It's an unnecessary Olly> restriction which is fixed in the CVS version of automake. Olly> If you'd prefer not to use that you can probably work around by Olly> building your program as "x4dim" and then fiddling with the name Olly> before it gets installed. You can also hack your copy of automake to let you do this. As I recall that patch is quite simple, involving a change to a single regexp up at the top of automake.in. Tom
Re: Automake with different compilers
> "Carl" == Carl van Schaik <[EMAIL PROTECTED]> writes: Carl> I am at present trying to get automake to build a library that Carl> contains various source files (.c, .s) that I want to compile Carl> with different compilers. Automake no matter what I have done, Carl> sees the suffix (.c, .s) and automatically starts to compile it Carl> with gcc. If I force the $(COMPILE) variable, the as complains Carl> about the "-c" which is inserted for gcc. Carl> Is there any nice way to fix this? I'm not sure I understand what you are asking. Are you asking how to make it so .s files aren't compiled with the C compiler? I agree this is an automake bug (of long standing). You can probably make your own ".s.o" rule. Or make your own rule on a per-object basis. Tom
Re: test pr19 failure: explanation
> "Jim" == Jim Meyering <[EMAIL PROTECTED]> writes: Jim> I've just looked into the failure of test pr19. I don't understand. pr19.test works fine for me. Tom
Re: Automake with different compilers
> "Carl" == Carl van Schaik <[EMAIL PROTECTED]> writes: Carl> Also, there any way to get automake to compile a .c file to .o Carl> and not make a library or program out of it? Carl> I'm doing some cross-compiling stuff that only seems to work if Carl> I compile files to .o and use the linker to create a Carl> binary... this is a pain in automake ... You can override the linker on a per-executable basis. This might be what you want to do. Failing that, here is the standard way to build a .o without building anything else: EXTRA_LIBRARIES = libdummy.a libdummy_a_SOURCES = my-program.c Then arrange for my-program.o to be a target somehow, e.g., arrange for it to be installed. Tom
Re: RPM targets
> "Michael" == Michael Bletzinger <[EMAIL PROTECTED]> writes: Michael> Is anyone working on creating rpm targets similar to "make Michael> dist". I was thinking that automake could for example Michael> generate the spec file from a template. I occasionally hear about people wanting this, but I don't recall seeing any implementation. Tom
Re: Patch: $(wildcard *.c) style globbing
> "John" == John Fremlin <[EMAIL PROTECTED]> writes: John> This patch adds the ability to use partial $(wildcard *.h) John> (GNU?) make-style syntax in automake variable declarations. It John> is useful for projects where the files for each target are are John> organised in a single directory. This comes up periodically. I'm still against it, I'm afraid. Tom
Re: Why does Makefile depend on BUILT_SOURCES?
> ">" == OKUJI Yoshinori <[EMAIL PROTECTED]> writes: >> 2. The target `Makefile' depends on $(BUILT_SOURCES). >> The solution may be to get rid of the dependency in the item 2, >> so my question is why Makefile must depend on BUILT_SOURCES. Is >> there some good reason? Only a historical one. In the old days we computed dependencies early, so we had to make sure that built sources were built before the dependency-generation step. Now I think we don't need this. I'm in favor of removing this. Send a patch... maybe Jim will apply it :-) Tom
Re: Why does Makefile depend on BUILT_SOURCES?
>> Ok, I will. BTW, does Jim now have the maintainership? I thought >> you were the maintainer. I am, but I haven't had much time to check things in lately. Neither has Alexandre, I think. I'm hoping Jim will have the time. T
Re: test pr19 failure: explanation
Pavel> In fact, the failures are caused by other tests that Pavel> (erroneously!) copy install-sh, missing and mkinstalldirs to Pavel> the test directory. Thanks. I'm seeing this now. I'll look into it soon. Tom
Re: nodist_foo_HEADERS?
Harlan> Anybody have an idea of how much trouble it will be to accept Harlan> the nodist_ prefix on HEADERS? If you're using the cvs automake, then I'm surprised it doesn't work. Could you write a test case (in the style of the test suite)? Tom
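For reference, the usage in question is something like this in Makefile.am (generated.h and public.h are just made-up names; the nodist_ variant keeps the generated header out of the distribution):

    include_HEADERS = public.h
    nodist_include_HEADERS = generated.h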
Re: derived sources and parallel builds
> "Alex" == Alex Hornby <[EMAIL PROTECTED]> writes: Alex> This is a problem as Foo_real.cpp includes Foo_derived.h. Alex> How can I get a result similar to the old BUILT_SOURCES, where Alex> the commands to produce the derived source files for a target Alex> are run _before_ the non derived source files of that target are Alex> built? Write dependencies that express what you mean: Foo_real.o Foo_real.lo : Foo_derived.h I agree this isn't ideal. Suggestions welcome, patches more so. Alex> Is the solution to always make dependancies before a parallel Alex> build commences? No. We used to do that and it has undesirable consequences. (Well, our implementation did. But I do think our present direction is the better one.) Tom
Re: derived sources and parallel builds
> "Alex" == Alex Hornby <[EMAIL PROTECTED]> writes: >> How about just creating standard Makefile dependencies? >> >> Foo_impl.o: Foo_s.hh Foo_c.hh Foo_s.cpp Foo_c.cpp >> Foo_impl.$(OBJEXT): Foo_s.hh Foo_c.hh Foo_s.cpp Foo_c.cpp Alex> That would work. I wanted to avoid adding dependencies manually Alex> because a) I'm lazy :) and b) parallel builds are somewhat Alex> non-deterministic so I could never be sure we'd covered them all. Yep, it is a problem, and it gets larger the more auto-generated headers you have. I agree it would be nice to say "build these files before doing anything else". That's sort of a pain with make. One idea would be to introduce an "all-hook" which would be run before "all". This solves the common problem, but not the full one (what if the user runs "make some-executable"?). Tom
Re: bcc55 support ?
> "Ionutz" == Ionutz Borcoman <[EMAIL PROTECTED]> writes: Ionutz> Is automake going to support bcc55 ? Or is there a way to create the Ionutz> Makefile for Win$$ + bcc55 from Makefile.am ? I don't know what bcc55 is. I'm not against supporting it. We already support Microsoft's compiler, I think. Somebody else will have to do the work, of course. Tom
Re: yacc and flex
> "Sascha" == Sascha Ziemann <[EMAIL PROTECTED]> writes: Sascha> flex conf_lexer.l && mv lex.yy.c conf_lexer.c Sascha> gcc -c conf_lexer.c Sascha> bison -y conf_parser.y && mv y.tab.c conf_parser.c Sascha> gcc -c conf_parser.c Sascha> And this means, that the tokens are not available in the Sascha> conf_lexer.c file. How can I solve this problem? Add an explicit dependency: conf_parser.o: conf_lexer.c We're trying to come up with a better solution for this problem. Tom
Re: derived sources and parallel builds
> "Alex" == Alex Hornby <[EMAIL PROTECTED]> writes: >> I agree it would be nice to say "build these files before doing >> anything else". That's sort of a pain with make. One idea would >> be to introduce an "all-hook" which would be run before "all". >> This solves the common problem, but not the full one (what if the >> user runs "make some-executable"?). Alex> How about generating per executable/library hooks? That way both Alex> the make all and make some-executable cases are covered. This might be the best way to go. It is sort of a pain to implement; for each such exe we'd have to do this: exe: dummy build the exe dummy: $(MAKE) exe-hook $(MAKE) actual dependencies for exe here T
Re: flex -lfl
> "Sascha" == Sascha Ziemann <[EMAIL PROTECTED]> writes: Sascha> bin_PROGRAMS = rlpd rlpc Sascha> rlpd_SOURCES = rlpd.c rlp.c errwrap.c conf_parser.y conf_lexer.l Sascha> rlpc_SOURCES = rlpc.c rlp.c errwrap.c Sascha> How can I prevent the second program (rlpc) from beeing linked Sascha> with the flex lib? Try setting rlpc_LDADD (I think -- I don't look at the lex stuff much). T
Re: flex -lfl
Sascha> I don't specify -lfl and I don't use LIBS, so I can not use Sascha> LIBADD instead. I don't understand this. Anyway, the current automake doesn't force you to link with @LEXLIB@. Neither did automake 1.4, if I remember correctly. So I don't understand why you are seeing -lfl on your link lines. Can you find out where it is coming from? Sascha> - How can I check for flex and bison instead of lex and yacc? I use Sascha> some flex/bison specific features, which would not work with Sascha> lex/yacc. There's no really good way. Try doing "LEX=flex" and "YACC=bison" in configure.in before running the macros. Bleah. Sascha> - How can I get the evaluated value of sysconfdir into a file? When I Sascha> put @sysconfdir@ in a .in file, it will be replaced by Sascha> "$(prefix)/etc". And this does not help much in a header file for Sascha> example. Use sed in Makefile.am. If we had a FAQ, this would be one. I could easily set up faq-o-matic for automake. Would anybody find this useful? Sascha> - Is there something like YACCFLAGS, that can be used for the -d Sascha> option of yacc? AM_YFLAGS Tom
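For the sysconfdir question, the usual sed idiom looks like this in Makefile.am (a sketch; myscript is a placeholder, the command must be indented with a tab, and the [@] quoting is there so config.status doesn't substitute the pattern inside Makefile.in itself):

    myscript: myscript.in Makefile
	sed -e 's,[@]sysconfdir[@],$(sysconfdir),g' $(srcdir)/myscript.in > myscript

Depending on Makefile means the file is regenerated if you re-configure with a different prefix.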
Re: Failing cond5.test with "make check"
Chris> When I run "make check" when installing automake-1.4, Chris> it passes all 194 tests, except for the cond5.test. Chris> How do I fix this? This test tests for a bug that used to cause automake to hang. The test works by running automake and then waiting a while to see if it terminates. This doesn't work too well if the machine is slow for some reason. In automake 1.4 the test waited for 5 seconds; in the cvs version it waits for 15 seconds. So, the probable answer is "don't worry about it". Tom
Re: the appropriate location for depcomp
> "Okuji" == OKUJI Yoshinori <[EMAIL PROTECTED]> writes: Okuji> Automake requires some helper scripts such as `depcomp', but there Okuji> is an inconsistency. The documentation tells me that they are put in Okuji> the top directory or the source directory. However, while `depcomp' is Okuji> required in the source directory by automake, the `Makefile.in's Okuji> produced by automake requires that it is in the top directory. This Okuji> can also apply to `compile', because AM_PROG_CC_C_O sets CC to Okuji> "$(top_srcdir)/compile $CC" instead of "$(srcdir)/compile $CC". Okuji> So my question is which directory you wanted to use for them Okuji> actually. As far as I see, there is no reason that they should exist Okuji> in all source directories, so I think they should be put into the top Okuji> directory. Currently automake implements a strange rule here: * If AC_CONFIG_AUX_DIR is set, then always use it. (This part makes sense.) * Otherwise, search for the file in ., .., etc (just like autoconf does at runtime). If we find the file, then this path is saved and used to find other such files (in effect setting the config aux dir) * Otherwise, this means the file wasn't found. If we're installing it, install it locally. This rule works ok for some files, like mdate-sh. But for depcomp it is clearly not the best choice, since it will basically force everybody to use AC_CONFIG_AUX_DIR(.) -- which is ugly and unintuitive. So, I'm planning to change the rule to always choose top_srcdir as the default location (i.e., in the final step). Comments? Tom
Re: ILD too long
> "Thomas" == Thomas Tanner <[EMAIL PROTECTED]> writes: Thomas> I wonder why automake passes CFLAGS and AM_CFLAGS Thomas> to libtool in link mode? This makes it impossible Thomas> to differentiate between compiler and linker (LDFLAGS) Thomas> flags. The GNU Coding Standards mandate this. Tom
Re: derived sources and parallel builds
> "Alex" == Alex Hornby <[EMAIL PROTECTED]> writes: Alex> Do you have any hints before I dive in? Is the current CVS head Alex> a good starting point? CVS head is a good starting point. There is a HACKING file that explains some rules (which may or may not apply; I forget exactly what is in it :-) >> $(exe_OBJECTS): exe-hook I'm skeptical that this approach will work :-( Won't this run the hook on every "make"? Tom
Re: derived sources and parallel builds
>> I'm skeptical that this approach will work :-( Alexandre> I still think we should revert to the old behavior of Alexandre> Makefile depending on BUILT_SOURCES, at the very least, as Alexandre> a deprecated feature, for backward compatibility. This only works for GNU make. So it was never a reliable mechanism to begin with. Tom
Re: derived sources and parallel builds
Alexandre> So maybe we should use a stamp file that all $(OBJECTS) Alexandre> depend upon, and that runs $(MAKE) $(BUILT_SOURCES) before Alexandre> it touches itself. It still wouldn't work in case someone Alexandre> removes one of the BUILT_SOURCES but leaves the stamp file Alexandre> around, but it could be a starting point. There might not be anything better. Bleah. T
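A rough sketch of that stamp idea, as I understand it (untested, and it has exactly the staleness hole described above; recipe lines need a leading tab):

    # All objects wait for the stamp; the stamp forces the built
    # sources to be made first via a sub-make.
    stamp-built:
	$(MAKE) $(BUILT_SOURCES)
	touch stamp-built

    $(OBJECTS): stamp-built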
Re: sub-project link problem
> "Lars" == Lars J Aas <[EMAIL PROTECTED]> writes: Lars> The problem is that when it's time to link against the C++ Lars> library in the parent project, a C linker is chosen, which fails Lars> on a lot of platforms (IRIX is one of them). Lars> Does anyone know why $(LINK) is chosen instead of $(CXXLINK) in Lars> the subproject, even though it's a C++ project, using the C++ Lars> compiler while compiling the objects? Automake should always choose the C++ linker if a C++ source file is seen. If you only have C source files, then automake has no way of knowing that the library is a C++ library. I assume this isn't the case for you, in which case you've found a bug. Could you make a smaller test case? Tom
Re: Using EXTRA_SOURCES
Sam> The basic problem is that source is conditionally compiled into Sam> SDL, and the attached code is supposed to do the Sam> "right thing", but I can't get it to include rules for the Sam> objects. I can get the desired effect using automake Sam> conditionals, but the complexity of targets in SDL means that it Sam> takes literally 5 minutes to generate the makefiles. What you have definitely won't work. This would be a FAQ if we had a FAQ :-). (Hey, what do people think about using FAQ-O-Matic for Automake? I could set it up easily.) You cannot have a configure substitution in _SOURCES. I know this restriction seems wrong. It is unfortunate that 1.4 doesn't complain about this (1.5 will). Instead you have to play tricks with _LDADD if you want to go the configure substitution route (see the sketch below). How big are your Makefiles? How many? Is there a way to solve the performance problem? Conditionals are the easiest way to go from the maintenance perspective (in most cases, imho, etc). Tom
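The _LDADD trick usually has this shape (a sketch with made-up names; for a libtool library you would use _LIBADD and .lo objects instead). In configure.in you pick the optional objects and substitute them; in Makefile.am you list the optional sources under EXTRA_*_SOURCES so they are still distributed:

    # configure.in
    if test "$want_foo" = yes; then
      EXTRA_OBJS="$EXTRA_OBJS foo.o"
    fi
    AC_SUBST(EXTRA_OBJS)

    # Makefile.am
    myprog_SOURCES = main.c
    EXTRA_myprog_SOURCES = foo.c bar.c
    myprog_LDADD = @EXTRA_OBJS@
    myprog_DEPENDENCIES = @EXTRA_OBJS@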
Re: catching build problems
>> How do I override the rule for entering one of them? Alexandre> I don't think you can override it :-( You can rewrite the entire rule, but of course that will break in a future version. Tom
Re: Problem of unsubscription
> "Christophe" == Christophe Gros <[EMAIL PROTECTED]> writes: Christophe> I'm sorry to disturb you with some administrative matter, Christophe> but I've tried to unsubscribe from the automake mailing Christophe> list without success. It seems that there is a Christophe> discrepancy between sourceware.cygnus.com and Christophe> [EMAIL PROTECTED] I don't think there is a "automake@sourceware" list. If by discrepancy you mean that the unsub methods are different, then you are right. GNU uses something like majordomo -- send to "[EMAIL PROTECTED]". sourceware uses qmail, which is different. If that doesn't work for you, you will have to contact the FSF sysadmins. I don't run the GNU list. Tom
Re: To hack or not to hack
> "Michael" == Michael Bletzinger <[EMAIL PROTECTED]> writes: Michael> The first issue is: Where should the common configuration Michael> information needed to build the various subpackages be Michael> stored? It depends on your goals. Michael> I would like to set up a base package for the project which Michael> includes all of the necessary configuration info (headers, Michael> script functions, macros etc). All of the other packages Michael> would then include these base files to avoid redundant Michael> configuration code. Sounds like you want to use config.cache to speed up the configuration process. Some packages, like Gtk+, create a script which encodes the configuration information it discovered. Then users of the package can use this script's output to find the answers quickly. Michael> - automake forces me to configure libtool for every package. Michael> I cannot use a configured libtool installed in a base Michael> package. You might be able to make this work somehow, but I don't know it offhand. Michael> - autoheader automagically defines the PACKAGE and VERSION Michael> variable in every config.h file making it impossible for Michael> packages to cleanly include more than one config.h file. Use the (documented) 3rd arg to AM_INIT_AUTOMAKE to disable this. Michael> The second issue is: What to do about "flavors"? Globus Michael> delivers the same library in multiple binaries that are built Michael> with different configuration options. Examples of flavors Michael> include (--with-threads, and 32/64 bit integers). Currently Michael> Globus gives each flavor of a library the same name but Michael> stores it in a different directory. You can already do this. We do this with runtime libraries for the compiler, where it is called "multilibbing". You can make the install directory depend on configuration options in several different ways. Michael> 1. Do I need to sign anything to work on these tools? I Michael> already have contribute permission from NCSA. If you make significant changes you have to sign paperwork for the FSF. Michael> 2. Is there something else that I can read concerning the Michael> overall design of these tools? For automake, no. Michael> 3. Would it be possible for me to have someone to pester Michael> with questions so that I don't violate some design paradigm Michael> that would prevent my patches from becoming accepted? For automake, ask the list. I won't always have time to answer, but other people probably will. Tom
Re: Testing for a library with $CXX (instead of $CC)
> "Fred" == F Labrosse <[EMAIL PROTECTED]> writes: Fred> Is it possible to test for a library using $CXX instead of $CC? Fred> My problem is that on a Solaris box, some libraries won't link Fred> with cc while they do with CC (e.g. qt). Can't you set the language to C++ and then test? Seems like this ought to work. If not, it is probably an autoconf bug. Fred> Or is it better to set $CC to CC (instead of having $CC=cc and Fred> $CXX=CC). No, that's worse. Tom
Re: automake --add-missing --copy
> "Lars" == Lars J Aas <[EMAIL PROTECTED]> writes: Lars> I've always been annoyed that automake --add-missing --copy Lars> doesn't pass the "--copy"-option along to libtoolize, so I end Lars> up with symlinked config.guess, config.sub, ltconfig and Lars> ltmain.sh. Fixed. Lars> This might be fixed already, it's an "old" (month at least) Lars> automake 1.4a I've got installed. That's funny, given the recent pace of development. Tom
Re: Testing for a library with a function different from main() (Was: Testing for a library with $CXX (instead of $CC))
> "Morten" == Morten Eriksen <[EMAIL PROTECTED]> writes: Morten> BTW, I've already written a pretty extensive Qt-check macro. Feel free Morten> to check it out, its part of our SoQt distribution at Morten> http://www.sim.no/coin.html>, or just do: Why not submit it to the autoconf macro archive? T
Re: AC_PATH_XTRA
> "Fred" == F Labrosse <[EMAIL PROTECTED]> writes: Fred> On solaris, X11 needs some extra libs to be used. Apparently, Fred> AC_PATH_XTRA can find them and sets some variables: X_CFLAGS, Fred> X_LIBS, and X_EXTRA_LIBS. Fred> My question is what is the best way to use them? "best" depends on your goal. I usually add them to LDADD. Tom
Re: source files not in current directory
> "Alex" == Alex Hornby <[EMAIL PROTECTED]> writes: Alex> I'm using CVS automake and have source files in sub directories of my Alex> project like so: Alex> That would be fine, but AFAICT there is no way for make to apply the Alex> normal .c.o suffix rule in this case to produce the object file. How Alex> can I get this behaviour? I thought I fixed this to generate explicit rules in this situation. Perhaps I missed a case :-( T
Re: Exporting variables
> "Fred" == F Labrosse <[EMAIL PROTECTED]> writes: Fred> DEFINESDOC=`echo @DEFS@ | sed -e "y/-D/ /"` && \ Fred> export VERSION DEFINESDOC DOC_H_FILES && \ Fred> makeDoc && \ Fred> touch doc Fred> Only DEFINESDOC is available in makeDoc!!! i.e. the variables Fred> not defined in the rule are not exported (their value is correct Fred> just before calling makeDoc). You have to set the variables in your rule explicitly: VERSION="$(VERSION)" DOC_H_FILES="$(DOC_H_FILES)" T