Re: AC_C_BIGENDIAN

2000-09-06 Thread Guido Draheim

"Lars J. Aas" wrote:
> : Does anybody see a means to compute this at compile time?  Using the
> : same trick as we did for SIZEOF etc.
I did need a cross-compile AC_C_BIGENDIAN once upon a time, so there is an
AC_C_BIGENDIAN_CROSS submitted to the autoconf-archive (hopefully Peter
will update the page some day). The basic idea is to create a binary
from special C-source and "grep" the binary for some pattern. The C-source
has a static array of integers, so the integer-literals will be pushed out in
target byte-order. Does anyone have a better idea?

I think AC_C_BIGENDIAN *should* have a cross-compile check, so maybe
one should append such a section to the tail of AC_C_BIGENDIAN.
(the macro AC_C_BIGENDIAN_CROSS is just that).

have fun... and here's the code... 

AC_MSG_CHECKING(whether byte order LOOKS bigendian)
[
cat >conftest.c <<EOF
/* probe source: marker arrays in target byte-order, see the sketch below */
EOF
if test -f conftest.c ; then
 if ${CC-cc} conftest.c -o conftest.o && test -f conftest.o ; then
  if test `grep -l BIGenDianSyS conftest.o` ; then
   echo $ac_n "looks big-endian ... " 1>&AC_FD_MSG
   ac_cv_c_bigendian=yes
  fi
  if test `grep -l LiTTleEnDian conftest.o` ; then
   echo $ac_n 'looks little-endian ... ' 1>&AC_FD_MSG
   ac_cv_c_bigendian=no
  fi
 fi
fi
]
AC_MSG_RESULT($ac_cv_c_bigendian)
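
For illustration, the conftest.c can look about like this - just a rough
sketch of the technique, not the exact probe source of the macro; the array
names are made up, the values assume 16-bit shorts and an ASCII host (the
real macro also emits EBCDIC variants of the markers):

cat >conftest.c <<EOF
/* the bytes of these shorts spell the marker strings in target byte-order */
short ascii_big[]    = { 0x4249, 0x4765, 0x6E44, 0x6961, 0x6E53, 0x7953, 0 };
short ascii_little[] = { 0x694C, 0x5454, 0x656C, 0x6E45, 0x6944, 0x6E61, 0 };
int main () { return (int)(ascii_big[0] + ascii_little[0]); }
EOF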




Re: AC_C_BIGENDIAN

2000-09-06 Thread Guido Draheim

Pavel Roskin wrote:
> 
> Hello!
> 
> > ] if test -f conftest.c ; then
> >  if ${CC-cc} conftest.c -o conftest.o && test -f conftest.o ; then
> > if test `grep -l BIGenDianSyS conftest.o` ; then
> >echo $ac_n "looks big-endian ... " 1>&AC_FD_MSG
> >ac_cv_c_bigendian=yes
> > fi
> > if test `grep -l LiTTleEnDian conftest.o` ; then
> >echo $ac_n 'looks little-endian ... ' 1>&AC_FD_MSG
> >ac_cv_c_bigendian=no
> > fi
> 
> And what happens if neither is found? What's the default? It should be an
> error.
Correct. I do faintly remember that there had been a test-has-run-ok check,
but it seems to have been lost over the years. The check had just worked
correctly on every platform I had, oh well. Gets fixed...
> 
> What happens if short integers occupy 4 bytes?
Er, actually, I never heard of a platform where that is the case; anyway,
both tests would fail and the test should choke. Can anybody point out
a platform where sizeof(short) is 4?

>
> Also I would prefer names containing "ac". However careful you are in
> using unusual capitalization, it is better to show a good example of using
> only Autoconf namespace. My suggestion for the strings: "AC_LITTLEENDIAN"
> and "AC_BIGENDIAN___"
Hmmm.
> 
> By the way, can anybody think of an algorithm that avoids explicit support
> for ASCII and EBCDIC only?
If the target has neither ASCII nor EBCDIC, then both tests have a good chance
to fail anyway. As suggested above, the test should choke then. Where could
that happen... hmm... a unicode-only host-platform? Anybody seen that? It
seems I am locked in too much with the 20 platforms I have around me... ;-)

have fun...
-- Guido




Re: AC_C_BIGENDIAN

2000-09-06 Thread Guido Draheim


To summarize: the code assumes the host-platform to be 8-bit ebcdic
or 8-bit ascii, and the target-platform to have 16 bits per short. This is the
setup of everyone who runs cross-compilers these days AFAIK. The test should
explicitly check that exactly one answer has been found (neither zero nor two)
and otherwise fail just like the current AC_C_BIGENDIAN does. This case of one
amongst a million would force the autoconf'ed package installer to go without
an autoconf'ed configure, whereas that is the case we have now for
*everyone* trying to cross-compile packages containing AC_C_BIGENDIAN.

Now, would the code have a chance to go into mainline autoconf anyway,
or is it better left to some AC_C_BIGENDIAN_CROSS in the autoconf-archive...

Guido Draheim wrote:
> that happen... hmm... unicode-only host-platform? Anybody seen that? It
> seems I am locked to much with the 20 platforms I have around me... ;-)




adding platform-defaults [Re: AC_C_BIGENDIAN ... default parameter]

2000-09-08 Thread Guido Draheim

Bernard Dautrevaux wrote:
> 
> BTW, having such an automatic default option should be nice for almost all
> tests; I know I can just "ac_c_bigendian=yes ./configure" but I never know
> the name of the config.cache variables, so "./configure --set-bigendian=yes"
> could do the same.

   A pre-setting of the cache-variables has no chance to go into a
`make reconfigure` line. Same with pre-setting $CC or $CFLAGS, which one usually
does with cross-compiled autoconf. The usage of "--set-bigendian" would of
course be a nice thing, but you could not go with an overall setting for all
autoconf packages, since some packages may not have an AC_BIGENDIAN. It is
therefore needed to have some "--ac-<defname>=xxx" option that would silently
accept any <defname> even if not used anywhere in configure.

> 
> Anyway in both cases what would *really* be needed when a test like that
> fails would be to add a diagnostic line like "If you know the answer, just
> rerun configure with option '--set-bigendian=yes'" or "... by typing
> 'ac_c_bigendian=yes ./configure'".
> 
> This should not be too hard to do (disclaimer: I'm *not* an M4 programmer)
> and will make the life of cross-compiler-based installers a lot easier :-)
> 
Well, it would not be too hard, but it would not solve the problems of any
similar scheme. Perhaps have a generic AC_OPTION(defname [,explanation])
that would go with --ac-defname, but I dunno how to really design it.

As for cross-compiling, I'd ask if there is a pre-include option, e.g.
--include=<script>, where the script could hold the various
cache-variables to be preset, or even start to preset $CC. This option
would also go into the `make reconfigure`, so all you need is a (global)
script that has all the cache-variables at hand for the target-platforms
that are available in the lab (which is surely of limited number).
If there is no such option (there is none AFAIK), it should be added IMHO.

have fun...
-- Guido




Re: adding platform-defaults [AC_C_BIGENDIAN]

2000-09-08 Thread Guido Draheim

Bernard Dautrevaux wrote:
> 
> Sure enough; I constantly forget this, probably because I *never* run
> configure manually but through a script in the build tree (and BTW type
> "./configure" instead of "make reconfigure" as my scripts usually
> automatically call aclocal/autoconf/automake)
Well, it's not that I like to run the configure directly, the point is
the Makefile.in generated by `automake`. It has some, well, dependencies,
that will trigger `config.status` or `config.status --recheck` which will 
in turn run the original `configure XXX --no-create`. Now just replace
the XXX with the arguments originally given to `configure`. But if you had
some interesting shellvariables on the initial `configure`, well, they are 
lost in here, so that `configure` may fail.

> 
> > As for cross-compiling, I'd ask if there is a pre-include option, e.g.
> > --include=, where the script could hold the various
> > cache-variables to be preset, or even start to preset $CC. This option
> > would also go into the `make reconfigure`, so all you need is
> > a (global)  script that has all the cache-variables at hand for the
> > target-platforms
>
> In fact that's exactly what my wrapper script does: depending on the
> platform I'm compiling for, preset various variables (CC, CFLAGS,
> ac_cv_bigendian, etc.) before calling configure (note that I also use a
> similar script to call make too).
> 
> Of course such a site-wide host-dependent set of default settings would be
> nice and should simplify all my scripts.
> 
That's what I'm talking about. The current `configure` does two things
to preset shellvariables: it first looks for a "site-wide" shellscript
named by $CONFIG_SITE (defaults to $prefix/(share|etc)/config.site)
and loads that one, and right thereafter the config.cache is loaded.
Now guess what - there is no `configure`-option to set $CONFIG_SITE,
and furthermore, the default is not dependent on the --target option.
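
Just to illustrate the mechanism - a minimal sketch where the file name,
the preset values and the arm-linux triplet are only examples: a site file

# arm-linux.site - values that configure can not probe when cross-compiling
CC=arm-linux-gcc
ac_cv_c_bigendian=${ac_cv_c_bigendian='no'}

has to be exported by hand, since there is no option for it:

CONFIG_SITE=$HOME/config/arm-linux.site ./configure --host=arm-linux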

Well, I call this a flaw in the config.site handling, and somewhere 
around that piece of the configure-script, some enhancement would
be easy - and it would surely solve the problems of many cross-compiling
setups. Two propositions, to be discussable on the mailing-list:

a) add an option "--config-file=" that sets $CONFIG_SITE. This equals
   the option "--cache-file=" that sets $cache_file. Maybe some
   other variable $config_file would be needed, so $CONFIG_SITE is not
   assigned to. Should get the same advertising in the help-screen as
   --config-cache does.

b) enhance the default guess for the place of config.site, especially if
   the user gave an explicit "--target" option. The $target default is NONE,
   so we have a decent fallback case. Any suggestions where to look
   if $target != NONE ?

If only (b) is used, it may result in the same problem space we have now,
where the casual autoconf-user does not even know about "config.site". So
I strongly suggest to have at least (a) and to put it on the help-screen.
The choice of the filepath for (b) is tricky - a wise decision is
needed. Just add the --target argument as another subdirectory just before
config.site in the current default path? Look also for --build? Prepend it
as a subdirectory too? What about --host?

Opinions please...

-- Guido ICQ: 49289035 GSM:+49-175-9723679




Re: HTML format documentation (presetting variables)

2000-09-11 Thread Guido Draheim

Richard Stallman wrote:
> 
>   The most
> important example is when `configure' is re-run (which is not that
> uncommon; re-running `./config.status --recheck' when configure.in
> changes is becoming more and more common): any such variables set in
> the command line are lost in the re-execution, with unpredictable,
> usually bad, results.
> 
> But options should be recorded in config.status so that they will
> not be lost.
> 
Well, if you can record VAR=VAL settings in config.status, everything
is fine. And just to remind you, rms, the "open" syntax is needed for
us guys who do cross-compiling, since the "closed" syntax does not
allow presetting a few things - here it is at least recorded in `automake`s
Makefile - the one that will implicitly call `./config.status --recheck`.

As the discussion now leads a bit elsewhere, what about second-level
options, settable via VAR=VAL (or sth. alike)? There was the discussion
that for cross-compiling, people often have the need to preset
variables that are later stored in the config.cache (e.g. ac_cv_bigendian),
but nobody really knows their names until they read the configure source
code. Maybe it's worth supporting sth. like AC_OPTVAR(varname, helptext),
where the variable/helptext would not be shown unless called with an extra
option --help-vars (or sth. alike), and on the main helpscreen, just add
a line saying that VAR=VAL is supported and the exported varlist can be
seen with another help call.

By the way, the explicit AC_OPTVAR is not actually needed for the heap of
macros that do AC_CACHE_CHECK - they have the varname/msgtext relation
already in place, they just need a generic way to export it to the local
commandline user (or those who run `configure` from a GUI-wrapper).

have fun
-- Guido.




Re: HTML format documentation (presetting variables)

2000-09-11 Thread Guido Draheim

Akim Demaille wrote:
> See AC_ARG_VAR.
Taken. ... then again, AC_CACHE_CHECK does not use it - wouldn't that be okay?
Just again, what about second-level options that go on another helpscreen? And,
well, in that respect, I do support your position to the point that the
current helpscreen is already filled too much with options that most people
don't really need, whereas the documented varset is too small for other cases...
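
For reference, the pairing would look about like this - just a sketch, and
MYLIB_CONFIG is a made-up variable name:

AC_ARG_VAR([MYLIB_CONFIG], [path of the mylib-config script to use])
AC_PATH_PROG([MYLIB_CONFIG], [mylib-config])

so the variable shows up documented in --help and is preserved for
config.status --recheck, while the check itself honors a preset value.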

have fun
--guido




Re: Portability of fopen (foo, "wb") (Was: bug in AC_COMPUTE_INT)

2000-11-16 Thread Guido Draheim

Peter Eisentraut wrote:
> Akim Demaille writes:
> > Does anybody know whether using fopen (foo, "wb") is portable?
> Extremely doubtful.

The "b" is an ansi-C requirement, however there may be some systems
that are simply not compliant. AFAICS these are quite old, somewhere
in the eighties or so - therefore, it is hard to find factual
information as the world wide web is considerably younger. I just
found second hand information that some DEC ultrix libc did barf
at "b". Any halfway sane implementation of a libc will simply
ignore trailing "garbage", so it is safe to put the b as the last.

I would leave it... the cries will be loud enough - that is true for
quite some cases where there *is* an obvious error instead of
silent bad behaviour, esp. when we speak about decades before
autoconf was even invented...

Can anyone give an example of a non-updated, "b"-non-ANSI-C-compliant
system that is in actual use? I really really doubt that...

cheers
-- guidoEdel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ PS++PE 5++X-




Re: Portability of fopen (foo, "wb") (Was: bug in AC_COMPUTE_INT)

2000-11-16 Thread Guido Draheim


Guido Draheim wrote:
> The "b" is an ansi-C requirement, however there may be some systems
> that are simply not compliant. [...] I just
> found second hand information that some DEC ultrix libc did barf
> at "b".

I did a bit of web digging, and I found first-hand information in a
TeX package dating back to 14-Aug-87...
> For non-Unix operating  systems, it is  generally
> necessary to open  binary files differently  than
> text files,  since  the C-runtime  libraries  use
> that distinction to decide how to translate  Unix
> line terminators.  Every system  so far has  used
> the letter  "b" in  the  fopen() mode  string  to
> select this mode, and  every Unix system  ignores
> the "b", except Ultrix,  which raises a  run-time
> error, sigh...   I  have therefore  replaced  the
> mode string by RB_OPEN and WB_OPEN

With the ongoing success of Windows, however, I assume that no younger
system will ever have had this bug in its libc. And it *is* a bug.

-- guidoEdel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ PS++PE 5++X-




Re: macro database -- c++ macros

2001-01-05 Thread Guido Draheim


I am currently waiting for an announcement on the autoconf-list
with the information that the move is complete. The old place
however still responds and it is obviously up to date.

If anyone is interested: I got used to using a set of autoconf-extensions
for more than one project of mine, and I had a need to carry them around
to a number of places where I work, so I wrote an autotools-enabled
package that installs the autoconf-archive.tar.gz along with my
own macros by doing the usual gnu-strokes on the keyboard. Maybe
it is useful for some other people too - have a look at the current tarball
at http://download.sourceforge.net/pfe/aclocals-0.3.1.tar.gz and go
ahead and use it for your own ones. Note that this is not actually
meant for public distribution, that has to be left to Peter Simons.

have fun,
-- guidoEdel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ PS++PE 5++X-


Bernard Dautrevaux wrote:
> 
> > -Original Message-
> > From: Enrico Sirola [mailto:[EMAIL PROTECTED]]
> > Sent: Thursday, January 04, 2001 4:33 PM
> > To: [EMAIL PROTECTED]
> > Subject: macro database -- c++ macros
> >
> > hello,
> > I'm going to autoconfiscate a new opensource project
> > (QuantLib -- see http://quantlib.sourceforge.net if you are interested,
> > it's a library for quantitative analysis in finance). The library is
> > entirely written in c++, and i should write a set of macros to check
> > for non-standard compiler behavior. Does any1 know if there is a macro
> > repository/database with some ready-to-use macros for c++. If yes,
> > these would be of great help.
> 
> Hi,
> 
> There's quite a lot of those at
> http://research.cys.de/autoconf-archive/, although I think I remember this
> is in the process of moving to another location (you should search the mailing
> list archive for the current location though, as I don't remember the
> correct one, but today the address above still responds...)
> 
> > Thanks in advance,
> 
> You're welcome,




Re: HOWTO get content of config.h into a userdefined header (for distribution)

2001-01-07 Thread Guido Draheim

I develop libraries quite a lot, and the installed headers do often
have dependencies on other system headers that may or may not be
installed there. It would be a nuisance to write a series of
common.h.in headers if the library starts to get a little bigger.

Therefore I created the AC_PREFIX_CONFIG_H macro that will take
the generated config.h and spit out a file that has all the
defines, ifdef'd and with a prefix. Assuming your package is
called `boo`, a call to AC_PREFIX_CONFIG_H without args (of
course you can override the defaults) will produce a file
boo-config.h containing sth. like this:

#ifndef BOO_HAVE_UNISTD_H
#define BOO_HAVE_UNISTD_H
#endif

/* #undef BOO_HAVE_WINDOWS_H */

Just install this boo-config.h along with your other library
headers. Well, this one is still not finished, there are some 
problems involved that I hope someone can fix ;-)

you can find the macro in the autoconf-archive.tar.gz at
http://research.cys.de/autoconf-archive
and the latest (bugfixed?) version is inside aclocals-0.3.1.tar.gz at
http://download.sourceforge.net/pfe

have fun,
-- guidoEdel sei der Mensch, hilfreich und
gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ PS++PE
5++X-

Uwe Hermann wrote:
> 
> On Sun, Jan 07, 2001 at 11:55:36AM +0100, Oliver wrote:
> > Hi,
> >
> > I've made a little library-project with GNU-autoconf/make/libtool. My
> > library depends on the content of config.h which tests for some header
> > files. If some other project (which uses autoconf) would use my library,
> > the config.h files would collide. So I want the content of config.h in
> > my header common.h which I can reference in the headers of my lib. I
> > tried the following in config.in:
> >
> > AC_CHECK_HEADERS(signal.h)
> > AC_OUTPUT_COMMANDS(
> > if test x$ac_cv_header_signal_h = xyes; then
> >   echo '#include <signal.h>'  >> libdir/comon.h
> > fi)
> >
> > This should create a file common.h which contains something depending on
> > what configure has found. But this doesn't work.
> > Please, can you give me some help?
> 
> You could add
> 
> #ifdef HAVE_SIGNAL_H
> # include <signal.h>
> #endif
> 
> in your libdir/comon.h. The HAVE_SIGNAL_H macro is only set to 1 if
> AC_CHECK_HEADERS(signal.h) finds a signal.h.
> 
> HTH.
> 
> Uwe.
> --
> Uwe Hermann <[EMAIL PROTECTED]>
> http://www.hermann-uwe.de/
> ---
> :wq






AC_MKDIR/cross-cc/number-of-args??

2001-01-09 Thread Guido Draheim

Hi everyone,

in the last days I hit a problem that I thought would be good to
handle with an autoconf-macro - maybe someone has already done
such a beast. It is about the number of arguments to mkdir(2).

While every unix (posix?) system takes two arguments (the
filename and the modebits), quite a few platforms only want
one argument; in my case it was not only win'ish mingw32
but also vxworks. As a side note: cygwin uses two args,
mingw32 one arg. I guess quite a few embedded RTOSes also
define mkdir with just one arg.

The question is: how to reliably detect the number of
arguments without TRY_RUN, since it must be detectable
for crosscompiling targets. Can anyone give me a hint on
how to implement such an m4-macro? Or is it already on disk somewhere?

Thanks in advance
-- guidoGCS/E/S/P C++$ ULHS L++w- N++@  
 






Re: AC_MKDIR/cross-cc/number-of-args??

2001-01-10 Thread Guido Draheim


ADL's version looks good - at least for mkdir, as it is okay to
pass a single string to mkdir. I did overlook this one as
I had problems with -fwrite-strings and vxworks headers
that declare mkdir(char*), however ac-compile checks
wouldn't use that compile-option...

Otherwise, yes, there are a good deal of problems about
the permission-bits themselves, especially with some
win/dos or embedded/rtos targets. In my sources, these
are solved with a few nulldef defines like
#ifndef S_IRWXG \n #define S_IRWXG 0 \n #endif
which is just the thing that we do wherever O_BINARY
is not available (many unix'ish systems don't have it).

Many thanks to everyone, esp. ADL. I'll test it in the
near future, after which I'll send a notice to ADL, as
it may be a good idea to have this one in the
autoconf-archive too. Mo, you have the most platform
experience - wherever mkdir is (posix'ish) with two
args, it is okay to feed it existing S_-defines, right?
And defining non-existing ones to null will be okay if
done locally, right? (I never heard of a system
that would be offended by S_-bits it declares for
other places, but I'd not be amazed either.)

cheers
-- guido

Mo DeJong wrote:
> >
> > I have been using the following macro for a few weeks,
> > it might not be really perfect but at least it works for the
> > hosts I need (this include crosscompiling to mingw32).
> >
> > dnl AC_FUNC_MKDIR
> > dnl Check for mkdir.
> > dnl Can define HAVE_MKDIR, HAVE__MKDIR and MKDIR_TAKES_ONE_ARG.
> > dnl
> > dnl #if HAVE_MKDIR
> > dnl # if MKDIR_TAKES_ONE_ARG
> > dnl    /* Mingw32 */
> > dnl #  define mkdir(a,b) mkdir(a)
> > dnl # endif
> > dnl #else
> > dnl # if HAVE__MKDIR
> > dnl    /* plain Win32 */
> > dnl #  define mkdir(a,b) _mkdir(a)
> > dnl # else
> > dnl #  error "Don't know how to create a directory on this system."
> > dnl # endif
> > dnl #endif
> > dnl
> > dnl Written by Alexandre Duret-Lutz <[EMAIL PROTECTED]>.
> >
> > AC_DEFUN([AC_FUNC_MKDIR],
> > [AC_CHECK_FUNCS([mkdir _mkdir])
> > AC_CACHE_CHECK([whether mkdir takes one argument],
> > [ac_cv_mkdir_takes_one_arg],
> > [AC_TRY_COMPILE([
> > #include <sys/stat.h>
> > #ifdef HAVE_UNISTD_H
> > # include <unistd.h>
> > #endif
> > ],[mkdir (".");],
> > [ac_cv_mkdir_takes_one_arg=yes],[ac_cv_mkdir_takes_one_arg=no])])
> > if test x"$ac_cv_mkdir_takes_one_arg" = xyes; then
> >   AC_DEFINE([MKDIR_TAKES_ONE_ARG],1,
> > [Define if mkdir takes only one argument.])
> > fi
> > ])
> >
> > --
> > Alexandre Duret-Lutz
> 
> Along these same lines, here is the mkdir check from
> the configure.in in Jikes. Note that the mac version
> is just for kicks, I don't think it actually works.
> 
> dnl We need to do some nasty checks here to make sure that
> dnl we know what version of mkdir() to call.
> 
> dnl First, we just make sure mkdir() actually exists
> AC_CHECK_FUNCS(mkdir, , AC_MSG_ERROR([No mkdir() function found]))
> 
> AC_CACHE_CHECK(for mac style mkdir, jikes_cv_mac_mkdir,
> AC_TRY_LINK([
> #include <sys/types.h>
> #include <sys/stat.h>
> ], [mkdir("foo.dir", 0);
> ], jikes_cv_mac_mkdir=yes,
>jikes_cv_mac_mkdir=no)
> )
> 
> AC_CACHE_CHECK(for glibc style mkdir, jikes_cv_glibc_mkdir,
> AC_TRY_LINK([
> #include <sys/types.h>
> #include <sys/stat.h>
> ], [mkdir("foo.dir", S_IRWXU | S_IRWXG | S_IRWXO);
> ], jikes_cv_glibc_mkdir=yes,
>jikes_cv_glibc_mkdir=no)
> )
> 
> AC_CACHE_CHECK(for libc5 style mkdir, jikes_cv_libc5_mkdir,
> AC_TRY_LINK([
> #include <sys/types.h>
> #include <sys/stat.h>
> ], [mkdir("foo.dir", S_IRWXU);
> ], jikes_cv_libc5_mkdir=yes,
>jikes_cv_libc5_mkdir=no)
> )
> 
> AC_CACHE_CHECK(for win32 style mkdir, jikes_cv_win32_mkdir,
> AC_TRY_LINK([
> #include <direct.h>
> ], [mkdir("foo.dir");
> ], jikes_cv_win32_mkdir=yes,
>jikes_cv_win32_mkdir=no)
> )
> 
> if test "$jikes_cv_glibc_mkdir" = "yes" ; then
> AC_DEFINE(HAVE_GLIBC_MKDIR, ,
> [use unix style mkdir(str, S_IRWXU | S_IRWXG | S_IRWXO)])
> elif test "$jikes_cv_libc5_mkdir" = "yes" ; then
> AC_DEFINE(HAVE_LIBC5_MKDIR, ,
> [use unix style mkdir(str, S_IRWXU)])
> elif test "$jikes_cv_win32_mkdir" = "yes" ; then
> AC_DEFINE(HAVE_WIN32_MKDIR, ,
> [use win32 style mkdir(str) from ])
> elif test "$jikes_cv_mac_mkdir" = "yes" ; then
> AC_DEFINE(HAVE_MAC_MKDIR, ,
> [use mac style mkdir(str,0) from ])
> else
> AC_MSG_ERROR([Could not locate a working mkdir() implementation])
> fi
> 
> cheers
> Mo DeJong
> Red Hat Inc





configure GUI somewhere?

2001-01-12 Thread Guido Draheim


...just a short question - is there possibly a configure GUI
somewhere? Something that parses "configure --help", shows a
nice GUI to change the options, and offers the ability to run
configure $options, make and make install. No, I don't mean
something that can do anything about configure-options - just
a good subset of what one can see in "configure --help" is
all one needs in the standard case. It would greatly simplify
paving the way for inexperienced users to install from source-tarballs.

just an idea, SCNR to ask, maybe s.o. has a hint...
-- guidoEdel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ PS++PE 5++X-





gnu.org/autoconf webpage update!? [Re: Autoconf 2.49c: Release candidate]

2001-01-24 Thread Guido Draheim


Akim Demaille wrote:
> [...] Autoconf 2.49c, our release candidate.  The core Autoconf
> is not expected to change before the release, while the documentation
> and minor details still need some work.
> [...]
> Autoconf can be downloaded from
> 
> ftp://alpha.gnu.org/gnu/autoconf/autoconf-2.49c.tar.gz
>
> Happy configuring!
>
> Akim, Alexandre, Jim, Pavel, Paul, and Tom.


The gnu.org webpage at http://www.gnu.org/software/autoconf
does not give a clue that there is some development work going on.
No pointer to the relevant webpage at http://sources.redhat.com/autoconf,
therefore no mailinglists, no news, or any pointer to release candidates.
I did ask RMS a while ago if the gnu.org webpage could be updated,
and I faintly remember he forwarded the question to this list.
I guess it would be a *very* good time now to add at least an href
on the gnu.org webpage, or even provide some new documentation there.

Who is to do it?

cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ PS++PE 5++X-





prefix:subpaths macro anyone? i.e. moveable packages

2001-02-27 Thread Guido Draheim


Hi everyone,

has someone at some point written an ac-macro to detect the
subpaths of a prefixed installation? The idea is to have
a moveable package where the compilation prefix path is just
a default that can be changed at runtime.

hmm, consider a package named 'foo' that has a bin/foo, 
a lib/libfoo, some shared-files in share/foo/, some docs a.s.o.
The compileprefix may be /usr/local, but the user can move
the package to somewhere else, thereby setting an environment
variable (or a registry key) called FOODIR. Well, currently
I can just make an assumption about the subpaths being ./bin,
./lib, ./share/foo a.s.o, however these could have been 
changed to sth. different during configuration, e.g. let
them point all into a subpath ./foo with an overall prefix
of /programs.

If I look at the current scheme of autoconf/automake, this one
is definitely nothing that is supported as a default idea -
the bindir, libdir, pkglibdir macros all use the complete path.
However, one can still invent a series of other subpath-ac_substs
that are detected (!!) from the traditional long paths. It's just
a kind of lengthy pattern recognition, and maybe someone has
already done it.
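
Roughly along these lines - just a sketch, only the bindir case is shown
and the subpath_bindir substitution name is made up:

case "$bindir" in
  '${exec_prefix}'/* | '${prefix}'/*)   # the usual unexpanded defaults
     subpath_bindir=`echo "$bindir" \
       | sed -e 's,^\${exec_prefix}/,,' -e 's,^\${prefix}/,,'` ;;
  *) subpath_bindir="$bindir" ;;        # not below the prefix, keep as-is
esac
AC_SUBST(subpath_bindir)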

thanks in advance,
-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-





Re: How do I build debug versions?

2001-03-07 Thread Guido Draheim

Alexandre Duret-Lutz wrote:
> 
> >>> "Dean" == Dean Hoover <[EMAIL PROTECTED]> writes:
> 
> [...]
> 
>  Dean> Another thing you could do is make multiple build directories
>  Dean> and always make in a certain way in them. In other words, you
>  Dean> could:
> 
>  Dean> mkdir debug-build opt-build
>  Dean> cd debug-build; $top_srcdir/configure; make debug; cd ..
>  Dean> cd opt-build; $top_srcdir/configure; make; cd ..
> 
> PFE (http://pfe.sourceforge.net/) seems to setup things in a
> way which is close to Shameek's request (it uses Debug/$arch/ and
> Release/$arch/ build directories).  Maybe you can draw some
> ideas from that package.
> 

Well, there are a lot more weird things in there, beyond the needs of most
people and buried under some weird assumptions; f.e. here's a little snippet
from the configure.in, it's soo convenient ;-)

case `pwd` in  # implicitly add -DNDEBUG if in path with Release
  */Release/*) OPTIM="-DNDEBUG $OPTIM" ;;
  */Debug/*)   DEBUG="-DDEBUG  $DEBUG" ;;
esac
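
So a build tree set up as described in the quote above picks up the define
automatically, roughly like this (the sparc subdir name is just an example):

mkdir -p Debug/sparc && cd Debug/sparc
../../configure && make    # `pwd` contains /Debug/ so -DDEBUG gets added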

anyway, there are indeed multiple versions in different subdirectories
of the very same sources, so it may possibly give a few pointers. It
did take some time to figure out how to do it correctly, well, now it 
seems to be rock stable :-))

cheers,
-- guidohttp://PFE.sourceforge.net
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-





Re: config headers and #undefs

2001-03-07 Thread Guido Draheim

Kevin Ryde wrote:
> 
> #define __need_size_t
> #include <stddef.h>
> #undef __need_size_t /*NOTATEMPLATE*/

#ifdef __undef_need_size_t
#undef __need_size_t
#endif

BTW, if you want to install a config.h file, you may want to use this:
   http://pfe.sourceforge.net/autoconf/ac_prefix_config_h.html
and invent a proj-conf.h that includes the derived proj-config.h.

cheers,
-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-





ANN: aclocal-archive 0.4.3

2001-03-29 Thread Guido Draheim


Hi everyone, 

I have been updating my aclocal backpacker package, renaming it
to "aclocal-archive". Now it can create some rpm files for binary
distribution of aclocal macros for your convenience too. I would
like to have some comments on the usefulness of the aclocal-archive
idea, and I would like to know if some people want to share it,
in which case I could possibly go and open up a CVS area for it. Note
that an autoconf-archive.tar.gz could be created from such a cvs area
too, not just the native aclocal-archive-xxx.tar.gz tarball.

TIA, rgds, guido
 ___
If you did not know about this package before, here's the README:


%description
 autoconf extensions that go into the /usr/share/aclocal/ directory.
 This dir is searched for macros by the `aclocal` program.
 Also installs the macros found in the autoconf-archive.tar.gz
 from http://cryp.to/autoconf-archive

%usage
 create a new subdirectory for your own macros that you use in
 your projects and place your m4 files in there, then add this
 directory to the automake-variable in Makefile.am. You get
 a few makefile rules:

 configure - detect actual share/aclocal dir, and the m4 files available
 make dist - create a new snapshot of all m4-files available, as a tarball
 make install - install all m4-files in share/aclocal
 make doc - create html-files in the format of cryp.to/autoconf-archive
 make install-doc - install html-files along with an index.html
 make pack - create autoconf-archive.tar.gz from uppercased subdirs
 make unpack - implant a cryp.to/autoconf-archive snapshot
 make rpm - copy tarball to packages/SOURCES and build rpms
 make pack-doc - create html just like the cryp.to/autoconf-archive
 make install-pack-doc - install only the official macro site docs

%detail
 add your m4-files subdir spec to ACLOCALS_ARCHIVE definition at the
 top of the Makefile.am. Other files should be added to EXTRA_DIST.

 The configure script will make a `find . -name \*.m4` so that all
 m4 files are detected and installed, the automake-variable is only
 used for `make dist`, so you can create a subdir with files that
 shall not be distributed in your tarball.

 The primary goal is to help developers to separate their m4 files
 out from their projects, and make them easily reusable not only
 for a number of projects - a `make dist` snapshot can be easily
 transferred across network boundaries, and a `make install` will
 update the m4-files available to `aclocal`.

 As a hint, you can compile autoconf and install it somewhere
 in your local non-root home-directory paths, the configure
 script of this package will detect the aclocal-dir from the
 `aclocal` program available, and install the m4-files in your
 home's share/aclocal extensions directory.

 last but not least, to use it properly, update the version number
 from time to time in the aclocal-archive.spec - the configure
 script will derive the PACKAGE/VERSION defs from there.

%download
 This is a preview; it is only announced to the autoconf mailinglist
 for discussion. The files are currently hosted side by side with
 the http://pfe.sourceforge.net project release-files, which is
 of course not the right place. To download, have a look at
http://sourceforge.net/project/showfiles.php?group_id=1922
 for files called "aclocal-archive-*"


 http://216.136.171.200/pfe/aclocal-archive-0.4.3.tar.gz





Re: ANN: aclocal-archive

2001-04-02 Thread Guido Draheim

Guido Draheim wrote:
> [..] I would
> like to have some comments on the usefulness of the aclocal-archive,
> idea and I would like to know if some people want to share it, [...]
> [..]   http://216.136.171.200/pfe/aclocal-archive-0.4.3.tar.gz

180 downloads, no comments - in the opensource world, that should
read as "everything's fine, but I did not look for it anytime before".
'sounds good, doesn't it ;-)

cheers, guido




Re: Perl vs Scheme vs ML vs ... for autoconf

2001-04-10 Thread Guido Draheim


a) automake was not ported from perl to guile for years,
   and I don't know of experiments to actually do it now,
   or have it done in this decade (it wasn't in the last).
   Moving it from perl to guile does not earn much for 
   features or maintainability - it would just be another
   point to spread guile, which I suspect to be one of
   RMS's intents in proposing it ;-)
b) perl is nice for its builtin regex, string-ops and
   system-support, supported with syntactic sugar. It has
   an easy learning curve which gave it a good audience.
   Using perl for autoconf.* feels natural - you want to
   search, extract and write things, and the best thing 
   to use is a Practical Extraction and Report Language.
   The syntactic sugar however does sometimes confuse
   people which is the downside of TMTOWTDI.
c) a + b plus perl being ubiquitous - what else, hmmm..
d) if you ask for a language that has been forgotten,
   I'd point to php, which is a natural for string and
   database work too, and with the advent of the xml era it shows
   enormous growth, just like python in its first years.
   However, I feel it is still "evolving" in the sense that
   it could shapeshift, where perl is pretty much done now.
e) autoconf.* maintainer scripts look short; I have to admit
   that I don't get the idea of changing things. And if so,
   use the scripting language that people are fluent with. You have
   automake.pl aclocal.pl autoupdate.pl, and automake.pl is
   bigger than the sum (!!) of the other auto*-tools. That's
   an argument to let maintenance converge on both sets of
   the autoconf/automake theater.

just my 2cent,
-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




Re: Autoconf Logo?

2001-04-20 Thread Guido Draheim

Eric Siegerman wrote:
> 
> On Fri, Apr 20, 2001 at 02:52:08PM +0200, Lars J. Aas wrote:
> > Does such a thing exist?  Should it be a goat?
> > I'm thinking of something to stick on the web pages...
> 
> Hercules slaying the hydra :-)

or just the hydra :-)  well, there's just that impression
of wheels, gearbox,... mill hmmm...

-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




Re: Autoconf and cross-compiling

2001-05-17 Thread Guido Draheim

"Gary V. Vaughan" wrote:
> 
> On Wednesday 16 May 2001  9:27 pm, David Burg wrote:
> > Hello,
> >
> > I need to change an existing configure.in file to enable cross-compiling
> > (host is x86 and target is strongarm). I have read the on-line documentation
> > of autoconf on www.gnu.org, but explanations on cross-compiling
> > configuration are not many and I fail to adapt the configure.in.
> >
> > I already have a c++ cross-compiler named arm-linux-gcc and glibc for
> > cross-compiling.
> >
> > Is there some easily understandable cross-compiling-with-autoconf how-to
> > available somewhere ?
> 
> Chapter 25 of the Goat book, http://wources.redhat.com/autobook.
> 

small typo, Gary, it's http://sources.redhat.com/autobook
and chapter 26 - anyway, it was easy to guess ;-)
However, looking at it, for being a how-to it is quite short.
Perhaps we need a kind of guide to enabling cross-compiling,
like a list of which autoconf-tests are good for cross-compiling
and which are not, or just so in some places. E.g. try_link will
sometimes work as a check-method, but some embedded-targets
don't have convenience-libs on the build-system to enable the
checks - they'll always fail. [ahm btw, changing the PATH(s)
before starting configure/make is the easiest way to
cross-compile most projects ;-) ] I have to admit that most
of my experience comes from experimentation, trial-and-error,
and not reading. I wasn't able to find a site dedicated to the
topic either :-( ... oh well...

cheers,
-- guido   http://savannah.gnu.org/projects/ac-archive
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- y++ 5++X-




$lib-config (Re: how to use libraries in /usr/local)

2001-05-21 Thread Guido Draheim


Bruno Haible wrote:
> [...]
> Another totally different approach is to recommend that every
> library libfoo comes with a script 'foo-config' in /usr/local/bin that
> can spit out the required -I and -L options. Here as well, autoconf
> support would be nice, so that the resulting -I/-L options would be
> substituted into the Makefile.
> 

I strongly feel that this is already established as good practice
for libraries. A generic version is mostly sufficient, e.g. one created by
http://cryp.to/autoconf-archive/Miscellaneous/ac_create_generic_config.html
which is quite imperfect but nonetheless extraordinarily useful for me.
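
To give an idea of what such a generated script answers - a hand-written
minimal sketch, with the option names following the usual gtk-config
conventions and the paths being whatever got configured in:

#!/bin/sh
prefix=/usr/local
case "$1" in
  --prefix)  echo "$prefix" ;;
  --cflags)  echo "-I$prefix/include" ;;
  --libs)    echo "-L$prefix/lib -lfoo" ;;
  --version) echo "1.0.0" ;;
  *) echo "usage: foo-config --prefix|--cflags|--libs|--version" >&2 ; exit 1 ;;
esac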

(*hmmm* lately, debian maintainers notified me that one even has
  to write a man-page for a $lib-config script - everything that goes into
  $/bin must have one, according to the rules, even a $lib-config script..)

cheers
-- guidohttp://savannah.gnu.org/projects/ac-archive
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




Re: configure scripts and cross-compiling

2001-05-23 Thread Guido Draheim


You are doing fine, just some configure-checks are not ready
for cross-compiling, and that's what you are warned about.
Basically, the package that you are trying to cross-compile
has not been made ready for cross-compiling.
Two approaches:
a) learn about config.site or the cache-file, and let it contain
   cache-values - the check will then say "(cached) yes",
   so it won't try to autodetect anything (see the little
   example below).
b) the bigendian cross-compile check is a common problem; maybe
   the next generation autoconf will have the solution that
   is currently present in the autoconf-archive. Edit the
   configure.in, replace AC_C_BIGENDIAN with a reference
   to the following macro, reconf, and build. This is
   generic for all target platforms.
http://cryp.to/autoconf-archive/Cross_Compilation/ac_c_bigendian_cross.html
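
The example for (a) - preload the value that is right for the target (sparc
would be "yes") into the cache that configure reads:

echo 'ac_cv_c_bigendian=${ac_cv_c_bigendian=yes}' >> config.cache
./configure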

cheers, guido   http://www.gnu.org/software/ac-archive



Mike Culbertson wrote:
> 
> Could someone specify what it is a generic configure script would have to
> find in order to think that the compiler I am using (gcc 2.95-3 on
> sun/sparc solaris 8) is a cross-compiler?  I have a fairly standard
> install of everything, and I am using no proprietary tools with the
> exception of a few in /usr/ccs/bin (ar, as, etc).  At this point, nothing
> wants to build properly and I am really not sure what I have changed that
> has caused this (I have never actually been set up as a cross-compiler).
> Thanks in advance.  Any other info provided on request.
> 
> it dies here:
> bash-2.03# ./configure
> loading cache ./config.cache
> checking for gcc... gcc
> checking whether the C compiler (gcc -02 -msupersparc ) works... yes
> checking whether the C compiler (gcc -02 -msupersparc ) is a
> cross-compiler... yes
> checking whether we are using GNU C... yes
> checking whether gcc accepts -g... yes
> checking host system type... sparc-sun-solaris2.8
> checking whether byte ordering is bigendian... configure: error: can not
> run test program while cross compiling
> bash-2.03#
> 
> Mike Culbertson

-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




Re: Auto-tools & Win32 & Borland C++ Builder

2001-05-23 Thread Guido Draheim


This is not restricted to borland compilers; there are quite
some C-compilers on unix-systems that quite some people would like
to see supported, however most of the autotools developers
live in a quite gnu'ish/gcc'ish environment, and every now
and then a gmake'ish/gcc peculiarity slips in that will break
the next time another tool'ed environment is seen. I would
recommend to use at least gmake and (ba)sh, which are both
available for win/dos, and having the complete gnu fileutils
is not a bad idea either. On this basis, look for any problems
related to assumptions about the gcc, and e-mail the resp.
people that are guilty ;-) Possibly, install a gcc-distribution
like cygwin32 or mingw32 (http://libsdl.org/Xmingw32), start
a normal gcc/automake setup, and run "sh configure CC=bcc32".
I guess it will extremely stress your technical skills to
find the bits that assume CC=gcc and a unix-filesystem, and
furthermore I guess that such work will be very welcome as
it will help support non-gcc compilers on other unix-platforms
as well, minus the tricks about filepaths. Make sure to use
the newest autotools; the last months have seen quite some
improvements in supporting cygwin/mingw platforms which will
make things quite a bit easier for you.

sorry for the extreme addressing. However, maybe there
are some hints on whether people have already tried with
borland compilers. I did lately build an nmake-makefile
for the free(!!) borland 5.5 compilers (which means commandline
only). However, for a pure C project, the compiler looks
a bit inferior wrt. gcc, so I'd switch to gcc anyway;
I don't know about C++... all that I've heard so far is that
people stopped at some point trying to get along with the
non-gcc compiler, since the gcc compiler suite is way good
enough for anything that is needed, which is the actual
reason why bcc-support / msvc-support is not answered in
an FAQ. Starting to use gcc on win/dos, well, again, this
is more a pedagogical endeavour.

Another scheme is of course the usage of the C++Builder
as a front-end, and using its project-files to generate
a makefile(.am/.in) that can make it build in environments
that don't have a borland compiler. Again, you'd have to
change away anything that is non-portable across compiler
sets, so one could start to use gcc's c++ anyway, which
again does not need bcc support in the original setup. So
I guess, approaching autotools enthusiasts, it may come
down to the question why you're using the borland compiler-chain
anyway, as portability is best achieved with the gcc itself.
Again, partly a pedagogical endeavour (if not flames) that
should be limited to one mailing-list. Possibly libtool.

oops, got a bit long and winded, cheers, guido

Axel Thimm wrote:
> 
> sorry for the excessive addressing, but this topic touches all auto-tools.
> 
> I am in the process of convincing some people to move their Borland based
> source code development to proprietary free models. As you may guess, this is
> an extremely difficult task, requiring more pedagogical than technical skills
> (and unfortunately I myself am extremely Unix-centric, and I still have to learn
> about Borland's peculiarities).
> 
> Nevertheless I want to give it a try. As a first step I'd have to move the
> configure/build/install infrastructure to auto-tools, then I'd attack the
> compiler non-portability (and by the end of this decade I might get a GNU
> compliant system ...).
> 
> Searching the lists/net nothing helpful came up, but at least there also
> wasn't any evidence of any NoGo-theorems.
> 
> Does anyone have already some experience in working with auto-tools and
> Borland from the command line? How do the maintainers/release managers here
> think about it? Would they be willing to accept patches for supporting
> commercial compilers ;)
> 
> Regards, Axel.
> --
> [EMAIL PROTECTED]

-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




Re: ac_define_dir (Re: question about sysconfdir)

2001-05-31 Thread Guido Draheim

"Matthew R. MacIntyre" wrote:
> 
> I want to have a configuration file in the sysconfdir for my program,
> but if I put AC_DEFINE(CONFFILE, "$sysconfdir/file.conf"), I end up
> with the path being "${prefix}/etc/file.conf" which is pretty
> meaningless to C.  I've seen people hack it by just saying
> "$prefix/etc/file.conf" in the configure.in, but this doesn't seem to
> be the right thing to do. Should I do that, or is there a way to use
> the sysconfdir?
> 
> Thanks for any help,
> 

kinda FAQ, use shell-interpolation, and AC_DEFINE the result, please see
http://www.gnu.org/software/ac-archive/Miscellaneous/ac_define_dir.html
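
The idea boils down to something like this - only a rough sketch of the
shell-interpolation, not the archive macro itself; the double eval unrolls
nested ${prefix} references:

test "x$prefix" = xNONE && prefix="$ac_default_prefix"
test "x$exec_prefix" = xNONE && exec_prefix='${prefix}'
conffile=`eval echo "$sysconfdir/file.conf"`
conffile=`eval echo "$conffile"`
AC_DEFINE_UNQUOTED(CONFFILE, "$conffile")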

cheers, Guido




Re: doc dirs?

2001-06-19 Thread Guido Draheim


Rüdiger Kuhlmann wrote:
>  AC_SUBST([infodir],['${prefix}/info'])dnl
> +AC_SUBST([htmldir],['${prefix}/share/doc/html'])dnl
> +AC_SUBST([psdir],  ['${prefix}/share/doc/ps'])dnl
> +AC_SUBST([dvidir], ['${prefix}/share/doc/dvi'])dnl
> +AC_SUBST([guidedir],   ['${prefix}/share/doc/guide'])dnl
>  AC_SUBST([mandir], ['${prefix}/man'])dnl

From my experience, these docs usually go just into $pkgdocdir, as
unlike info/man there is no implicit directory-search/dirindex-build
by the systemtools. Still, $docdir is missing and desperately needed -
some packagers/distmakers like to have the genericdocs/packagedocs
live in $prefix/doc (fhs1?), other systems (fhs2?) have them
in $datadir/doc, or even in a Desktop/Documentation subdir.
A $docdir should be settable on its own, yes it should...




Re: doc dirs?

2001-06-19 Thread Guido Draheim

Russ Allbery wrote:
> 
> Rüdiger Kuhlmann <[EMAIL PROTECTED]> writes:
> 
> > I'd like to re-kindle the discussion about options for where to put a
> > programs documentation by suggesting the following patch:
> 
> Wouldn't some sort of general facility for adding a new *dir switch for a
> given package be better than adding a bunch of new options that very few
> packages will use (and making --help even longer)?  If the worry is that
> programs will use inconsistent switch names, we can always put a list of
> recommendations into the manual.

just a few thoughts:
a) the helpscreen lists things like sysconfdir or infodir or sharedstatedir,
   but many packages don't use these. They are just confusing, nothing else.
   Possibly just there so they can propagate to sub-configures...
b) I see packages putting dirpath-settables in as --with-htmldir options, or
   possibly with a different name, e.g. --with-html-dir. It is inconsistent
   and it should be stopped. We want dist-makers to be able to call all
   the ac' configures with a predefined option-set and have unused options ignored.
c) propagate with-options to sub-configures, silently doing it even for
   unknown options, same with the current set of dirpaths - propagate them
   even if the toplevel doesn't use them. Nice thing, isn't it? Align the
   functionality of both. Kinda an AC_ARG_DIR to go along with AC_ARG_WITH (and
   AC_ARG_VAR)?  Looks easy, but probably isn't...
d) combine a/c with an underlying fhs/gcs scheme for known paths,
   AC_USE_DIR(sysconf sharedstate); don't add it to the local subst/help
   by default, but use the known path from the standards if a USE_DIR
   is seen. b/c means to accept every --*dir silently for sub-configures.
*) looks like far future anyway... but parts of it could be done next -
   especially extending the --*dir detection through a dedicated macro call!
   Still need to silently accept/propagate unknown --*dir options as a
   known behaviour, needs to be agreed on...




Re: Default values for infodir and mandir [WAS: Re: [autoconf] doc dirs?]

2001-06-20 Thread Guido Draheim

Earnie Boyd wrote:
> 
> Rüdiger Kuhlmann wrote:
> >
> > Hi!
> >
> -8<-
> >  AC_SUBST([infodir],['${prefix}/info'])dnl
> > +AC_SUBST([docdir], ['${datadir}/doc'])dnl
> >  AC_SUBST([mandir], ['${prefix}/man'])dnl
> >
> 
> In my simplistic mind having three places for documentation isn't
> logical.  I can understand leaving switches for infodir and mandir in
> place for backward compatibility but shouldn't there values default to
> ['$(docdir)/info'] and ['$(docdir)/man'] instead.  IIRC this is what the
> FHS recommends.
> 

IIRC, the FHS recommends to move mandir and infodir into $datadir/(man|info)
instead of the current ac-default of $prefix/(man|info). The same goes
for the ubiquitous /usr/doc dir - the $docdir shall change from $prefix/doc
to $datadir/doc, i.e. a default of /usr/share/doc.
Newer linux distros do already follow the new fhs, and it is partly easy
for distromakers as they can `configure` all ac-software invariably
with $mandir and $infodir presettings. However this is not true for
other documents, most prominently htmldocs and docbooks.
The simplistic approach of another oh-so-global --*dir option is
probably fine for the moment.

While we are at it, are there plans to change the default for infodir
and mandir from $prefix to $datadir ?




Re: Default values for infodir and mandir [WAS: Re: [autoconf] doc dirs?]

2001-06-20 Thread Guido Draheim

Ralf Corsepius wrote:
> 
> Guido Draheim wrote:
> >
> > Earnie Boyd wrote:
> > >
> > > Rüdiger Kuhlmann wrote:
> > > >
> > > > Hi!
> > > >
> > > -8<-
> > > >  AC_SUBST([infodir],['${prefix}/info'])dnl
> > > > +AC_SUBST([docdir], ['${datadir}/doc'])dnl
> > > >  AC_SUBST([mandir], ['${prefix}/man'])dnl
> > > >
> > >
> > > In my simplistic mind having three places for documentation isn't
> > > logical.  I can understand leaving switches for infodir and mandir in
> > > place for backward compatibility but shouldn't there values default to
> > > ['$(docdir)/info'] and ['$(docdir)/man'] instead.  IIRC this is what the
> > > FHS recommends.
> > >
> >
> > IIRC, the FHS recommends to move mandir and infodir into $datadir/(man|info)
> > instead of the current ac-default for $prefix/(man|info).
> 
> Nope. It does not even mention the words datadir or docdir.
> grep datadir fhs.txt
> grep docdir fhs.txt
> 
> It mandates:
> /usr/share/[man|info] for OS-vendor supplied packages
> /usr/local/[man|info] for local packages
> and reserves
> /opt/[man|info] for "add-on" packages.

Indeed! So in fact, it should still be $prefix/man etcetera,
which leads me to the question of where the /doc should go
- is it $prefix/doc or $datadir/doc ? The fhs does not tell of
a /doc directory for /usr/local software, but it does mention
/opt/doc, and it does mention /usr/share/doc along with a
footnote saying it had been /usr/doc originally.

Again, the docdir looks like a fairly standard directory,
so it should be added to the defaultset of configure options.
Even more, "OS-vendor supplied packages" under /usr shall
*not* use $prefix but a thing that we come to call $datadir,
for each of the /man /info /doc subdirectories. It is best
for "OS-vendor"s to be able to configure gnu'ish software
with the same set of options for these directories.

What damn horrifies me is the inconsistency with respect
to relocatable packages. Package-wise installations are
referenced in the /opt section of the fhs, and it
says /opt/<package>/man. Possibly it can be derived as
/opt/<package>/doc too, so that the $docdir default should
be $prefix/doc. The reference to $datadir/doc slipped in
for me, as I had been asked a while ago to place the
htmldocs and pdffiles in the /share directory instead of
directly under a prefix/doc path. Perhaps I'll add some
magic that can see a "/usr" prefix and shift them down
into /usr/share - or should it be the other way round, to
move up the /share data - who knows ;-)


-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




$docprefix [Re: Default values for infodir and mandir [WAS: Re: [autoconf] doc dirs?]]

2001-06-22 Thread Guido Draheim


Alexandre Oliva wrote:
> 
> > which leads me to the question where the /doc should go under,
> 
> Perhaps instead of --docdir we should have --doc-prefix, that defaults
> to --prefix?  Then man, info, html, etc would all be in /usr/local by
> default, but this could be easily overridden using --doc-prefix.
> 
> Another idea is to have mandir and infodir default to
> prefix/{man,info} if --docdir is not specified (in which case it would
> default to ${datadir}/doc, and to $doc_prefix/{man,info} otherwise
> (not overriding --mandir or --infodir, of course).  Yes, I agree this
> is messier to implement and describe, but it seems to make installers'
> and distributors' lives simpler.
> 

It took me a while to get the idea, Alexandre, let me summarize it
as far as I understand the bits:

fhs suggests that (man|info) is placed directly and in parallel
to the (bin|lib) directory in the case of /opt and /usr/local,
while at the same time these doc-directories are placed inside
of /usr/share instead of /usr just for the software that is going
into /usr/bin. In other places it mentions that data-directories
(like termcap) should be moved into the /usr/share subtree.
What I get is that all docs for software in the two os-vendor
prefixes / and /usr should be assembled into /usr/share, while
at the same time other $prefix-based installations keep them
directly attached. On another account we see text looking into
package-wise installations (/opt/<package>) which proposes to
set the (bin|lib|man|info) directories directly thereunder.

Basically I see that the current assumption for $infodir and
$mandir to be based on $prefix is right as we deal with
self-installed "add-on" software. However os-vendors will
try to move the docs out of $prefix into the share-subfolder,
and incidentally $datadir has a default of $prefix/share.
But it is wrong to use just that, as the (share)-subfolder
is supposed to contain other data-directories like
(desktop|pixmap). Currently the os-vendor has to know which
doc-directories are valid *and* placed in $prefix, and adapt
the configure-call accordingly. New doc-directories would
not be affected as their maintainers would easily be tempted
to use $datadir as their base.

However the idea of a $docprefix does fit in nicely herein -
it will make it easy for a dist-maker to switch all docs over 
into a share-directory as it is suggested by fhs, and if
anyone cares to, all these docs can even be put into a dir that 
is somewhere reachable from a httpd. The default however should
be $docprefix=$prefix just like $execprefix=$prefix.

Anyone who starts to invent a doc-like subdir will then be
tempted to let it be based on $docprefix. Good idea? It
could end up populating the $prefix-directory, nothing that
is felt to be a nice move in the unix-world. Yes, as it is a
"prefix", no one would try to put their docs directly in
there, just in a subdir, and doc/$PACKAGE looks good as a
gathering place for all kinds of docs of the package. And
if some new software goes to autoindex like we have with
(info|dir), users will adapt to use its path, which will
again be based on $docprefix I guess. (footnote: just came
across a $docdatadir idea for docs that default into share)

Anyway, it is a long way to adapt autoconf|automake packages
to start using a $docprefix if we care to invent one - and
at the moment it would not hurt much as $docprefix defaults
to $prefix. One can even write an easy macro that will add
a (share) if $prefix=/usr and that will cause all the docs
to be moved (sketched below). It does actually sound easier to have just
one switch to move docs instead of the current mess with using
options for (man|info); maybe we can even drop their separate
--(man|info)dir global-options sometime in the future, also
thinking of the fact that these multi-decade help-browsers might
be dusted away at some point.
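
Such a macro body could be as small as this (just a sketch - "docprefix" is
only the name proposed here, not an existing autoconf variable):

docprefix='${prefix}'
case "$prefix" in
  /usr | /usr/) docprefix='${prefix}/share' ;;
esac
AC_SUBST(docprefix)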

So, I raise my hand to make the shift and introduce a
$docprefix, but I know it is not just my decision - most
prominently the gcs-guys (gnu coding standards) may want
to nod at it too, changing some docs for debian subsequently,
and following its changing terminology in fhs-discussions.
No need to feel wimpy about it, do we ;-)

have lots of fun,
-- guido http://guidod.4t.com
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- y++ 5++X-




cross-compiling ac_c_bigendian again (Re: AC_C_BIGENDIAN vs. Darwin)

2001-06-26 Thread Guido Draheim


Paul Eggert wrote:
> [...]
> does OpenSSH 2.9p1.  So AC_C_BIGENDIAN works only for native compiles.
> [...]
> Also, if you want to throw in cross-compiles for Solaris 8 while
> you're at it, append "|| defined _BIG_ENDIAN".
> 

The topic of cross-compiling with ac_c_bigendian has been on the list
long ago - a fix exists, and a patch to autoconf may be applied at
any time. It's an FRP (frequently recurring problem), which it needn't be.

Alexandre Duret-Lutz wrote on 23 May 2001 
> BTW, now that 2.50 is out, maybe someone can review these patches?
> http://sources.redhat.com/ml/autoconf-patches/2000-10/msg00015.html




Re: Installable config.h files

2001-07-02 Thread Guido Draheim


Do I understand it correctly that one would have to create the 
config.h.in template file manually? Hmmm.

May I put in a pointer here to the alternative of a pkg-config.h 
derived from the standard config.h? It has served as a good basis for me.
http://www.gnu.org/software/ac-archive/Miscellaneous/ac_create_prefix_config_h.html
The generated pkg-config.h will then be included by a project-specific 
(and installable) header file that adds defines with a longer body
based on the features detected at `configure` time.

Apart from this, I'd like to remind everyone that the real cause of
config.h being non-installable is the fact that there is no way to have
multiple inclusions of config.h files from different (sub-)projects,
since the feature-defs are not ifndef'ed. A lot of redefined-warnings
would come up that would not look nice - and the project would look
seriously buggy even though it needn't be. Everything would be a bit
easier if the feature-defs read:
#ifndef HAVE_SYS_TIME_H
#define HAVE_SYS_TIME_H 1
#endif
but maybe there is an intention behind the non-ifdef scheme?

cheers,
-- guidohttp://guidod.4t.com
31:GCS/E/S/P C++$ ULHS L++w- N++@   d(+-) s+a- y++ 5++X-




Re: (wishlist) Automated lib-config script generation

2001-07-22 Thread Guido Draheim


Tim Van Holder wrote:
> 
> > Nowadays, many shared libraries include a -config script e.g. libgtk
> > [...]
> > There is much duplication of code here.  Anyone wishing to use this
> > scheme must duplicate the script and then amend it for their use, and
> > also convert the m4 macro as well, to AM_PATH_.  libtool could be
> > used to automate this.  Currently I maintain the gimp-print build
> 
> It seems to me this really is something that would fall under autoconf's
> jurisdiction, so I've CC'd that list.
> 
> >[...]
> > To remove the need for m4 macro duplication, a single macro could be
> > used instead.  E.g.:
> >
> > AC_PATH_LIB(libname, minimum_version, header, config-script)
> > where libname is the name of the library to check for e.g. gtk, gimp.
> >   minimum version is the lowest version to allow
> >   header is the header to include when compiling the test program
> > (default libname/libname.h)
> >   script is the config script to run (default libname-config)
> 
> This seems potentially useful, though it would depend on how compatible
> -config scripts actually are (i.e. if gtk-config and sdl-config,
> for example, were to take different options, the usefulness of this
> macro would be limited at best).
> 
> > Duplication of the config scripts is unavoidable, as information such
> > as library location, libraries to link against, and header locations
> > are hard-coded into them.  However, an m4 macro could be used to
> > generate them.  E.g.:
> >
> > AC_LIB_CONFIG(libname, location)
> > this would generate the script libname-config in the specified
> > location in the source tree.  The following variables would be used in
> 
> That should be AC_CONFIG_LIBCONFIG, to match the existing
> AC_CONFIG_* macros.  I'm not sure this has to be a specific macro though,
> as it could be easily created using the existing config.status mechanism.
> [...]

There is a generic-config macro that I'm using quite often. However it
does not implant code into config.status - it would be best to have
that done somehow. Anyway, check the m4 source in the ac-archive under:
http://www.gnu.org/software/ac-archive/Miscellaneous/ac_create_generic_config.html

btw, it does complement a generic lib-check macro that can be found as:
http://www.gnu.org/software/ac-archive/Miscellaneous/ac_path_generic.html

cheers,
-- guido http://www.gnu.org/software/ac-archive
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




/lib/cpp (c++ on lang_c? libtool problem?)

2001-07-24 Thread Guido Draheim

[autoconf-2.52,automake-1.4f,libtool-1.4b]

checking build system type... sparc-sun-solaris2.6
checking host system type... sparc-sun-solaris2.6
checking target system type... sparc-sun-solaris2.6
[...]
checking for gcc... gcc
checking for C compiler default output... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for executable suffix... 
checking for object suffix... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking how to run the C preprocessor... gcc -E
checking for style of include used by make... GNU
checking dependency style of gcc... gcc
checking for ld used by GCC...
/net/compiler/gcc_mpt_1.00/sparc-sun-solaris2.6/bin/ld
checking if the linker
(/net/compiler/gcc_mpt_1.00/sparc-sun-solaris2.6/bin/ld) is GNU ld...
yes
checking for /net/compiler/gcc_mpt_1.00/sparc-sun-solaris2.6/bin/ld
option to reload object files... -r
checking for BSD-compatible nm... /net/compiler/gcc_mpt_1.00/bin/nm -B
checking whether ln -s works... yes
checking how to recognise dependant libraries... pass_all
checking for dlfcn.h... yes
checking how to run the C++ preprocessor... /lib/cpp
configure: error: C++ preprocessor "/lib/cpp" fails sanity check

Why does it try to find a C++ preprocessor?
Why does it not fall back to the C preprocessor?
The C preprocessor is set correctly!

configure.ac:
AC_INIT(pfe-words.c) 
AC_CANONICAL_HOST
AC_CANONICAL_SYSTEM
AC_SET_DEFAULT_PATHS_DLLSYSTEM
AC_SPEC_PACKAGE_VERSION(pfe.spec)
AM_INIT_AUTOMAKE($PACKAGE, $VERSION)
AM_CONFIG_HEADER(config.h)
AC_DEFINE_VERSIONLEVEL(PFE_CONFIGVERSION)

AC_ARG_WITH(ltdl,
[  --with-ltdl build and install libtool dlopen convenience
kit ])
AC_COND_WITH(ltdl,no)

AC_LANG_C
AC_PROG_CC
AC_LIBTOOL_DLOPEN
AC_LIBTOOL_WIN32_DLL
AM_PROG_LIBTOOL
AC_PROG_INSTALL


but same bad luck with moving LANG_C/PROG_CC up front
AC_INIT(pfe-words.c) 
AC_CANONICAL_HOST
AC_CANONICAL_SYSTEM
AC_LANG_C
AC_PROG_CC


autoconf-2.52,automake-1.4f,libtool-1.4b FAIL
autoconf-2.52,automake-1.4h,libtool-1.4b FAIL
autoconf-2.52,automake-1.4h,libtool-1.4  OK

I'd guess the bug is libtool.m4 related?
Bad interference with autoconf-2.52?

cheers
-- guido   
_______
  Guido Draheim, R&D  <[EMAIL PROTECTED]>
  Tektronix Berlin, MPT E7"Edel sei der Mensch,  
  tel: +49 -30/ 386-23153   hilfreich und gut"  --G.




Re: (wishlist) Automated lib-config script generation

2001-07-28 Thread Guido Draheim

Roger Leigh wrote:
> 
> I have attached a rewritten ac_path_generic.m4 (AM_PATH_LIB), as well
> as an updated AM_CONFIG_LIBCONFIG.  If they are OK, I'm willing to
> submit them to the archive.  Any comments appreciated.

Hi Roger, 

it took me a bit to get time to have a look - AFAICS your macro approach
is actually a bit different - what I am a bit confused about is the
generation of a *-config.in file - does no autotool get confused in
the case that this file does not exist at tooling-time? Hmmm. And maybe
in this case the name of the macro is a bit misleading; maybe
you'd add _IN to the name to make it more obvious.

The actual advantage of your macro comes of course with config.status
updates - I hope that I can one day turn my macro into a form that
adds its -config file output-generation into config.status. IIRC the
new autoconf does support some helper routines for exactly that purpose.
But it seems I don't have much time before mid-August to actually
do it.
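
For illustration, the config.status route could look roughly like this
in configure.ac (the "foo" names are only stand-ins, this is not the
actual macro under discussion):

    dnl regenerate foo-config from foo-config.in on each config.status run
    AC_CONFIG_FILES([foo-config:foo-config.in], [chmod +x foo-config])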

Anyway, I'm looking forward to seeing your macros go into the ac-archive;
note that I'm a co-maintainer if you want to update some macro later on.
(well, I have lately resurrected my private ac-archive.sf.net cvs that
 contains pre-published macros too; this way I have a bit more freedom
 to add extras and package things up beyond the original tarball)

cheers,
-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)>+++ y++ 5++X-




Re: FHS && ./configure

2001-10-28 Thread Guido Draheim

Es schrieb Raja R Harinath:
> 
> You can usually use the following command to generate something more
> FHS compliant
> 
>   ./configure --prefix=/usr \
>   --sysconfdir=/etc \
>   --infodir=/usr/share/info \
>   --mandir=/usr/share/man
> 
> However this is not always right.  FHS may require some of the
> binaries be installed in /bin and /lib rather than /usr/bin and
> /usr/lib, etc.
> 
> All in all, it is better not to use 'configure'/'make install' to
> manage your system binaries but use some kind of packaging system.
> 

correct, ... and at the same time, inconvenient for the quick builds
when carrying a tarball of my projects around. It should then be just
configure/make/make-install even for a win32 target where the files
should go under /programs, and for quite some unices a different
scheme around /opt packages is prevalent. I hacked up a macro for that, but 
as noted above, its use is not recommended - it just happens to be there ;-)

http://ac-archive.SF.net/guidod/ac_set_default_paths_system.html

by the way, being FHS compliant with a single prefix-based system is a
very hard task - manpages go to /usr/local/man under /usr/local but to
/usr/share/man under /usr. The /usr packages should put their configs into
/etc, and the /opt packages use /var/opt for localstatedir. And
therefore, there is not ONE prefix + N subdirs scheme that will
exactly match the FHS guidelines - it should be left to the local
packaging system to set the paths to where everything should go.
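
For example, a package destined for /opt would - going by the FHS rules
just mentioned - be configured roughly like this (the package name is
only a placeholder):

  ./configure --prefix=/opt/PACKAGE \
              --sysconfdir=/etc/opt/PACKAGE \
              --localstatedir=/var/opt/PACKAGE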

The only thing where I want to raise my voice again - please add a
generic docprefix that all these docprefix/{man|info|help|html}
files can be put under, and nuke mandir/infodir in the distant
future. pleease

cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: FHS && ./configure

2001-10-29 Thread Guido Draheim

Es schrieb Gioele Barabucci:
> 
> On Monday 29 October 2001 01:56, you wrote:
> 
> > correct, ... and at the same time, inconvenient for the quick builds
> > when carrying a tarball of my projects around. It should be then just
> > configure/make/make-install even for win32-target where the files
> > should go under /programs, and it for quite some unices a different
> > scheme about /opt packs is virulent. I hacked up a macro for that, but
> > as noted above, it's use it not recommend - it just happens to be there ;-)
> >
> > http://ac-archive.SF.net/guidod/ac_set_default_paths_system.html
> >
> > by the way, to be FHS compliant with a single prefix-system is a
> > very hard task - manpages go to /usr/local/man but /usr/share/man
> > when under /usr.
> ?? sure? /usr/local should be a replication of /usr IIRC, so
> /usr/local/share/man is the correct directory

that was my guess too, before I read the standard.

fhs-2.2.txt, 4.11.5.2 (page 26):
   Manual pages for commands and data under /usr/local are stored in
   /usr/local/man.  Manual pages for X11R6 are stored in /usr/X11R6/man.
   It follows that all manual page hierarchies in the system must have the
   same structure as /usr/share/man.

and 4.9.2 (/usr/local requirements) explicitly lists /usr/local/man as required to exist.

> 
> > The only thing I want to raise my voice again - please add a
> > generic docprefix where all these docprefix/{man|info|help|html}
> > files can be put under, nuke manpath/infodir in the distant
> > future. pleease
> Why nuke it? you can just add your --docprefix and leave that options where
> they are.
yes, we can let it live on for years.. until info-pages are gone
and replaced with some xml/docbook variants

> BTW html files goes under /usr/share/doc/APPNAME/html while man is in
> /usr/share/man

there is no such requirement other than one set by a linux distributor,
but I did not check whether that has gone into the latest document as standard.
likewise one could assemble the html/xml/docbook/whatever-ML variants into
a single place, which has not yet emerged in a standard way (I like to have
them linked into a single dir that I reference back from the local httpd),
but with a docprefix one can relax about the actual root node for such
doc page centers...

basically, I'd say we introduce a $docprefix (or similar name) and let it
be set to the same as $prefix by default, by adding somewhere a thing
like docprefix='${prefix}', then change the default-inits like so:
s|${prefix}/man|${docprefix}/man|
s|${prefix}/info|${docprefix}/info|

to change the current default settings over there too, and let them go
through docprefix. It still requires that quite some software (e.g. automake) 
honours this extra variable too, so this can not be switched over
in a few weeks; the change needs to be stretched over a couple of years...

cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: *** Warning: File `config.h' has modification time in the future

2001-11-07 Thread Guido Draheim

Es schrieb John Poltorak:
> 
> For some reason I keep getting this error msg when running Make after
> configure:-
> 
> *** Warning: File `config.h' has modification time in the future
> 
> config.h has a timestamp one hour in front of the current time.
> 
> Any suggestions about what could be causing this?
> 
> I suspect it is related to $TZ some how...
> 

I've seen such errors when having parts of the project mounted
via nfs - and nfs time skew is not that uncommon when the server
and client disagree about what time it is, or the time protocol
does not sync them correctly over the network. It (almost) certainly 
has nothing to do with $TZ - unix programs use 
time(2) ticks everywhere, which are TZ-independent, and that includes 
file timestamps and the checks on them... HTH, cheers, guido




re: dont! Re: stdbool macro, take 2

2001-11-08 Thread Guido Draheim

Es schrieb Bonzini:
> 
> > So AC_HEADER_STDBOOL should remove any preexisting stdbool.h before
> > doing any tests that depend on stdbool.h.
> 
> Fixed.
> 
> > I don't see why this would be needed.  `make distclean' can just
> > remove stdbool.h unconditionally.
> 
> Right.
> 

being a person who has been doing some tricky stuff with a generated
file called stdint.h, I would like to object to generating a stdbool.h
file for a specific reason - installable header files of a project. If
you want to make sure that some of your exported interface functions
can use the bool datatype then you must make sure that this type will
actually exist - in the context of that other app using the library -
and there is (of course) one best way to do that: include some system
headers that will define it. 

so, how to go from here - one way is to install a missing stdbool.h
file along with your public headers, but this will break if there is
another package (think of rpm and deb) that wants to install the
same-named file, possibly with a different version of the code contained
in it. And when you distribute it into a subdir that requires
another package to add an extra include dir (think of
sdl-config --cflags returning -I/usr/include/SDL) you will have a
hard time if another library does the same thing - which header
gets picked then depends on the order in which these include paths
are given.

A generated stdbool.h file will therefore only be valid for some
apps that do not install public headers - basically no libraries,
I guess. This is of limited use and should not go into autoconf.
If you need a bool datatype (or other defs that shall go into
this lib) one should make sure to have it dumped into another
file that cannot collide with a "standard system header"
that might possibly come in later from a dedicated devel
project, e.g. http://ac-archive.sf.net/gstdint for stdint.h,
where I took the path of not overloading the reserved name that is
meant for use by the C compiler maker (who could bring it in with an
update).

sorry for cutting in so late, I was not following closely, having
had somewhat too much work lately - basically I strongly recommend
not to go with a make-stdbool.h and put it into autoconf for an 
unexperienced user to happily use, only to come back later blaming us
for the problems it will make. To give it another take - what other
header files are generated by any autoconf macro? The other thing is
correct - you can generate a proper C file to be compiled into just
this project, but do not do that with a header file - it would be
misleading.

There are of course other paths that are more appropriate - one way
that I tried: make the default name different (possibly without 
a default) and have it #define a guard that it checks for itself 
to see if another header file has already typedef'd it (to avoid typedef 
collisions, possibly from a different C compiler or another lib), and 
have a say on need_bool and friends - with a whole lot of other problems.

And all this just circumvents the traditional path which is there for
a reason in so many library headers - define a prefix and use the
basic types and modifiers with that prefix, arriving at gint and gboolean
and whatever. After all, I wrote that prefix-the-config.h-defines macro
for a reason: to have config.h installable even if it just has a normal
"#define bool" from a standard bool-typedef check like we do for const.
Sure, all of this does not apply if the header file is only generated for
this project and never installed - but in that case, it should not
be called stdint.h either; call it stdint.inc and make an ifdef
HAVE_STDINT_H like everyone else, with an #else part including the
local inc-file. It's just C source and not a header file after all.

cheers, guido 
p.s. references are
http://ac-archive.sf.net/gstdint
http://ac-archive.sf.net/Miscellaneous/ac_create_prefix_config_h.html




Re: dont! Re: stdbool macro, take 2

2001-11-09 Thread Guido Draheim


just read through that message reference, bruce, and I do agree
of course - I think the real problem with stdint/stdbool and
friends is that it is just a nuisance to make projects
portable to platforms that are not strictly up to C99 or better,
and w.r.t. gstdint I can point to the fact that inttypes were 
introduced in 1992, later became part of unix98, and were folded into 
C99. But I am not supposed to use these inttypes because there 
are many compilers and systems out there that do not have them 
- I could adapt my local source code but I am not allowed to put 
them into public header files. It's making me sick, really :-( ... 
and much the same goes for the bool/true/false series of defines, 
since one should think that all this is actually established 
programming practice, but for the merits of portability I do have 
to avoid that and start using those mylib_bool defines all over 
the public interfaces.

Of course, instead of doing a mylib_bool for every project of 
mine, I could just as well extract that part into a separate
project (e.g. "autoposix" ;-)) and let my local one depend
on it. And in a way, I would not be surprised if many people
start using libglib just for the reason that it flattens the 
differences between the supported systems and presents, among other
stuff, an up-to-date system interface to such established
programming schemes as inttypes, dlopen and lwp-threading,
and not to forget, gboolean.

@paolo
I agree with you too, except on one thing: a programmer does not
necessarily need to copy stuff in over and over again; 
instead it is possible to make an #ifdef series like
#if defined HAVE_STDBOOL_H
#include <stdbool.h>
#elif defined HAVE_GSTDBOOL_H
#include <gstdbool.h>
#else
#error could not get either of stdbool or gstdbool, sorry
#endif
and require user code (of such a header file) to
check for these. It is established autoconf
practice to search multiple headers for things that we
need, and AC_HEADER_DIRENT might serve as an example of it.

And in a way, that was also the reason I started the
gstdint microproject: it is based on my m4 macro that checks
the various possible locations for inttypes and now has, as
a last resort, a dependency on the file that gstdint
can install all by itself. I have to admit that I am
not quite convinced that it is the final answer to all the 
problems in this area, but it is one step more to detach myself
from the problems there are with multiple lib-headers
trying to define stdxxx types in different locations all
over again.

So, in a way I would ask you to start such a microproject for
gstdbool too - other projects could then check for bool in
its normal locations (possibly being predefined in the
compiler) and have a third-party check for gstdbool.h, and
just make a note in your project docs that you declare the
bool-types as a precondition, and hint the
user that these might come from the compiler, from a system extension
or from that microproject one can hand out the url for. And
autoconf could get a standard macro ac_header_stdbool that
would work just like header_dirent with one exception - it
would also check for the header from the microproject
that has settled around the autoconf people; in a way
that is the reason I placed gstdint in a subdir of the
ac-archive sfnet-fork, as I see it as a thing very close to
autoconf but not just in it.
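
For illustration, the configure side of that could be as simple as the
following (gstdbool.h being the hypothetical microproject header, not
something that exists today):

    dnl check the normal location first, then the third-party fallback
    AC_CHECK_HEADERS([stdbool.h gstdbool.h])

and the #if chain shown earlier in this mail then picks whichever
header was found.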

Anyway, even though I do not feel your stdbool macro should
live in the autoconf core itself, I feel it could be a great
contribution to be kept in the ac-archive, so that
people shall not need to reinvent it. Amongst the various
effects I could note that one would be able to make a
standard for an is-already-defined macro that other headers
could check so they do not redefine the same symbols. Such
a thing has been done for inttypes all along (since
it overlaps with the bsdtypes stemming from network code),
and it could be used for generated headers too, so that the
generated header contains
#ifndef _GENERATED_STDBOOL_H
#define _GENERATED_STDBOOL_H

#endif
which will avoid redefinitions even when the macro and the
header-generation are used in multiple places around the
software world. Actually, it's another lesson that I had
to learn around stdint and friends. And to emphasize it
again, I am not sure if it is the last lesson I have to
learn the hard way...

sorry for such a long talk on it,
'hope the things do not look that foggy
as they do look to me sometimes, cheers, guido

Es schrieb Bruce Korb:
> 
> Guido Draheim wrote:
> > being a person who has been doing some tricky stuff with a generated
> > file called stdint.h, I would like to oject on generating a stdbool.h
> <>
> I agree, with arguments about project focus and project bulk
> to boot:  http://sources.redhat.com/ml/autoconf/2001-11/msg00033.html
> 
> > p.s. references are
> >  http://ac-archive.sf.net

ac-archive@sfnet news..

2001-11-12 Thread Guido Draheim


* autogen macros added
  Bruce Korb's latest achievement in his autogen project resulted in some
  fine macro pieces which I have added to an extra subcategory in the 
  ac-archive sfnet-branch until it gets decided where to put them in the
  gnu-branch. I did also add a crosslink to http://autogen.sf.net to draw
  more attention to his works. Let's see how that works out...
* some macros updated
  I had some rework of my own macros to update them to work properly in
  the 2.5x generation - this is usually a matter of killing the m4-changequote
  in there, which the multilevel macroisms of the new generation do not
  particularly like. I do also see that the bigendian_cross functionality
  is integrated in the cvs-tree, which is good to hear but brings up the
  question what should be done with ac-macros superseded by autoconf
  itself. For now, I let them rust in peace ;-)
* acinclude tool
  the growing popularity of the ac-archive has made some flaws shine
  out badly, stemming from the aclocal-install option present only in the
  sfnet-branch. I tried to document the whys in an extra doc section on
  the frontpage at http://ac-archive.sf.net - which led me to invent
  a new tool called "acinclude" to help me solve the problems at least for
  the moment. Well, this is not new code actually but a variant of the
  "aclocal" tool shipped for quite a time with the gnu autotools. Basically,
  it will construct an "acinclude.m4" file from the macros living in the
  ac-archive and other site-wide extension directories from users or projects.
  This tool might be a bit controversial, so don't hesitate to hit me hard
  for any problems around this one ;-)

I have lately not had much time to work on the sfnet-branch, so it still carries
the code portions that were added to give the feel of the gnu-branch to
some extent. However, neither Peter nor I had time in the last months to 
sort out the remaining problems, and I noticed that more and more people
have come across the sfnet-branch, which was never actually announced as a proper
project - I just needed a working place to put up my package files which
were already under heavy usage, so I could not wait longer than a week or two
after the aclocal extras were backed out of the gnu-branch cvs. The use of
packaging files and version numbers however seems to be regarded as useful,
so I will adapt the current packaging system later on to let it stand on its
own feet - until then, don't get nervous about the current pile of six rpms
of wondrous naming and functionality, as I'll clean that up next time around.

regards, guido




Re: c99

2001-10-13 Thread Guido Draheim

[EMAIL PROTECTED] wrote:
> 
> 1. How do i request a C99 compiler?  Is there some variation on AC_PROG_CC?

Never ask for a version declaration; always ask for a feature
you need - that is the basic principle of autoconf. Many features
declared as C99 were present in 1994-era compilers
too, e.g. designated initializers like { .field = value } existed a long
time before in many, many compilers - C99 just made that official. So the
question goes back to you - what *feature* do you want?

Such a feature test can then be made independent of AC_PROG_CC; well,
it would "exit" when the current `cc` or `gcc` does not have the
feature you want - or do you want a macro that walks through several
installed compilers and chooses the c99 one? Well, that might be
another page in the book; gnu hackers are usually okay with finding
the gcc, which is usually the most up to date... which is why AC_PROG_CC
will usually prefer a gcc over the vendor's cc when both are there...
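
For illustration only, such a feature probe might look roughly like
this (this is not an existing autoconf macro, just a sketch):

    AC_MSG_CHECKING([for designated initializer support])
    AC_TRY_COMPILE([struct point { int x, y; };],
      [struct point p = { .x = 1, .y = 2 }; return p.x;],
      [AC_MSG_RESULT(yes)],
      [AC_MSG_RESULT(no)])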

> 
> 2. i see a link to "The Official Macro Archive" but it doesn't work.
> Is there a mirror somewhere?
> 

you do not mean a link to cryp.to/macro-archive but the link to the
GNU Macro Archive at http://www.gnu.org/software/ac-archive - right?
A downtime of gnu.org should not last for a long time, but you can
get an extended tarball from my branch at http://ac-archive.sf.net -
though I did not bother lately to upload a gnu.org lookalike generated
from my cvs copy, so it is not exactly the same as the gnu.org tarball
named autoconf-archive.tar.gz. I'll schedule putting up a gnu.org
lookalike for next week, both as a mirror (for your convenience) and
as a technical demonstration that the sf.net ac-archive build system
includes all features of the gnu ac-archive makefile (which has been
the case since the branch came to exist).

cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: c99

2001-10-13 Thread Guido Draheim

[EMAIL PROTECTED] wrote:
> 
> On Sat, Oct 13, 2001 at 11:18:16AM +0200, Guido Draheim wrote:
> > [EMAIL PROTECTED] wrote:
> > > 1. How do i request a C99 compiler?  Is there some variation on AC_PROG_CC?
> >
> > Do never ask for a version declaration, always ask for a feature
> > you need - that is the basic principle of autoconf. Many features
> > being declared as C99 were present in the 1994-version of a compiler
> > too, e.g. { .field: value } was present a long time before in many
> > many compilers - the C99 just made that official. So the question
> > goes back to you - what *feature* do you want?
> 
> i'm using:
> 
>   for (gint xx=0; xx < 3; xx++) { .. }
> 
> and i generally mix variable declarations and statements as i
> please instead of putting all the declarations near the open brace.
> 
> In terms of an autoconf probe, what do you suggest?

ouch, that's an autoconf test that has not been written so far - even
more so for this special case, since there were two interpretations (with xx
being treated as if declared outside of the for(), or the symbol only valid 
within the for()) - of course it could easily be written (and would
then go into the macro-archive) - somebody around here who has an example
ready?

> 
> > Such a feature test can then be made independent of AC_PROG_CC, well,
> > it would "exit" when the current `cc` or `gcc` does not have the
> > feature you want - or do you want a macro that walks through several
> > installed compilers and chooses the c99-one?
> 
> That seems like overkill.

to me too.

> 
> > Well, that might be
> > another page in the book, gnu hackers are usually okay with finding
> > the gcc being usually the most uptodate... which is why AC_PROG_CC
> > will usually prefer a gcc over vendors'cc when both are there...
> 
> Yah, fine.
> 
> > > 2. i see a link to "The Official Macro Archive" but it doesn't work.
> > > Is there a mirror somewhere?
> >
> > you do not mean a link to cryp.to/macro-archive but the link to the
> > GNU Macro Archive at http://www.gnu.org/software/ac-archive - right?
> 
> Oh, neat.  No, actually the link is:
> 
>   http://research.cys.de/autoconf-archive/
> 
> i found this link on <http://sources.redhat.com/autoconf/>, so maybe
> it needs updating.
> 

this is in fact the old link, and there have been no updates to those pages
for quite some time - the ac-archive is now a GNU project and
development is done at savannah.gnu.org -  So, yes, the redhat.com 
pages should be updated (and not only in this respect ;-))

cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: c99

2001-10-13 Thread Guido Draheim

[EMAIL PROTECTED] wrote:
> 
> On Sat, Oct 13, 2001 at 11:24:57AM -0700, Paul Eggert wrote:
> > > From: [EMAIL PROTECTED]
> > > Date: Sat, 13 Oct 2001 10:22:31 -0700
> > >
> > > AC_MSG_CHECKING([whether $CC accepts C99 declarations])
> > > AC_TRY_COMPILE([],[
> > >   int x=0; x+=1; int y=0;
> > >   for (int z=0; z < 2; z++);
> > > ],[
> > >   AC_MSG_RESULT(yes)
> > > ],
> > > [
> > >   AC_MSG_ERROR([
> > > *** This package requires a C99 compiler.])
> > > ])
> >
> > OK, but why bother with that?  Just run 'make'.  If it fails, your
> > compiler doesn't support C99 declarations.  I see little need to
> > discover that at 'configure'-time.
> 
> At least i can guess the "-std=gnu99" option if CC=gnu .. ?

make a macro (after AC_PROG_CC) that includes a check like
TESTC99DECLARATIONS
if notOK && test "$GCC" = yes ; then
  save_CFLAGS="$CFLAGS"
  CFLAGS="$CFLAGS -std=gnu99"
  TESTC99DECLARATIONS
  if stillnotOK ; then
    CFLAGS="$save_CFLAGS" ; echo BAD   # (or bail out)
  else
    echo "OK (with -std=gnu99)"
  fi
elif notOK ; then
  echo BAD   # (or bail out)
fi

get the idea? :-)
  

> 
> > Now, if your goal was to find a C compiler that supported C99
> > declarations, that would be another story.
> 
> Yah, that's my goal.
> 
> > Or if your goal was to define a macro that is nonzero if C99
> > declarations are supported, Autoconf could do that too.  But I don't
> > think a macro like that would be all that useful in practice: it'd
> > just make the code uglier.
> 
> No, that's silly.  i'm not going to litter my code with #ifdefs
> for old compilers.  More realistically, i just want configure to
> suggest upgrading gcc if the installed gcc doesn't support C99.
> Something like that.
> 
however you will notice that many, many compilers out there do not
support your style - at least not now. Your code is not (yet) quite
portable, but making it portable at least across gcc still covers
a good share of the computing space. I support what Paul's remark
points at, but then again, you can use autoconf to detect differences
on platforms that have a gcc... and there are surely quite some.. :-O

cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: c99

2001-10-13 Thread Guido Draheim

[EMAIL PROTECTED] wrote:
> 
> On Sat, Oct 13, 2001 at 09:07:54PM +0200, Guido Draheim wrote:
> > > No, that's silly.  i'm not going to litter my code with #ifdefs
> > > for old compilers.  More realistically, i just want configure to
> > > suggest upgrading gcc if the installed gcc doesn't support C99.
> > > Something like that.
> >
> > however you will notice that many many compilers out there do not
> > support your style - at least not now.
> 
> Of course but i can't go back.  The ability to declare variables
> anywhere is fantastic.  i'm not even tempted to use C++ anymore.  :-)
> 

if only I had enough time to work on substruct-c and extend it into a true 
patch on top of gcc - keeping C still C while having only
the good features of C++ - but there is no time... 
http://guidod.4t.com/substruct-c
cheers,
-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: c99

2001-10-14 Thread Guido Draheim

[EMAIL PROTECTED] wrote:
> {..}
> Is this stylistically acceptable?

*fg* oh well, speaking about style is (of course) a matter of taste,
and then a matter of whose taste it shall please. For a macro that
you want to reuse in your own projects - yes, it looks good. 

Speaking as a maintainer of the (gnu) ac-archive, I would add just 
some optional hints:
* calling the macro TEST_C99 is not quite correct, as you only test
  one specific feature - so it would be good to name the macro after
  just that, e.g. TEST_C99_FOR_BLOCK or whatever you like to call it.
  Just see how many STL&C++-template checks we currently have in the
  ac-archive... compilers might implement C99 features only partially.
* you call a compile-test, even twice, and it looks as if this is a
  case that can be turned into a macro using an AC_CACHE_CHECK value
  (see the sketch after this list). That's especially good during
  creation of configure scripts, as the test can be run with a good
  cache-file that holds some answers premade, and there is a real
  benefit for overly large projects that might be able to share an
  answer-file - and speed up their configure time dramatically. 
* tune your (internal and) exported variables; possibly just make them
  longer so they do not accidentally clash with those of other macros. At
  least ensure to give a hint where a variable might have come from that
  another macro might start to use - $C99 is probably not the best one.
* Personally, I'd prefer the cache_val to contain the actual answer
  of the test - well, I'd see the needed cflag -std=gnu99 as the answer.
* as a high-grade extension, consider adding an ACTION-IF, ACTION-IF-NOT, 
  ACTION-WITH-CFLAG triple along with an ifelse() that puts in default
  actions like adding the needed cflag to CFLAGS. But that might be
  overdone for what you need - it would just serve you in learning 
  decent tricks one can use to make an autoconf macro even more reusable ;-)
* anyway, I'd really like to have this macro in the ac-archive for
  others to reuse (and let it be tested for you ;-)) ... just read the
  small hints about the small extras that are needed to have it registered:
  http://ac-archive.sourceforge.net/#formatting
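
Here is the cache-check sketch promised above - just an illustration,
with made-up macro and cache variable names:

  AC_DEFUN([GUESS_C99_FOR_DECL],
  [AC_CACHE_CHECK([whether $CC accepts C99 declarations],
    [guess_cv_c99_for_decl],
    [AC_TRY_COMPILE([],
      [int x=0; x+=1; int y=0;
       for (int z=0; z < 2; z++) ;],
      [guess_cv_c99_for_decl=yes],
      [guess_cv_c99_for_decl=no])])])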

looking forward, and good luck, Guido

> 
> define(TEST_C99_DECL, AC_TRY_COMPILE([],[
>   int x=0; x+=1; int y=0;
>   for (int z=0; z < 2; z++);
> ],[C99=yes],[C99=no]))
> 
> AC_MSG_CHECKING([whether $CC accepts C99 declarations])
> TEST_C99_DECL
> if test $C99 = no -a $GCC = yes; then
>   save_CFLAGS="$CFLAGS"
>   CFLAGS="$CFLAGS -std=gnu99"
>   TEST_C99_DECL
>   if test $C99 = no; then
> CFLAGS="$save_CFLAGS"
>   else
> C99_COMMENT=', with -std=gnu99'
>   fi
> fi
> 
> if test $C99 = yes; then
>   echo "yes$C99_COMMENT"
> else
>   AC_MSG_RESULT(no)
>   AC_MSG_ERROR([
> *** This package requires a C compiler with C99 support.  Please
> *** consider trying a recent release of GCC.])
> fi
> 
> --
> Victory to the Divine Mother!!
>   http://sahajayoga.org

-- guidoEdel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: GNU autoconf, automake, make, m4 32/64-bit Support

2001-10-10 Thread Guido Draheim


"Torok-1, Maria" wrote:
> 
> Anything you could tell me regarding 32-bit and 64-bit versions of automake,
> make, and m4 would also be appreciated!
> [..]
> > E-mail:  [EMAIL PROTECTED]

"Torok-1, Maria" wrote:
> We're currently using GNU libtool v1.4 on Solaris 2.6,

... before you flood the mailing-list: autoconf/automake etc. are
scripted on top of other tools, and it is those tools that have to be
asked about 64-bit support - the scripts we discuss on these mailing
lists are not a problem.

you could use the solaris make/cc just as you can use gmake/gcc - since
I was able to build on ia64 with gnu tools, I don't see a problem here.
Unix systems will also ship with an m4, but the autoconf manual says it 
requires GNU m4 - I do not know whether the m4 on sun 2.8 has evolved
enough to be compatible. For automake you will want to look for a perl
package. If that is all in place, feel free to do the rest.

Does anyone intend to support 64-bit? Absolutely! and for all time...
well, you know, it's the future, at least of unixish systems...

cheers,
-- guido 
joke:
windows) where do you want to go today
linux) where do you want to go tomorrow
unix) are you coming guys?




Re: crossmingw32

2001-12-25 Thread Guido Draheim

Es schrieb Rodrigo Augusto Barbato Ferreira:
> 
> Hi,
> 
> I am not succeding in generating .dll libraries using
> autoconf-2.13/automake-1.4/libtool-1.3.5 in a crossmingw32
> environment over my gnu/linux box.
> 
> I believe it is a bit diferent than building in a
> mingw32 or cygwin environment over a windows box
> ([un]fortunately I do not own windows to do that).
> 
> I hope someone here could point me some simple directions
> in how to do that, so that I can see what I am missing.
> 


Guido Draheim wrote on Aug 2, 2001:
> 
> Mumit Khan wrote:
> > Excellent! Thanks for putting this together. I'll put it on Mingw web site
> > with your permission (when I return from vacation in about 1.5 weeks).
> > 
> 
> you're welcome. I did intend to make up a small series of documents 
> around the portability of sharedlib creation, to be put
> into a dedicated area of its own (on my homepage?) - it just happens 
> that I have not much time, so it boiled down to the most pressing
> of Q/A docs, the one about creation of dlls with libtool and crossgcc.
> Anyway I'll attach an html'ized version of the faq that I made 
> up just yesterday. Feel free to correct me (at least I should put
> in a pointer to a section about mingw.org life AFAICS).

... I have that "dllmaker guide" up in my head but no time to write
it down - AFAICS it would be about 60 pages and no one will pay me
for that

anyway, hope this old "crossgcc mingw32 dll FAQ" can be of some help.

cheers guido
crossgcc mingw32 dll FAQ


   Do you have another Q/A that could go into an FAQ? Do you know 
   another webpage already with some (same|other) explanation? Then
   write today, either to the mailinglist (libtool(@)gnu.org) or the 
   original author.

   (this document was written by Guido Draheim 
using the copyright legalese of the GNU FDL, where reproduction
restrictions do not apply when put inside other documentation
describing the gcc+win32 target for programs (the main topic of this))

Q: can I use libtool 1.3.x to build dlls?
Q: where can I get a nice mingw32 crosscompiler
Q: what about debian mingw32 crosscompiler
Q: I'm getting a libtool message saying "dunno-yo-lib"
Q: "dunno-yo-lib" but the lib is there, in a sys_lib of the gcc
Q: dll in another subdirectory, found but "cannot execute binary"
Q: dll in another subdirectory, found but many unresolved symbols
Q: everything is resolved but a data-symbol from my dll

   ===
 
Q: can I use libtool 1.3.x to build dlls?
A: Theoretically yes, practically no. Using libtool-1.4 is a far
   better choice, or use the patched version you can find in the
   SDL tarball living at http://libsdl.org. In any case, for 
   either 1.3.5 or 1.4 (but not the sdl-patched 1.3.x): 

   DO NEVER FORGET TO SET "-no-undefined"

   as a linker flag; libtool won't add it automatically just because it
   is needed in enable-shared mode. (Libtool can create just about any
   win32 static-lib even with undefined symbols allowed, but it would not
   do so with -no-undefined plus the failures in creating the dll just
   before.) Since you are already doing a [case "$host_os" in mingw*) _FLAGS=xx]
   in your configure.ac, you can go and have a look at some
   other useful options for win32/dll creation (a small configure.ac
   sketch follows after this list of options).
   -no-inhibit-exec ... but rarely with an effect, instead it may
  obscure some bugs during compiling. But it is sometimes
  helpful while deep in the development stage of your project.
   -mconsole or -mwindows ... links with different startup code,
  the latter assumes your code opens a GUI window on its own
  and likes to be run detached from console. Real windows
  developers even know about threaded libc, differences
  about dos-box/nt-box, stdout/stderr handling for GUI-apps,
  and quickwin options to have console unfold its own 
  window even when started on console - be lucky to have
  just the choice between two -m/machine options.
   --export-all-symbols ... a dlltool option, ignoring hints from
  gcc given via in-source __attribute__((dllexport)) marks.
  There are other options, especially about export lists
  from your handcrafted def-file, or regex definitions
  on symbols. But usually you just use this one.
   -export-dynamic ... the libtool option you would normally use
  to export functions - just as there are other definitions
  including regex variants.
   -avoid-version ... a libtool option - since 1.4 the generated dll
  file would be named as if for a unixish ld.so system - where it
  would be libmylib.2.5.8.so o
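
   A small configure.ac sketch of the host_os case mentioned above -
   the MY_LDFLAGS variable name is only an example, use whatever your
   project passes on to libtool:

      case "$host_os" in
        mingw*) MY_LDFLAGS="$MY_LDFLAGS -no-undefined" ;;
      esac
      AC_SUBST([MY_LDFLAGS])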

Re: FHS

2002-01-05 Thread Guido Draheim

Es schrieb "Richard B. Kreckel":
> 
> Now four years into FHS-2.0, can we consider this a bug?  Or is FHS buggy?
> 

Didn't we have a lengthy talk about it lately? Well, actually, I wrote a
little macro that saves me the burden of placing all the if/then/else
computations into a configure script again and again - I tried to 
squeeze a reasonably systematic approach out of the FHS, and so far it
has served me quite well. Just place this ac-macro into your project and
see for yourself. Have fun, Guido

http://ac-archive.SF.net/guidod/ac_set_default_paths_system.html

(note: the first idea came from _dllsystem support but then I noticed 
 all the little differences about /usr/local /usr and /opt installations).




Re: FHS

2002-01-05 Thread Guido Draheim

Es schrieb "Thomas Bushnell, BSG":
> 
> Guido Draheim <[EMAIL PROTECTED]> writes:
> 
> > Didn't we had a lengthy talk about it lately?
> 
> So since I'm the main directories for the GNU Coding Standards, I've
> sent a note to RMS requesting that he alter the coding standards to
> change the placement of infodir and mandir to put them under datadir.
> 
> I'm sure he won't object, but until the change is made, autoconf
> should remain as is.  Are there other FHS-related problems to address
> at the same time, besides these two?
> 
> Once the change is made, autoconf should make sure that all GNU
> maintainers are apprised of the change so that they can make sure
> their next release adapts accordingly.
> 

at the moment, I am just out the door (sat night), but at least note
one thing: a /usr prefix means configs go to /etc, and not /usr/etc
... and there are other such things, e.g. /opt and its shared data
like man-pages... all documented in the FHS...

cheers, guidod




Re: FHS

2002-01-06 Thread Guido Draheim

Es schrieb "Thomas Bushnell, BSG":
> 
> Harlan Stenn <[EMAIL PROTECTED]> writes:
> 
> > There is more to software than GNU.
> 
> Sure, but the GNU Coding Standards are for GNU, by definition.  There
> are many things that are disallowed by them in the interests of making
> GNU better.  For example, info files must all be machine independent.
> There is no good reason that GNU man pages should not also be so.
> 
> > If not, think a bit about how your philosophy will unfold, and exactly what
> > your goals are.
> 
> I think it is reasonable for the GNU Coding Standards to comply with
> the FHS.  But the Coding Standards do not need to permit everything
> that is permitted by the FHS; they are intentionally narrower.

Maybe it's because I'm not a native speaker, but I would not call it
"permit", right? The FHS does to some degree follow established 
practice and tries to fold it into a system that we could attribute
as being nice to install and maintain. This does not necessarily follow
the abstract idea of a "prefix" where all other install paths are just
subdirectories/subpaths - (( note the fine macros from adl being built 
around this: http://ac-archive.SF.net/adl/stdrelpaths.html )) - and
instead it moves some of the install dirs out of the prefixed tree,
at least for the /etc and /var data directories. Have a look at my ac
macro http://ac-archive.SF.net/guidod/ac_set_default_paths_system.html
which should be pretty much readable - e.g. an /opt prefix makes the
configuration files go to /etc/opt and not /opt/etc, and the same for many
of the localstatedir (i.e. /var) paths, which are moved to hang
under a *parallel* prefix in /var - in a way one could condense that as:
for any prefix /my/prefix the var-files shall go to /var/my/prefix.

Personally, I would see at least two different modes - home-install and
system-install. For a home-install (or local-install) you usually run
configure --prefix=$HOME so that the files get installed in (direct!)
subdirectories of $HOME - $HOME/bin, $HOME/share and so on, and in
fact, many of the FHS intentions (like putting non-modifiables onto 
a different filesystem) do not apply here. In my opinion, this is the
original gnu intent for tarball-shipped software, since the users do
not have the option to install to a system path; and likewise I know
of many sysadmins who offered a prefix writable by a group of people,
and by some interesting coincidence the group-maintained prefix was
often called /usr/local.

The other mode is of course a system-install, and in that mode you 
want to ensure that the /usr filesystem can be mounted readonly, that
the basic configuration files go under /etc and can be saved away
in one go, and that localstate and pooldir paths go to the /var filesystem.
So in essence, in system mode, a prefix like /my/prefix will make
the config-files go to /etc/my/prefix instead of /my/prefix/etc,
which would be correct for a local-install. 
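
Put into configure terms, the system-install mapping for such a prefix
would roughly amount to (the paths here are only an example):

    ./configure --prefix=/my/prefix \
                --sysconfdir=/etc/my/prefix \
                --localstatedir=/var/my/prefix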

The macro above does partly reflect these modes by looking at the
prefix and choosing different default paths for /opt or /usr prefixes,
but it is certainly in some respects incorrect. But let me say it
loud: DO NOT use FHS rules that apply to a /usr prefix (i.e. a
system directory) to talk about rules for a local prefix (i.e. the
gnu default hierarchy). These are two different things, and the fact that
/usr binaries use /usr/bin and local binaries use local/bin should
not be turned into a general concept for the rest of the install directories.

EPILOGUE:
note that most system-installs-from-tarball will also look at the prefix
and configure the gnu software with the different paths like --mandir
and --sysconfdir and such - if you have a linux rpm system then
look into /usr/lib/rpm/macros at the %configure macro - it usually
carries a lot of ac-configure options for you.

note also that I do not see a specific need to change the mandir
path to datadir; rather I would put it on the schedule to be removed completely
and made dependent on a docprefix that the info and man directories
can hang from. But I do not see a problem with a non-/share path for
these directories in local-install mode. They can be left there.

and note furthermore one big item that I see as being
wrong in the current ac/gnu default paths - sharedstatedir should not
be put into $prefix/com/ but into $sysconfdir/default/, that is, it should
end up in $prefix/etc/default/.

cheers, guido




Re: FHS

2002-01-06 Thread Guido Draheim

Es schrieb Russ Allbery:
> 
> The vast majority of software installations by system administrators from
> source do not go into /usr, /etc, or anything else owned by the system.
> They go into /usr/local, /opt, or some other local path.  /usr and the
> like are reserved for the system and its own packaging system.

note that /opt is mentioned in the FHS - most software layouts use it to
keep add-on packages separate - i.e. /opt/<package>/ (and IIRC it is
mentioned *that way* in the FHS) but still they like the configfiles to
go to /etc... well... |^O

> [..]
> Personally, I'm somewhat annoyed to see yet more paths going into Autoconf
> that I'll then have to override to get the installation paths that I want
> (namely a flat tree of bin, sbin, lib, man, info, and etc); 

Personally, I have to put a lot of packages into my $HOME and I do not
quite like all the extra directories - I like to move *all* data-like
files to a /share/ path, esp. man and info files. Moving pages
down feels good to me - and even more, in my mind I have the abstraction
that most of the extra datafiles extend *another package*, that is, my
own files go to $prefix/share/<package> and man-pages go to the share-
directory of the man-package and info-files go to the share-directory
of the info-package and aclocal-files go to the share-directory of the
aclocal tool. Moving the extra datafiles out of the share-tree does
violate this abstraction... just my 2 cents...

> 
> Changes towards stricter FHS compliance do make things harder for those of
> us who developed our file system layouts long before people started
> fiddling with things like share and libexec and don't want to go through
> the pain of trying to change them.
> 
and let me reiterate the fact that the FHS has not created a universal
prefix/subpaths scheme of the kind the current default-prefix system in autoconf
uses. And it might be better to keep a simple install layout
in basic autoconf - real software admins will override the paths anyway to reflect
the needs of the local software policy.

/snip/
Another task is a discussion of the current wealth of dirpaths that
we have in current autoconf - I do not quite remember how often I have
raised my voice in favour of a --doc-prefix so that at some point in the
future we can kill both mandir and infodir - it might be that future 
system installations will no longer install info-pages (or even man-pages),
e.g. a desktop-centric system like darwin/mac. The time may come when
info-files become obsolete and a thing of the past - IIRC, there
are projects underway to make docbook the help format of choice - but
there is no --docbookdir yet, and I fear the day we add yet another such
option for program help files, where we could slay them all with one prefix
to which a /man/ or /info/ or /help/ gets appended - software admins will only
need to override that prefix to flatten the doc file tree.

cheers,
-- guidohttp://freespace.sf.net/guidod
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




re: OT Re: Removing function calls (potentially OT)

2002-01-07 Thread Guido Draheim


I sent an e-mail directly, and posted an answer publicly to
news:comp.lang.c.moderated - it is really off-topic. cheers, guido

Es schrieb Kenneth Pronovici:
> 
> I guess this is probably somewhat off-topic, but I'm hoping someone here
> can point me in the right direction.  I apologize in advance if this is
> too off-topic - feel free to take the conversation off the list if need
> be.
> 
> I have a function:
> 
>LogTrace(char *format, ...);
> 
> that is used for tracing.  For potential performance reasons in
> production my boss wants the option to remove these calls entirely.  I'd
> like to provide an autoconf option --without-tracing that would build a
> version of my program with these calls removed, but I'm running into a
> brick wall (possibly of my own making).
> 
> My initial guess was something like this:
> 
>#ifdef WITHOUT_TRACING
>   #define LogTrace
>#endif
> 
> However, that definition will turn this:
> 
>LogTrace("%s", variable);
> 
> into this:
> 
>("%s", variable);
> 
> which doesn't seem like a good idea although it does seem to compile at
> least some of the time.
> 
> Anyone have any suggestions on a better way to do this?  Just an example
> of a package which does something similar would be a great starting
> point.  I'm doing all of this work in ANSI C.
> 
> Thanks, and again, I'm sorry if this is too off-topic.
> 
> KEN
> 
> --
> Kenneth J. Pronovici <[EMAIL PROTECTED]>
> Personal Homepage: http://www.skyjammer.com/~pronovic/
> "They that can give up essential liberty to obtain a little
>  temporary safety deserve neither liberty nor safety."
>   - Benjamin Franklin, Historical Review of Pennsylvania, 1759
> 

-- guidohttp://freespace.sf.net/guidod
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: macro archive re-launch

2002-01-20 Thread Guido Draheim

overkill.




Re: Autoconf support for VxWorks.

2002-01-21 Thread Guido Draheim

Es schrieb "Fredriksson, Johan":
> 
> Do I need to change the rules in autoconf.m4f and acspecific.m4 to get
> autoconf to support VxWorks. I have a package that I've built on Cygwin and
> Sun-Solaris, now I need to build it on VxWorks.
> 
> If I need to do this where do I get Information about it? The autoconf
> manual don't bring me much information.
> 

May I ask if you compile "on" vxworks, or cross-compile "for" vxworks?
From my experiences with compiling autoconf'ed programs for vxworks,
it was mostly the missing cross-compile support that led to problems
(and, btw, the multitude of vxworks-running target processors spawned 
 the idea about the crosscompile-safe ac_c_bigendian). And simulating
some sane ac_check_lib behaviour was a real PITA, so if you can
come up with some good ideas to do that, please, go ahead; but
so far I would have chosen to put such vxworks support into
libtool rather than autoconf, so that it is in there that the linking
fails or not.

cheers,
-- guidohttp://freespace.sf.net/guidod
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: macro archive re-launch

2002-01-21 Thread Guido Draheim

Es schrieb Dan Kegel:
> 
> Guido Draheim wrote:
> >
> > overkill.
> 
> Agreed.  I'm tempted to say:
>   Death to XML, long live *plain* text!
> 
it's not quite xml that's bugging me - it's that the
intelligence needed to form a good html view is put
on the creator of the macro. Simon(s) says that the
current extraction of information out of the dnl-header
is not good enough to produce a good html view - 
this is true. But my answer would be to add more
intelligence to the extractor, with more options that
writers of dnl-headers can use to optimize the output,
possibly with optional xml-tags thrown in some of the
time - as part of the dnl-commentblock, not around it.

Since the gnu ac-archive drifts into xml buzzword space, I
will probably take the opportunity to throw away Simons' 
macro2html.cpp that is currently used, and build a perl
script for the sf-net ac-archive that will have at least 
the extraction and report-generation intelligence of the 
c++ binary - I have to admit that I find it easier to 
add extra intelligence to a script made in a Practical 
Extraction and Report Language rather than in C++, but 
that has to wait a bit since I am busy with other stuff.

Anyway, let's see what Simons' efforts will finally
bring about - I must say I am quite interested to see
what the experiences with such a setup will be, since
I am doing quite some xml stuff as part of my daily life,
but there I value it more as the intermediate computer-
written, almost-human-readable data representation than 
as a user-written input and user-oriented output format.

Anyway, perhaps such a perl-based script that converts
a macro's dnl-commentblock to xml/html doc-pages could
also be used for plain autoconf, but that's a second step
I guess, as there is much more complexity in that.

cheers, guido




Re: Chances of build success on non-Unix platforms (read: dos/win32)

2002-02-07 Thread Guido Draheim

Es schrieb John Poltorak:
> 
> Is there any place I can search to find if an app has been succesfully
> built with autoconf+friends on a non-Unix platform?
> 
> I'm thinking of trying to build Tar 1.13 and see refences to DOS but have
> no way of telling whether they are purely historical, (going back to the
> last millenium  :-) ) or whether there is any likelihood of success
> nowadays.
> 

note the difference between the build platform and the platform the
resulting binary shall run on. To *build* an autoconf'ed project on a
non-unix platform, you need to install a build environment that has
the most common unixish fileutils available, esp. "sh" and "sed".
The most common today are cygwin and mingw, and I don't remember
whether dj delorie is still maintaining his environment, which also
had a dos variant, but I think dos is too outdated now.

The other part is the resulting binary, and the question whether a
specific unixish project (like "tar") will compile for it. Note that
cygwin brings a unixish personality to windows, but at the same time
it limits interoperability with other win32 programs (it's not
impossible though). The project source needs to be ready-made for
the peculiarities of the win32 platform APIs, and in many cases,
like gnu tar, you will find somewhere a patch that adds the missing
win32 ifdefs.

To get it straight - you don't need to run windows to compile a
program for the win32 platform - use a crossgcc. For the
linux-to-mingw32 part you'll even find prepacked rpms, or you can
build one yourself (e.g. on solaris); it isn't *that* complicated.
So, if you just want a win32 tar, you don't need to leave a typical
unix environment, and you are saved from installing one for win32
in order to create a win32 binary.
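For illustration, a minimal cross-build along these lines could look roughly
like this (a sketch only - the "i386-mingw32" host triplet, the build triplet
and the staging path are assumptions that depend on how the crossgcc was
installed on your box):

# configure for a win32 runtime host on a linux build host, then build
# and install into a staging directory for transfer
./configure --build=i586-pc-linux-gnu --host=i386-mingw32 --prefix=/usr
make
make install DESTDIR=/tmp/win32-staging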

hope this sheds some light on it,
read the rest at the mingw and cygwin pages,
and if there are problems with the resulting
binary, come back and ask ON THE LIBTOOL mailinglist,
since you SHOULD REALLY use libtool when trying to
mess with the win32 platform.

cheers,
-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: how to prefix definitions in config.h

2002-02-19 Thread Guido Draheim


hello clinton,

may I ask what you are *really* trying to achieve?
Personally, I would like to deprecate prefixing a config.h file
in place, even more so if you really install it along with
your library - I've seen too many include-order conflicts in
that case, so please don't do that.

Secondly, if I read your post correctly, you try to modify a
file, but not twice, and yet you want to be able to rerun it.
Am I confused? Well, might be. Anyway, I don't expect things
to be easy when the input and the output file are at the same
location.

Thirdly, you didn't mention that the ac-macro has a few
optional arguments, one being the input-file, the other being
the output-file. What's wrong with using those, and if they
are not sufficient, why didn't you contact me?

Fourthly, so you try to add it to ac_config_commands -
hey, I like that. I did not quite bother so far to add
the script to config.headers so that the prefixing
can be run automatically as a dependency of a reconf.
But first, let's sort out what you're really
trying to achieve.

cheers,
-- guido

Clinton Roy wrote:
> 
> Hello all, I'm trying to mangle a config.h for a shared library, we'd
> like all the definitions in it to have the same prefix.
> 
> I've tried using AC_CREATE_PREFIX_CONFIG_H from
> http://www.gnu.org/software/ac-archive/Miscellaneous/ac_create_prefix_config_h.html
> 
> but it doesn't work so well when the input and the output file are the
> same.
> 
> I've got a simple sed command followed by a mv to do the prefixing
> for me, but I'm not sure how to get these commands run after config.h
> is created; AC_CONFIG_COMMANDS comes close, but config.status doesn't
> always run my commands, this is probably because I am using the wrong
> tag, but since I can't use the tag of config.h I'm stuck.
> 
> I've also tried AC_CONFIG_HEADERS, but automake complains:
> automake requires `AM_CONFIG_HEADER', not `AC_CONFIG_HEADER'
> 
> Any help appreciated.
> 
> autoconf 2.52
> automake 1.5d
> 
> --
> Clinton Roy
> 
> Meetings - ``Try, or no try; there is no do.''

-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: how to prefix definitions in config.h

2002-02-19 Thread Guido Draheim

Clinton Roy wrote:
> 
> Guido Draheim <[EMAIL PROTECTED]> writes:
> 
> > may I ask what you are *really* trying to achieve?  Personally, I
> > would like to depracate to prefix a config.h file in place, even
> > more if you would really install it along with your library - I've
> > seen to many include-order conflicts for this case, so please don't
> > do that.
> That's exactly what we're trying to do ;)
> 
> If the config.h header is installed in $prefix/include/package/config.h,
> and all it's definitions are prefixed with the package name, we think
> we've done everything needed to avoid name clashes.
> 
> > Secondly, if I read your post well, then you try to modify a file,
> > but not twice, but want to rerun it. Am I confused?
> Hrm. Maybe not you, but I definitely am ;)
> 
> I would like config.h generated in the normal way via
> AC_CONFIG_HEADER, then prefixed with our package name.

I see,
I'm doing the same thing, and in fact all libraries should put
their headers under a subdir prefix; sadly not all of them do it,
and automake's support for such a style is limited. Anyway,
two ways.

a) I'm using package/_config.h - this even hints to the reader that
   this file is not a normal header (it is generated!) and it does
   not get confused with the standard config.h file.
b) look at the arguments of AC_CONFIG_HEADER - you can specify a
   different output-file, so that you don't need to have the two
   same-named.

sadly, there is not enough automation in my ac-macro for it to
detect the different ac_config_header, so you have to give it as
an explicit argument.

so much for the problem of the same-named file, the other is...

> 
> > Well, might be. Anyway, I don't expect things to be easy, for the
> > case that the input and output file are in the same position.
> That's the main problem I've had with AC_CREATE_PREFIX_CONFIG_H, it
> doesn't store the output of it's sed command in a temporary file and
> move it across the original, so in my case the original is lost.
> 
> > Thirdly, you didn't mention that the ac-macro has a few optional
> > arguments, one being input-file, the other being output-file, which
> > you didn't mention again. What's wrong with that, and if it is not
> > sufficient, why didn't you contact me?
> Not contacting you directly was an oversight on my behalf. I really am
> quite confused as to why prefixing of definitions isn't a standard
> option already. The other problem I can see with
> AC_CREATE_PREFIX_CONFIG_H is that it doesn't get put into
> config.status and thus I think it would only be run once.

... the other problem is that the prefixing is not done during 
reconfig, and in fact, the items in my ac-macro should be appended
to the config.status file, which I didn't bother to do so far,
and I didn't have enough time to figure it out.

> 
> Unfortunately when I try to use AC_CONFIG_COMMANDS to get the
> prefixing done automatically on reconfiguration, I don't know what to
> use as the tag to always have the command run, and it only gets run in
> the initial configure invocation.
> 

... and btw, of course, I do not need the intermediate config.h file
either, I wouldn't have a problem if s.o. could come up with a macro
reading AC_PREFIXED_OUTPUT_HEADER that would be put in the place of
AC_OUTPUT_HEADER today - I do only ifdefs on the prefixed stuff, not
the original config stuff, so there's no need for the intermediate
file ;-)

cheers,
-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: how to prefix definitions in config.h

2002-02-19 Thread Guido Draheim


I forgot to mention that I would like to keep the ac-macro
backward compatible, at least for some time, since most linux
distros still ship with autoconf 2.13 as the default. Anyway,
the macro has come to be one of the most used by library
makers, so you're right, one might want to consider adding
it to the main autoconf. Well, in that case, it should be
done right and true, not just done and useful ;-) ... so it
should at least be able to add itself to config.status 8-] --guido

Guido Draheim wrote:
>
> > option already. The other problem I can see with
> > AC_CREATE_PREFIX_CONFIG_H is that it doesn't get put into
> > config.status and thus I think it would only be run once.
> 
> ... the other problem is that the prefixing is not done during
> reconfig, and in fact, the items in my ac-macro should be appended
> to the config.status file, which I didn't bother to do so far,
> and I didn't have enough time to figure it out.
> 
> >
> > Unfortunately when I try to use AC_CONFIG_COMMANDS to get the
> > prefixing done automatically on reconfiguration, I don't know what to
> > use as the tag to always have the command run, and it only gets run in
> > the initial configure invocation.
> >
> 
> ... and btw, of course, I do not need the intermediate config.h file
> either, I wouldn't have a problem if s.o. could come up with a macro
> reading AC_PREFIXED_OUTPUT_HEADER that would be put in the place of
> AC_OUTPUT_HEADER today - I do only ifdefs on the prefixed stuff, not
> the original config stuff, so there's no need for the intermediate
> file ;-)
>




Re: how to prefix definitions in config.h

2002-02-19 Thread Guido Draheim


ac_create_prefix_config_h does not only prefix the names, the
entries also get converted into ifdefs. That was necessary since
I had some problems when this prefixed header was included multiple
times, as there is no ifdef-once header/footer for the file itself.
That's my solution to the problem of making the results of the
autoconf checks available via a generated header whose values come
from ac_define's. -- guido
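As a rough illustration of the prefixing step in plain shell (this is not
the macro itself - the real macro additionally converts each entry into the
ifdef form described above; "PKG" and the file names are placeholders):

# prefix every #define in config.h and write the result to a separate
# file; going through a temporary file avoids clobbering anything
sed -e 's/^#define  */&PKG_/' config.h > pkg/_config.h.tmp &&
mv pkg/_config.h.tmp pkg/_config.h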



Russ Allbery wrote:
> 
> Guido Draheim <[EMAIL PROTECTED]> writes:
> 
> > I'm doing the same thing, and in fact, all libraries should put their
> > headers with a subdir prefix, sadly not all of them do it, and
> > automake's support for such a style is limited. Anyway, two ways.
> 
> > a) I'm using package/_config.h - this even hint the reader that this
> >file is not a normal header (it is generated!) and it does not
> >get confused for the standard config.h file.
> > b) look at the argument of AC_CONFIG_HEADER - you can specify a
> >different output-file, so that you don't need to have the two
> >same-named.
> 
> The name isn't the problem.
> 
> Suppose that you have two installable libraries that use autoconf, and in
> both of those libraries you need to probe for a few system features, like
> the proper types, and change the headers and prototypes based on that.
> Now suppose you want to write a program that uses both of those libraries.
> If you include both header files, then the separate package config.h files
> will potentially conflict with each other and result in a bunch of
> redefinition errors.
> 
> One thing that you can do is recognize that generally your interface only
> depends on a much restricted subset of the things that you probe for
> actual compilation, and therefore you only need to install a stripped down
> version of config.h.  For INN, I use the following awk script to generate
> that stripped-down version with only the symbols that the header files
> need to use, and just add new symbols to it as I need them for the API:
> 
> #! /bin/sh
> 
> ##  $Id: mksystem,v 1.1 2001/02/24 07:59:06 rra Exp $
> ##
> ##  Create include/inn/system.h from include/config.h.
> ##
> ##  include/config.h is generated by autoconf and contains all of the test
> ##  results for a platform.  Most of these are only used when building INN,
> ##  but some of them are needed for various definitions in the header files
> ##  for INN's libraries.  We want to be able to install those header files
> ##  and their prerequisites, but we don't want to define the normal symbols
> ##  defined by autoconf since they're too likely to conflict with other
> ##  packages.
> ##
> ##  This script takes the path to include/config.h as its only argument and
> ##  generates a file suitable for being included as <inn/system.h>.  It
> ##  contains only the autoconf results needed for INN's API, and the symbols
> ##  that might conflict with autoconf results in other packages have INN_
> ##  prepended.
> 
> cat <<EOF
> /* Automatically generated by mksystem from config.h; do not edit. */
> 
> /* This header contains information obtained by INN at configure time that
>is needed by INN headers.  Autoconf results that may conflict with the
>autoconf results of another package have INN_ prepended to the
>preprocessor symbols. */
> 
> #ifndef INN_SYSTEM_H
> #define INN_SYSTEM_H 1
> 
> EOF
> 
> awk -f - $1 <<'---END-OF-AWK-SCRIPT---'
> 
> /^#define HAVE_INTTYPES_H/  { print save $1 " INN_" $2 " " $3 "\n" }
> /^#define HAVE_STDBOOL_H/   { print save $1 " INN_" $2 " " $3 "\n" }
> /^#define HAVE_SYS_BITTYPES_H/  { print save $1 " INN_" $2 " " $3 "\n" }
> 
> { save = $0 "\n" }
> 
> ---END-OF-AWK-SCRIPT---
> 
> cat <<EOF
> #endif /* INN_SYSTEM_H */
> EOF
> 
> --
> Russ Allbery ([EMAIL PROTECTED]) <http://www.eyrie.org/~eagle/>




Re: how to prefix definitions in config.h

2002-02-19 Thread Guido Draheim

Russ Allbery wrote:
> 
> Clinton Roy <[EMAIL PROTECTED]> writes:
> > Guido Draheim <[EMAIL PROTECTED]> writes:
> 
> >> ... the other problem is that the prefixing is not done during
> >> reconfig, and in fact, the items in my ac-macro should be appended to
> >> the config.status file, which I didn't bother to do so far, and I
> >> didn't have enough time to figure it out.
> 
> > I think this is now the crux of the problem - how to get a command to be
> > run by config.status every time config.h is generated - as we could use
> > your prefixing code, my sed script, or Russ's awk script, depending on
> > how we want to solve the installability problem.
> 
> I just let make handle this.  It knows about dependencies and running
> scripts, so as long as something important depends on the generated header
> file, it will take care of making sure it's generated.
> 

well, my ac-macro started out as some sed-lines in the makefile, but I
didn't want to copy it from project to project exchanging the prefix,
which was always derived from the package name.

of course, one could extend automake to see this generation and add
the appropriate lines to the makefile.in, but I guess that adding the
output-file to the list of ac_output should be good enough to have it
remade every time a reconf is needed.

so, it still boils down to adding some lines to config.status, and 
the only problem I see is to rewrite the macroitis currently being
used, since the config.status must be a plain shell script, all the
m4 specials must be removed from the code. And Clinton, do I read
it correctly that you have some experimental code about it, could
you just show that?
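Just to make the idea concrete, an untested sketch of such a hook in
configure.ac (the tag, the PKG prefix and the file names are made up, and
whether this reruns on every config.status invocation is exactly the open
question of this thread):

AC_CONFIG_HEADER([config.h])
AC_CONFIG_COMMANDS([pkg-prefix-config],
  [sed -e 's/^#define  */&PKG_/' config.h > pkg/_config.h])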

TIA
-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: exact-width C Integers and Autoconf

2002-02-21 Thread Guido Draheim


Simon, 
you contacted me about ac_create_stdint_h.m4 - no success with it?
-- guido

Paul Eggert wrote:
> 
> > From: "Simon Waters" <[EMAIL PROTECTED]>
> > Date: Thu, 21 Feb 2002 02:31:30 -
> 
> > I need to define 32 and 64 bit integer types in reasonably portable C.
> 
> I assume you mean exact-width 32 and 64 bit types.  The C standard
> does not require the existence of such types; for example, you might
> be on a 36-bit host that has 36-bit int and 72-bit long.  However,
> such machines are admittedly rare these days.
> 
> > Linux can include "inttypes.h" these days, so I'm happy to rename to the
> > "standard" names.
> 
> Yes, that's the better way in the long run.  The standard names will
> take over eventually.
> 
> > Is there an autoconf example to follow?
> 
> Not that I know of.  I would put this into configure.ac:
> 
> AC_CHECK_HEADERS(inttypes.h limits.h stdint.h)
> 
> and put something like the following into my C code:
> 
>   #include 
> 
>   #if HAVE_LIMITS_H
>   # include <limits.h>
>   #endif
> 
>   #if HAVE_INTTYPES_H
>   # include <inttypes.h>
>   #else
>   # if HAVE_STDINT_H
>   #  include <stdint.h>
>   # endif
>   #endif
> 
>   #ifndef INT32_MAX
>   # if INT_MAX == 2147483647
>   #  define int32_t int
>   #  define INT32_MAX 2147483647
>   # else
>   #  if LONG_MAX == 2147483647
>   #   define int32_t long
>   #   define INT32_MAX 2147483647L
>   #  else
>   This code assumes the existence of 32-bit exact-width
>   integers, and will not work on machines that lack such a type.
>   #  endif
>   # endif
>   #endif
> 
>   /* Avoids integer overflow on 32-bit hosts.  */
>   # define EQUALS_INT64_MAX(x) \
>   ((x) / 65536 / 65536 == 2147483647 && (x) % 2147483647 == 1)
> 
>   #ifndef INT64_MAX
>   # if EQUALS_INT64_MAX (INT_MAX)
>   #  define int64_t int
>   #  define INT64_MAX 9223372036854775807
>   # else
>   #  if EQUALS_INT64_MAX (LONG_MAX)
>   #   define int64_t long
>   #   define INT64_MAX 9223372036854775807L
>   #  else
>   #   if EQUALS_INT64_MAX (LLONG_MAX)
>   #define int64_t long long
>   #define INT64_MAX 9223372036854775807LL
>   #   else
>This code assumes the existence of 64-bit exact-width
>integers, and will not work on machines that lack such a type.
>   #   endif
>   #  endif
>   # endif
>   #endif
> 
> Admittedly EQUALS_INT64_MAX is a bit tricky.  Perhaps something like
> this should be turned into an autoconf macro, though we probably need
> more experience.  I'll CC this message to [EMAIL PROTECTED] to see if
> there are other opinions.
> 
> > This is a multi-part message in MIME format.
> 
> Please send text mail as plaintext; it saves me time.  Thanks.

-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




configure.in vs. configure.ac clash

2002-03-02 Thread Guido Draheim


while trying to create an rpm for the mp4h project, I ran into the
following problem:

- the toplevel configure has configure.ac
- a subdir libltdl/ is copied in, which contains a configure.in
- the rpm macro %configure will expand to a line like
   CFLAGS=$RPM_CFLAGS configure --prefix 
- the toplevel configure will call the ltdl/configure using...
   configure sometarget --prefix... 'CFLAGS=$expanded_RPM_CFLAGS'

and BOOM.
Yepp, I can fix it by *not* using %configure with the extra CFLAGS settings.
Anyway, interesting to see ;-) .. have fun, guido

P.S. here are the last lines:

config.status: creating config.h
configure: configuring in libltdl
configure: running /bin/sh './configure'  i586-mandrake-linux-gnu --prefix=/usr 
--exec-prefix=/usr --bindir=/usr/bin
--sbindir=/usr/sbin --sysconfdir=/etc --datadir=/usr/share --includedir=/usr/include 
--libdir=/usr/lib --libexecdir=/usr/lib
--localstatedir=/var/lib --sharedstatedir=/usr/com --mandir=/usr/share/man 
--infodir=/usr/share/info --with-modules 'CFLAGS=-O3
-fomit-frame-pointer -pipe -mcpu=pentiumpro -march=i586 -ffast-math 
-fno-strength-reduce' build_alias=i586-mandrake-linux-gnu
host_alias=i586-mandrake-linux-gnu target_alias=i586-mandrake-linux-gnu 
--enable-ltdl-convenience --cache-file=/dev/null
--srcdir=.
configure: warning: CFLAGS=-O3 -fomit-frame-pointer -pipe -mcpu=pentiumpro -march=i586 
-ffast-math -fno-strength-reduce: invalid
host type
configure: error: can only configure for one host and one target at a time
configure: error: /bin/sh './configure' failed for libltdl
sed: can't read confdefs.h: No such file or directory
Fehler: Bad exit status from /my/rpm/tmp/rpm-tmp.16825 (%build)




Re: configure.in vs. configure.ac clash

2002-03-02 Thread Guido Draheim

Alexandre Duret-Lutz wrote:
> 
> >>> "gd" == Guido Draheim <[EMAIL PROTECTED]> writes:
> 
>  gd> while trying to create an rpm for the mp4h project, I jumped on the
>  gd> following problem:
> 
>  gd> - the toplevel configure has configure.ac
>  gd> - a subdir libltdl/ is copied in, which contains a configure.in
>  gd> - the rpm macro %configure will expand to a line like
>  gd> CFLAGS=$RPM_CFLAGS configure --prefix 
>  gd> - the toplevel configure will call the the ltdl/configure using...
>  gd> configure sometarget --prefix... 'CFLAGS=$expanded_RPM_CFLAGS'
> 
>  gd> and BOOM.  Yepp, I can fix it while *not* using %configure
>  gd> with the extra CFLAGS settings.
> 
> The real fix is to rerun autoconf in both directories to make
> sure the configure is generated by the same version of Autoconf
> (Your libltdl/configure obviously comes from Autoconf 2.13).
> 

sure, but rpm is traditionally made from the original tarball,
and the local distro doesn't ship a 2.52 that I could simply
make as a BuildRequirement to reconf just before %configure.
So in the end, the real fix is to report it as a bug to the
original author and let him make a new tarball.

It's just been interesting to see such interactions between
the versions. I have some superprojects here on my disk which
combine a couple of autoconf'ed projects under a master
project that configures them all in one run. The real projects
are just in subdirectories like here, and some are old autoconf,
some are new generation (it seems some projects have not been
converted or they are not that easy to convert). The issues
shown above make it an unpleasant move to have the superproject
use the new autoconf generation. Personally, for a simple
superproject, that isn't needed anyway - mp4h however has
used the toplevel configure to do real config stuff for its
src/ subdirectory and only later added the libltdl/ subdir
to gain support from libtool. That brought in the
current setup, and since it's just libtoolizing, the author
has probably opened a trap, and I hit it while trying an rpm.

cheers,
-- guido        http://freespace.sf.net/guidod




Re: cross-compiling question: static libraries and binaries to different places?

2002-03-04 Thread Guido Draheim


--bindir vs. --libdir ?

Dan Kegel wrote:
> 
> I'm cross-developing.  I want to build a package
> that has both static libraries and binaries.
> The binaries should go to the target system;
> the libraries should stay on the build system.
> What do I pass to configure and to make?
> 
> If I do
> configure --build=pentium-unknown-linux --host=@IXIA_K_ARCH@-unknown-linux
> --disable-shared --with-gnu-ld --prefix=/usr
> make -C @IXIA_PORTARCH@/src/lib DESTDIR=$(DEST) install
> 
> the library ends up in the right place (DEST/usr/lib)
> but the binary ends up in the wrong place (DEST/usr/bin).
> 
> If instead I do
> 
> make -C @IXIA_PORTARCH@/src/lib DESTDIR=$(DEST)/fsimg install
> the library ends up in the wrong place (DEST/fsimg/usr/lib)
> but the binary ends up in the right place (DEST/fsimg/usr/bin).
> 
> What to do?  In cross-development environments, is it not
> supported to have static libraries go to the build system,
> but binaries go to the target?
> 
> - Dan




Re: cross-compiling question: static libraries and binaries to different places?

2002-03-04 Thread Guido Draheim


So, you want the shared libraries to go to the
target system during install, and the static libraries
to stay. May I ask how you expect the linker to
resolve against the shared library if that one is
remote? The static lib isn't the same, so, *hhmm* I am
a bit confused. So far, I would have built two times,
once for the target system and once for the build system.

However, I see your point - unlike plain binaries, the
crosscompiled libraries need to be doubled for
both the real target and the build system. The
same would apply to quite some other things,
like include-headers and manpages, that stay on the build
system (it doesn't make sense for quite some targets,
esp. if those don't have a console, and we just use
a mount-path to install onto, or the whole install
procedure is just about making a destdir archive
that can be copied over). It is however
uncertain, e.g. on some build hosts you could not
read the manpages, while they are needed on the target
host. Or just the other way round, useless on the
target host, and useful on the build host.

It's not easy to decide about that. It all brings us
back to the question of whether to add some configure
options to make a multi build - currently, I do this
using a handmade toplevel configure that creates two
subdirectories and then calls the real autoconf
configure from each subdirectory, i.e.

./configure <- toplevel one
./src/configure.ac <- autoconf one
./src/configure    <- generated

and the toplevel configure does check the options, so
it can see all kinds of conditions that require
multiple builds - in that case it will then do

(test -d debug || mkdir debug) && \
   (cd debug && ../src/configure $* --disable-shared)
(test -d release || mkdir release) && \
   (cd release && ../src/configure $* --with-optstuff)

And ensure to have a toplevel makefile that recurses
into all build subdirs, for both build and
install runs. Of course, you could add any other
option to the two configure lines, whatever conditions
you want to check for. Well, I never had the time to
make up some tool that could create the toplevel
configure and makefile automatically - in your case,
you want the two modes only in the case of
crosscompiling, and this is a quite common wish.

Hmm, wherever that ends. Maybe you can live with
the model just presented; at least it works, that
much I can assure you.

-- guido



Dan Kegel wrote:
> 
> You'd think so, but playing games like that might really
> confuse libtool.
> 
> What I'd like to see is a fully-worked out example of how
> to use libtool, with both static and shared libraries,
> in a cross-compile situation, without the static libraries
> leaking out onto the target system.  We may need to split
> --libdir into --libdir and --buildlibdir, or something
> awful like that?
> 
> Thinking about libtool and cross-compiling is giving me
> serious heartburn.
> - Dan
> 
> Guido Draheim wrote:
> >
> > --bindir vs. --libdir ?
> >
> > Es schrieb Dan Kegel:
> > >
> > > I'm cross-developing.  I want to build a package
> > > that has both static libraries and binaries.
> > > The binaries should go to the target system;
> > > the libraries should stay on the build system.
> > > What do I pass to configure and to make?
> > >
> > > If I do
> > > configure --build=pentium-unknown-linux --host=@IXIA_K_ARCH@-unknown-linux
> > > --disable-shared --with-gnu-ld --prefix=/usr
> > > make -C @IXIA_PORTARCH@/src/lib DESTDIR=$(DEST) install
> > >
> > > the library ends up in the right place (DEST/usr/lib)
> > > but the binary ends up in the wrong place (DEST/usr/bin).
> > >
> > > If instead I do
> > >
> > > make -C @IXIA_PORTARCH@/src/lib DESTDIR=$(DEST)/fsimg install
> > > the library ends up in the wrong place (DEST/fsimg/usr/lib)
> > > but the binary ends up in the right place (DEST/fsimg/usr/bin).
> > >
> > > What to do?  In cross-development environments, is it not
> > > supported to have static libraries go to the build system,
> > > but binaries go to the target?
> > >
> > > - Dan

-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: cross-compiling question: static libraries and binaries to different places?

2002-03-04 Thread Guido Draheim

Dan Kegel wrote:
> 
> Yes.  Doc and headers would stay on the build system.
> 
> > It is however
> > uncertain, e.g. on some build hosts, you could not
> > read the manpages, while they are needed on the target
> > host. Or just the other way round, useless on the
> > target host, and useful on the build host.
> 
> Not worried about those cases.  They are well served
> by the current situation.

*grin*

> 
> > It's not easy to decide about that. It all brings us
> > back to the question to consider some configure
> > options to make a multi build - currently, I do this
> > using a handmade toplevel configure that creates two
> > subdirectories, and then calls the real autoconf
> > configure from that subdirectory.
> 
> Yes, I'm doing something like that now.  (For each
> open source package I install, I have a top-level
> Makefile.in that knows how to unpack, configure, and
> install the open source package.)
> 
> I don't think this does the trick, though.  I can't see
> how it lets you install binaries and shared libs to a staging
> area for transfer to the target, and everything else to
> their final location on the build system, while making
> sure that libtool is told the proper final location
> of shared libraries as they will appear upon boot of the target.

now that's a good one - the .la file does have the information
about where the lib will be on the target system. And we want to
retain it on the build system so that other packages
can benefit from that information. That would be
perfect for compiling a complete series of packages when
cross-compiling. And obviously, the whole thing of two
configure/build runs is just too much; the created libraries
and binaries are the same, so the work is doubled without
need. IYAM, the problem is not specifically in libtool or
autoconf, but in automake - we just need another install
target like "make install-buildfiles". Okay, we would need
one additional vector in configure, something like a
--build-prefix, so that it gets patched in just so, but it
is not strictly needed; it would be about enough to
say make install-buildfiles BUILDPREFIX=xx. However, hmm,
*scratchinghead* in the case that we did run a crosscompiler,
configure does know the default location of our cross
build tools anyway *hmmm* - still, the biggest support would
be needed in the makefile, to install into a buildtool path
and WITHOUT the need to relink there.

does this get us on the right track?
-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: ac-archive-macros-2006.0927.tar.bz2

2006-10-17 Thread Guido Draheim
Peter Simons' mail does contain a lot of allegations - I will
refrain from clarifying at the same length. I hope that quite some
people on the Autoconf list know me well enough to assume that
there have never been bad intentions whatsoever on my side.

Peter contacted me on Sunday to ask about the
redistribution of m4 macro files at ac-archive.sf.net that
had been published before at autoconf-macro.cryp.to - calling
it "his work". I pointed out that the GPL does allow redistribution
of GPL-protected work including others' extensions - only to find
myself accused of GPL violation yesterday, and without waiting
for a response he wrote a few hours later to the sourceforge staff
requesting their intervention in the ac-archive.sf.net project.

Technically, I noted to him that the ac-archive.sf.net website
already had multiple links to the cryp.to archive. The sf-net project
is about the integration of multiple sources, with the cryp.to archive
being one among others - the cryp.to archive is a bit special
here as it in turn integrates other people's work. Since the html
pages were already honouring the cryp.to source and the m4 files
never contained a hint back, I have decided to redistribute the
README COPYING AUTHORS files from the cryp.to tarball as well
in today's update (ac-archive-macros-2006.1017.tar.bz2), in an
attempt to comply with the alleged requirement (in his mail to
sourceforge) to have something in there that "identifies me (i.e.
Peter) as the compilation copyright owner". Plus, every single
html page from a cryp.to macro now carries a hotlink back there.

On another account, everyone is invited to partake in the sf-net
autoconf macro archive. I have remade the build chain in python
so that everyone can create the full website - the source
repository is public in the sourceforge cvs. (btw, Peter has
never published his build tools - the tarball is incomplete.)
As an extra, the refurbished ac-archive.sf.net (which is still
not up to date in quite a number of places) contains unix manual
pages generated from the macro files. The docbook intermediate
xml might be used for other derived work as well. Your choice.

Have fun, everyone
-- Guido
P.S. the old tarball will be removed - I don't think it was bad
 but a new one is available anyway and I am always fine with
 lowering aggressiveness in this world.




acincludedir - populating an m4/ subdirectory

2007-02-04 Thread Guido Draheim
It is great that aclocal/autoconf can now m4_include multiple
files from a local m4/ subdirectory. Yet, I was missing a tool
to populate it. A while back I just modified the standard
aclocal to do that - i.e. instead of --output=FILE using an
--output-dir=DIR. However, as far as I can see, the current
cvs aclocal.in was not extended like that. That's sad, especially
given how easy it is to add.

So, I returned to my earlier acinclude tool (which itself is
based on some very old aclocal version). Applying a similar
patch, I am now symlinking it to be "acincludedir", which
populates a project-local m4/ subdirectory with autoconf
macros from a site-local macro repository. It works similarly
to the old acinclude. You can find the documentation at

http://ac-archive.sourceforge.net/doc/acincludedir.html

The source code is present in the download tarball of
the AC-archive, which now carries some more tools
that you might like. Please test and report any problems,
thank you,
-- Guido





macro_to_html + macro_to xml unix manpage

2007-02-04 Thread Guido Draheim
I have extracted some routines from the current AC-Archive
website engine (implemented in Python) to work standalone
on a single macro (or small number of macros if you like).

Given a macro ax_example.m4 you can now say
> ac_archive_macro_to_html ax_example.m4
which creates the output file ax_example.html
as it would be presented at ac-archive.sf.net

Additionally I have turned that into a CGI script at
http://ac-archive.sourceforge.net/doc/contribute.html
where you can test the macro submission format and
see how it would be formatted at the AC-Archive.
Have fun with it,

* Unix Manpage Generation

A long while back I decided to render the AC-Archive
macros also as docbook xml files. Using "xmlto" one
can create unix manpages for each single autoconf
macro in the AC-Archive. A complete man7/* tarball
can be downloaded directly from the website at
http://ac-archive.sourceforge.net/doc/introduction.html

Using that locally, you would say
> ac_archive_macro_to_docbook ax_example.m4
# creates a file ax_example.docbook
> xmlto man ax_example.docbook
# creates a file ax_example.7
> man -l ax_example.7

* -tools.rpm

All these tools are now installed as a separate rpm
ac-archive-tools-2007.0205-.noarch.rpm
If you do download the source tarball from the
http://sourceforge.net/project/showfiles.php?group_id=32081
then you would say
> make install-tools

That also installs a tool to rebuild any other autoconf
repository, named ac_archive_gendocs, and a combined tool for
html/docbook that prints to stdout, named just macro_to, i.e.
> ac_archive_macro_to xml ax_example.m4

Please test,
and have fun,
-- Guido










OT: cross platform tests? SF compile farm is dead

2007-02-18 Thread Guido Draheim
https://sourceforge.net/forum/forum.php?forum_id=665363
> As of 2007-02-08, SourceForge.net Compile Farm service
> has been officially discontinued.

> We feel that our resources are best used at this time
> in improving other parts of our existing SourceForge.net
> service offering, and in introducing other high-demand
> features. There are no immediate plans to offer a
> replacement for our Compile Farm service.


I was using the sourceforge compile farm for cross platform
support of my projects - it was flaky in the past (to say the
least) but I did not know of a better place where you can get
access to non-Linux machines easily. So where else to go?

What's your way to test for cross platform support?
Anywhere else to get access to a compile farm?

cheers, Guido




Re: OT: cross platform tests? SF compile farm is dead

2007-02-18 Thread Guido Draheim
Dirk wrote:
> Guido Draheim wrote:
>> https://sourceforge.net/forum/forum.php?forum_id=665363
>>> As of 2007-02-08, SourceForge.net Compile Farm service
>>> has been officially discontinued.
>>> We feel that our resources are best used at this time
>>> in improving other parts of our existing SourceForge.net
>>> service offering, and in introducing other high-demand
>>> features. There are no immediate plans to offer a
>>> replacement for our Compile Farm service.
>>
>> I was using the sourceforge compile farm for cross platform
>> support of my projects - it was flaky in the past (to say the
>> least) but I did not know of a better place where you can get
>> access to non-Linux machines easily. So where else to go?
>>
>> What's your way to test for cross platform support?
>> Anywhere else to get access to a compile farm?
>>
>> cheers, Guido
> 
> VMware
> 

Yeah, but that wouldn't cover ppc-osx or sparc-solaris etc., and in
reality I don't have install CDs either to get them set up.
*sigh* Sure, I could replace the bunch of ix86 Linux boxes
on the SF CF (and Win32 had to be done in VMware in the
past anyway) but that's hardly cross platform.

The only fallback I see so far is to get back to some old
contacts that were providing access to one of their
machines - usually in remote places. But my experience
shows that this is quite a PITA - not least because
every time they reinstall the machine they forget about
any ssh key given to external partners, plus you can't
quite be sure what's on the machine the next day. If it's
available at all.

oh well, Guido





Re: OT: cross platform tests? SF compile farm is dead

2007-02-18 Thread Guido Draheim
Thomas Dickey wrote:
> On Thu, 15 Feb 2007, Dalibor Topic wrote:
> 
>> HP's test drive covers a bunch of platforms. For the rest: qemu,
>> vmware, aranym, hercules, gxemul, or (least desirable, imho) real
>> hardware.
> 
> It's less than it was (about half the platforms went away last fall).
> 

However it looks promising - especially the breed of compilers and
the hpux/tru64 machines are a nice thing to have. I just signed up
and I'll check this out later - thanks for the pointer.





Re: How to avoid warning: "PACKAGE_NAME" redefined

2007-03-10 Thread Guido Draheim
Reminder:
- http://ac-archive.sourceforge.net/guidod/ax_prefix_config_h.html

The macro creates a per-package config.h that can be installed.
It avoids having to create a special ace-config.h manually.
Point the ACE developers to it, as installing the plain config.h is simply wrong.
And use it yourself, thereby avoiding the standard config.h defines altogether.
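A minimal configure.ac sketch of that (untested; "foo" is a placeholder
package name, and the assumed argument order OUTPUT-HEADER, PREFIX,
ORIG-HEADER should be double-checked against the macro page above):

AC_CONFIG_HEADER([config.h])
AX_PREFIX_CONFIG_H([foo/foo-config.h], [foo], [config.h])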

cheers, Guido

haibin zhang wrote:
> Hi all:
> When I build my project, when I include <ace/Configuration_Import_Export.h> in my source file, it 
> will warn me :
> In file included from /opt/ace/include/ace/config-macros.h:24,
>  from /opt/ace/include/ace/config-lite.h:24,
>  from /opt/ace/include/ace/Basic_Types.h:46,
>  from /opt/ace/include/ace/SStringfwd.h:22,
>  from /opt/ace/include/ace/Configuration.h:34,
>  from /opt/ace/include/ace/Configuration_Import_Export.h:28,
>  from ../../../CallCompletion/Codes/libnet/Properties.cpp:8:
> /opt/ace/include/ace/config.h:2050:1: warning: "PACKAGE_NAME" redefined
> In file included
> 
> I found that I use ACE project , ACE has include config.h that generated by 
> autoconf. and it has defined PACKAGE_NAME.
> 
> How can I  avoid warning: "PACKAGE_NAME" redefined?
> 
> Regards
> 
> Zhang HaiBin
> 
>   
> 





Re: Overloaded Function Checks

2007-03-15 Thread Guido Draheim

Eric Lemings wrote:
> Hi,
>  
> I want to write an Autoconf macro that checks for overloaded functions
> in C++ (assuming of course all basic C++ configuration checks have been
> done).  For example, does the C++ compiler support function calls to
> abs() in <cstdlib> or <cmath> for all integer types ranging from 'bool'
> up to 'unsigned long long'?

If it's just for integers then it should be easy to implement. Instantiate a
variable of the type in question and try to call the function on it.
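For instance, along the lines of "instantiate a variable and try", an
untested sketch (not an archive macro; the define name HAVE_CXX_ABS_LONG
is made up) could check the std::abs(long) overload like this:

AC_LANG_PUSH([C++])
AC_COMPILE_IFELSE(
  [AC_LANG_PROGRAM([[#include <cstdlib>]],
                   [[long v = -1; v = std::abs(v); (void) v;]])],
  [AC_DEFINE([HAVE_CXX_ABS_LONG], [1],
             [define if the std::abs(long) overload is usable])])
AC_LANG_POP([C++])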

> 
> I couldn't find such a macro in the latest Autoconf distribution or the
> Autoconf macro archive.  Any help would be appreciated.

But you could take the check for a C++ overloaded function exported from a
library as a reference. I hope you can find it in the archive, see
http://ac-archive.sourceforge.net/guidod/ax_cxx_check_lib.html

> 
> Thanks,
> Eric.
> 
> 
> 





Re: Proposed patch for ax_enable_builddir.m4

2007-08-05 Thread Guido Draheim
Julian Cummings wrote:
> Attached is a proposed patch for the autoconf macro ax_enable_builddir.m4.
> The patch universally replaces "host" with "build" and "HOST" with "BUILD".
> The rationale is that typically the user wishes to segregate builds based
> upon the BUILD target rather than the configuration HOST type.  Now that
> these host and build variables are treated as more fully distinct in
> autoconf, it makes sense to honor this disticntion.  I notice that you
> created a separate macro ax_enable_builddir_uname.m4 that sort of does the
> same thing, except that it explicitly uses the command "uname -msr" instead
> of relying on the existing autoconf macro AC_CANONICAL_BUILD.  I don't see
> the point of this.  (Also, this "uname" macro omits support for transferring
> the config auxiliary directory $ac_aux_dir for some reason.)  Perhaps with
> my proposed patch, the ax_enable_builddir_uname.m4 macro becomes
> obsolete...?  Please let me know your thoughts.
>
> Regards, Julian C.
>
> Dr. Julian C. Cummings
> Staff Scientist, CACR/Caltech
> (626) 395-2543
> [EMAIL PROTECTED]
>

Sorry, Julian, but that's completely wrong from my POV. The macro had been
invented in a build environment using crosscompilers - there is a compiling
BUILD host that targets the compiled binary at a runtime HOST. The GCC
derived canonical system defines have an additional TARGET, which implies
that the compiled runtime binary is itself a compiler compiling for a
third host type.

On such a crosscompiler build host it is very natural to install multiple
compilers which have a name prefix defaulting to the runtime $HOST. Now
have a look at /usr/share/autoconf/autoconf/c.m4 AC_PROG_CC and its use
of $ac_tool_prefix, which you find again in autoconf/general.m4 set to
test -n "$host_alias" && ac_tool_prefix=$host_alias-

Now that's where the differentiation comes from for different build
subdirectories that are meant to be used for different build environments.
Of course, I was using the macro also for other projects where one would
not use a crosscompiler but instead remote-login to different
system hosts and use the native compiler ($build == $host) in a
/home directory mounted from a central place. That was the case for
the sourceforge compilefarm (RIP) at least, and I guess it is a common
scenario for other multi-target development labs.

So, it is really the runtime $HOST that I like to distinguish, and for
a remote login with a native compiler (compiling for its own host type)
it is equal to the $build defines.

The _UNAME variant has a different reason however. The original macro
was used in a build environment where a wrapper script was already
presetting the runtime target host via some shell variables. And it
was using the naming scheme of the config.sub parts. During later
development I noticed it to be somewhat cumbersome on two
accounts
(1) in most environments there is no shell variable $HOST predefined,
and there were problems executing the config.guess shell-script
(mainly because I moved it to a subdirectory and different
automake/autoconf versions were shuffling variables around for how
to find the path to it)
(2) config.guess distinguishes fewer host types as it usually
strips off versions and vendor specifics. That is especially true
for the load of Linux distros which are generally put under the
common name of i686-suse-linux-gnu. In the sourceforge compilefarm
it could not distinguish between suse, redhat, debian, etc, but
uname -msr can give us at least slightly different kernel versions.

So effectively, the uname -msr is easier to handle - I can easily run
a custom make (make HOST=`uname -msr`) to target a different runtime
host in a crosscompiler environment, and secondly it makes it
easier to get away with a compilefarm layout where multiple compile
hosts mount the same home directory for the build sources.

cheers, Guido
(CC'd to autoconf ML for broader reference - ac-archive ML being dead)





Re: Proposed patch for ax_enable_builddir.m4

2007-08-05 Thread Guido Draheim
Peter Simons wrote:
> Guido Draheim writes:
> 
>  > Peter Simons writes:
>  >
>  >> Hi Julian,
>  >>
>  >> thank you for taking the time to update the macro. Your patch
>  >> certainly feels reasonable to me, so I applied it to the macro:
>  >>
>  >>   http://autoconf-archive.cryp.to/ax_enable_builddir.html
>  >>
>  >> I also followed your advice and marked AX_ENABLE_BUILDDIR_UNAME
>  >> as obsolete.
>  >>
>  >> Guido, you are the principal author of those macros. I make those
>  >> changes in good faith that they are okay with you.
>  >>
>  >> Thank you for supporting free software to both of you.
>  >>
>  >> Best regards,
>  >> Peter
>  >
>  > Peter, on such occasions it is good idea to put the name of
>  > the new maintainer in front of the authors list - the _UNAME
>  > macro is actually the new one while I do consider the old one
>  > as obsolete. Obviously, the longer name could inspire the idea
>  > that it came later in development and it is newer therefore.
>  > As the old macro is used quite heavily in some projects, I did
>  > not want to occupy its name for something else.
> 
> Well, Guido, what can I say?
> 
> You have received Julian's patch. Julian has put quite a bit of
> effort into describing his rationale. As usual, you just don't
> respond. After a few days, I decide to commit the patch because
> it looks absolutely reasonable to me and I have no indication
> that you think otherwise. As usual, _after_ the commit has been
> made, you suddenly respond with a massive, long-winded essay that
> gives me headaches when I try to decipher it. Not that there is
> much to decipher, because the message is loud and clear: "Sorry,
> but that's completely wrong." And naturally, you also see the
> need to drag a private e-mail conversation to a public mailing
> list for no apparent reason other than an audience.
> 
> I really love it when I have to deal with you.
> 
> I don't feel qualified to judge the technical details of the
> change and I trust that you, Julian, and all other interested
> parties find some sort of consensus what the AX_ENABLE_BUILDDIR
> macro should look like. In order to provide context for everyone,
> here are the commits we're talking about:
> 
>   http://tinyurl.com/ys75r2
>   http://tinyurl.com/2yuh5f
> 
> To address your last comment. The order of the authors in the
> macro file is more or less random; it's typically simply the
> order in which people contributed to the macro. The person who
> has been listed first is not necessarily the one who contributed
> the most or the most frequently. In other words: there is no such
> thing as the "first" author. Regardless of the author
> attribution, the maintainer of the macro am I because I maintain
> the Archive.
> 
> The AX_ENABLE_BUILDDIR_UNAME macro has been obsoleted because it
> duplicates functionality from AC_CANONICAL_BUILD. To me, it is
> not clear what this macro is supposed to do and why it is better
> than AX_ENABLE_BUILDDIR. Can you please elaborate on that
> subject?
> 
> Best regards,
> Peter
> 

(a) I simply have NOT MUCH TIME in my life to write up longer
explanations; you know, it was sunday and I thought I'd clean
up my overflowing inbox a bit. That's all, the rest is your
interpretation.
(b) the message about author picking and obsoletion was directed
at you alone, as perhaps your mileage may vary. Nice to see
it pop up in public to wash some dirty linen of your own
cunning mind.
(c) The reasons for using the _UNAME macro in my own sources
have been described at quite some length. If that is not
intelligible to you then I do feel very sorry.

Have a nice day, Guido





Re: cross-compiling question: static libraries and binaries to different places?

2002-03-04 Thread Guido Draheim

Dan Kegel wrote:
> 
> Guido Draheim wrote:
> > > ...
> > > I don't think this does the trick, though.  I can't see
> > > how it lets you install binaries and shared libs to a staging
> > > area for transfer to the target, and everything else to
> > > their final location on the build system, while making
> > > sure that libtool is told the proper final location
> > > of shared libraries as they will appear upon boot of the target.
> >
> > now that's a good one - the .la file does have the information
> > where the lib will be on the target system. And we want to
> > retain it on the build system to have it ready for other
> > packages to benefit from the information. That would be
> > perfect to compile a complete series of packages during
> > cross-compiling.
> 
> Yes, but just because it gets copied to the staging
> area doesn't mean it's unavailable to the build system.
> We just have to tell whoever is looking for .la's and
> shared libraries at build time to look in the staging area.

not quite. at least this is not common behaviour at the
moment. See - the libtool crossgcc support (to which I did
contribute some of the time) can simply ask the cross-gcc
for the local searchpath via `gcc -print-search-dirs` -
this is needed for win32 compiles at least, and I have a
patch on my disk which generalizes the idea for all
cross-gcc targets (since there have been problems with
crosscompiling linux-to-linux). Now the result is simply
grepped for the "libraries:" entry, as it usually spits
out a couple of directories very close to the installation
of the crossgcc itself.

[guidod@pc3 guidod]$ i386-mingw32-gcc -print-search-dirs | grep libraries:
libraries: 
 /usr/lib/gcc-lib/i386-mingw32/2.95.3/:
 /usr/lib/gcc/i386-mingw32/2.95.3/:
 /usr/i386-mingw32/lib/i386-mingw32/2.95.3/:
 /usr/i386-mingw32/lib/

Now I'd say for the build-install, we could simply extract
the last entry of this setting - in this case it would be
 /usr/i386-mingw32/lib  - and a similar entry exists for
the native compiler, which would be /usr/lib on linux. This
entry is in the default libpath of the gcc-linker, and it
is damn well needed since other crosscompiled programs must
resolve against the libs found in the lib search path - if
there were no such default, you would need some other way
to have it added for libtool compiles. And the staging area
is never a default path that could be deduced from the
--host=cross-target identification, IMO.
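A rough way to pull out that last entry in plain shell (illustrative only,
assuming the unix ":" separator and a crossgcc named i386-mingw32-gcc):

# keep only the "libraries:" line, strip its prefix, split on ':' and
# take the last directory of the gcc library search path
last_libdir=`i386-mingw32-gcc -print-search-dirs | sed -n 's/^libraries: *//p' |
             tr ':' '\012' | sed -n '$p'`
echo "$last_libdir"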


> 
> > However, hmm,
> > *scratchinghead* in the case that we did run a crosscompiler,
> > the configure does know the default location of our cross
> > build tools anyway *hmmm* - still, the biggest support would
> > be needed in the makefile to install into a buildtool path
> > and WITHOUT the need to relink there.
> >
> > does this get us on the right track?
> 
> I dunno, I think we need an example project to illustrate this.
> A 'hello world' where main writes 'hello' and a shared libarary
> writes 'world', maybe.

it is even more complicated, since a single hello/sharedlib does
not show the problem space. We need two projects: one creates
a sharedlib and installs it, and the other depends on it.
We check that the second configure can see the necessary header
files (or even libraries?), and that the build can correctly
resolve against these build-installed libraries.

cheers,
-- guido        http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: cross-compiling question: static libraries and binaries to different places?

2002-03-08 Thread Guido Draheim

Dan Kegel wrote:
> 
> Guido Draheim wrote:
> > ... See - the libtool crossgcc support (to which I did
> > contribute some of the time) can simply ask the cross-gcc
> > for the local searchpath via `gcc -print-search-dirs` -
> > this is needed for win32 compiles atleast, and I have a
> > patch on my disk which generalizes the idea for all
> > cross-gcc targets (since there have been problems with
> > crosscompiling linux-to-linux).
> 
> You mean this?

YES!! I just didn't know how to make an argument about it on
the mailinglist; maybe the sheer number of our voices can
get around that ;-) - oh btw, your patch needs an enhancement for
the last s/;/ /g since on unix a ":" is used. There was a
discussion a while back about the use of path_separator here,
but the final solution was to cut the line into three, and
examine the raw print-search-dirs string for occurrences of
";". If yes, use that one as the path_separator; if no,
then use ":" and assume that unix path parts never
contain ";" either but have ":" as the path separator. (and
IIRC, there was some rumour about differences with a gcc
on win32 hosted in a unix personality, so you can't be that
sure about the ";" even on win32.) Anyway, you actually got
the idea even though I didn't say anything explicit, so it makes
me confident that this is an obvious flaw (a bug?) in the
libtool crosscompiling support. It needs to be fixed.
go ahead, I'm all with you, guido

> 
> --- libtool.m4.orig Thu Mar  7 16:58:42 2002
> +++ libtool.m4  Thu Mar  7 17:04:23 2002
> @@ -2312,6 +2312,13 @@
>dynamic_linker=no
>;;
>  esac
> +
> +# When cross-compiling, get the list of system library directories
> +# from gcc if possible, since hardcoded paths above are surely wrong.
> +if test "$GCC" = yes && test "$cross_compiling" = yes; then
> +  sys_lib_search_path_spec=`$CC -print-search-dirs | grep "^libraries:" | sed -e "s/^libraries://" -e "s/;/ /g"`
> +fi
> +
>  AC_MSG_RESULT([$dynamic_linker])
>  test "$dynamic_linker" = no && can_build_shared=no
>  ##
> 
> I just spent two hours realizing why libtool could not link with shared
> libraries when cross-compiling; it ended up being exactly the thing
> you're talking about.  I created a patch according to your suggestion
> on the libtool mailing list, and it makes life much nicer.
> 
> Is this in CVS yet?  By golly, it sure needs to be.
> 
> - Dan




Re: cross-compiling question: static libraries and binaries to different places?

2002-03-08 Thread Guido Draheim


Guido Draheim wrote:
> crosscompiling linux-to-linux). Now the result is simply
> grepped for the "libraries:" entry, at it usually spits
> out a couple of directories very close the installation
> of the crossgcc himself.
> 
> [guidod@pc3 guidod]$ i386-mingw32-gcc -print-search-dirs | grep libraries:
> libraries:
>  /usr/lib/gcc-lib/i386-mingw32/2.95.3/:
>  /usr/lib/gcc/i386-mingw32/2.95.3/:
>  /usr/i386-mingw32/lib/i386-mingw32/2.95.3/:
>  /usr/i386-mingw32/lib/
> 
> Now I'd say for the build-install, we could simply extract
> the last entry of this setting - in this case it would be
>  /usr/i386-mingw32/lib  - and a similar entry exists for
> the native compiler that would be /usr/lib on linux. This
> entry is in the default libpath of the gcc-linker, and it
> is damn needed since other crosscompiled programs must
> resolve to the libs being found in the libsearchpath - if
> there would be no such default, you might need to give
> another way to have it added for libtool compiles. And
> the staging area is never a default path to be deduced
> from the --host=cross-target identification, IMO.
> 

might not be that interesting, but I had a couple of
thoughts about it over the last days - nothing conclusive, so
the following text is looong and winding

*

let's come back to one of the ideas to have automake support 
for a copy of the libs on the build host. My first guess 
above was about using the last part of print-search-dirs, 
and possibly deriving the includedir from it. That's one idea.
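
just so it is written down somewhere, a sketch of what I mean by
"using the last part" - the variable names and the trailing-/lib
assumption are mine, nothing official:

cross_libdir=`$CC -print-search-dirs | sed -n "s/^libraries: *//p" \
              | tr ":" "\n" | grep . | tail -1`
cross_includedir=`echo "$cross_libdir" | sed -e "s,/lib/*$,/include,"`

with the i386-mingw32 example above that would give roughly
/usr/i386-mingw32/lib and /usr/i386-mingw32/include respectively.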

The other idea is about that staging area, and actually
this might be a good idea as well for the build host, to
let it access the libs in the staging area. One could
even let autoconf tests try_link against those
and even get back the install location on the target
system. But there are some problems.

See, the current gcc does not use this scheme, it always
installs itself according to the FHS of the build
host, and it was quite common to let the gcc have all
the target differences embodied in its  lib/gcc subtree.
That way, one was able to take one toplevel gcc and
just switch the target-system for the current compile
job. But I think this idea is outdated and autoconf uses
the other way, checking for a -gcc in the path.
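
that prefixed -gcc lookup is essentially what the stock AC_CHECK_TOOL
macro provides; as a sketch (not quoting the autoconf sources):

AC_CHECK_TOOL(CC, gcc, gcc)

with --host=i386-mingw32 this looks for i386-mingw32-gcc in $PATH
first and only then falls back to a plain gcc.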

Well, libtool has a list of hardcoded fhs entries for
the libsearchpath of quite some target systems, but
that would be worth nothing without the staging area
prefix. Now, let's assume we change the gcc and do not
copy the target support files into one of its package
subdirs (as discussed above), but let it use a staging
area instead - in a way it simulates that the system
headers and libs are just there on the target system,
mounted via some network-aware filesystem.

Currently, my crossgcc uses a simple prefix for the
cross compile support dirs, i.e. the initial libdir
would be /usr/i386-mingw32/lib alongside an
/usr/i386-mingw32/include,  while a simulation
of the target host should not just use a "/lib"
subdir under that prefix, as that would not be the
final location on such a target system, I guess.

That boils down to the fact that we do not 
need any automake support when a staging area (a
chroot-like target-host simulation filetree) is
being used; it would just be a different DESTDIR,
and it would also cover any
other build support file, like include headers or
some other modules - one could even use a binary
package for the target host, and install it under
a reloc-prefix.
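
in make terms that staging-area install is nothing more than the
usual DESTDIR install (the paths here are only an example, assuming
an automake-style install target):

./configure --host=i386-mingw32 --prefix=/usr
make
make install DESTDIR=/usr/i386-mingw32/root

and the tree under /usr/i386-mingw32/root then mirrors what the
final target-host filesystem would look like.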

That gives good future compatibility, as we no longer
need to know which of the files need a
copy on the build host (if in doubt, install them
all), and there is just one problem left - where
is the staging area, and how is the FHS being
used in there? That surely has no easy answer. The
print-search-dirs scheme might help, but I haven't
come to anything conclusive so far.

your ideas?

cheers,
-- guido    http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: how to prefix definitions in config.h

2002-03-17 Thread Guido Draheim

Paul Eggert wrote:
> 
> > From: Akim Demaille <[EMAIL PROTECTED]>
> > Date: 13 Mar 2002 11:23:10 +0100
> >
> > My question is merely one of interface.
> >
> > Currently
> >
> > AC_INIT
> > AC_CONFIG_HEADERS(config.h)
> > AC_CONFIG_COMMANDS(config.h, [echo Hello, world])
> > AC_OUTPUT
> >
> > is wrong (grr, it is not caught, I don't know why, but autoconf is
> > supposed to die on this).
> >
> > My question is should we make this the normal way to hook a command,
> > or should we keep this invalid, and introduce
> >
> > AC_INIT
> > AC_CONFIG_HEADERS(config.h)
> > AC_CONFIG_HOOKS(config.h, [echo Hello, world])
> > AC_OUTPUT
> >
> > or something like that?
> 
> Sorry, I don't understand the question.  I went back and read the
> thread, and I still don't understand the question.
> 
> I did understand Russ Allbery's point.  He wrote that if you need a
> config.h variant, then it should be easy enough to create the variant
> with a makefile rule that looks something like this:
> 
> my_config.h: config.h
> sed 's/#define /#define MY_/; s/#undef /#undef MY_/' $@
> 
> and once you do that, you don't nee Autoconf to generate my_config.h.
> 
> Guido Draheim's rejoinder
> 
> doesn't make sense to me.  He seems to be arguing that the makefile
> rule is too complicated.  I dunno; it looks pretty simple to me.
> Maybe I'm missing something.

Did I say that? It should be "no", since the whole prefix-config.h stuff
started out as some extra makefile rule *grumble* - maybe we just cut 
that superfluous "too" from the "complicated" characterisation. So
please accept that I did NOT want to cut-n-paste rules from file to
file to file; if I find a bug at some point I have to go through 
all of them, and actually the respective sed lines aren't that easy to
read - horrors if a few var-substitutions in there had been
done a little differently than just cut-n-paste. The bottom of it:
give the feature a simple name, and just use it. And face it, Paul,
if a feature has a name that expands to multiple lines with some
possible arg-substitutions, that's what we generally call a "macro".

back to the topic. your point is quite fine - why not let the lines
go into the makefile instead of config.status. I just wonder how to
do this - if it would be possible then it would be far better
since `make` can take care of the dependencies. Let's try it:

my-config.h : config.h
@PREFIX_CONFIG_H@

and let @PREFIX_CONFIG_H@ expand to the necessary lines to do the
conversion, and in fact it can use the normal make-autovars
like "$@" and "$<" (at least around here that is possible). First
question: is it possible to get a SUBST that expands to 
*multiple* lines? How about some of the multi-lines just broken
up with backslash-escaped newlines for readability - currently
all of the multiline rules come in from automake AFAICS but
maybe I am missing something; at least I know that gnu-make
has multiline make-var substitutions at hand, so it should be
possible somehow.
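
one candidate answer to that first question: AC_SUBST_FILE can
already splice a whole file into the output wherever @VAR@ stands on
a line of its own - a sketch under that assumption, the fragment
file name is made up:

dnl configure.ac sketch - prefix-config-h.frag is a made-up fragment file
PREFIX_CONFIG_H=$srcdir/prefix-config-h.frag
AC_SUBST_FILE(PREFIX_CONFIG_H)

the fragment file would have to carry the tab-indented command lines
itself, so that the @PREFIX_CONFIG_H@ line in the makefile rule above
gets replaced by real rule commands.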

Secondly, how about letting the user choose a different prefix -
well, the current default is the $(PACKAGE) name, but I do have
one package where the prefix differs from the package name
(zziplib -> zzip prefix). You might call that a wrong scheme but
the autoconf-macro can take care of it, and give a lot of
freedom therein - and it could contain much more knowledge of
heuristics than it does now. AFAICS, the makefile-rule above
would need a companion in the configure-script, at least one
that sets that SUBST, e.g.

AC_PREFIX_MAKERULE(MY_PREFIX_CONFIG_H)

still there is one thing: the multiple sed lines that do the
actual trick would still need to be passed around; 
it only changes the discussion to let them end up in the makefile
instead of config.status - it does not get easier to be
handled in the autotools. And I don't like to hear that the
autotools shall not be able to support such macro stuff,
leaving the end-user bound to cut-n-paste-n-modify.

cheers,
-- guido http://freespace.sf.net/guidod




Re: how to prefix definitions in config.h

2002-03-18 Thread Guido Draheim

Clinton Roy wrote:
> 
> > > my_config.h: config.h
> > > sed 's/#define /#define MY_/; s/#undef /#undef MY_/' $@
> 
> It would appear I can air my dirty code :)
> 
> We define some of our own defines, that are already prefixed, so we
> have to take care of the case of double prefixing. We also wrap the
> header in an `#ifndef PREFIX_CONFIG_H; #define PREFIX_CONFIG_H ;
> #endif' so we don't have to worry about multiple inclusion.
> 
> Finally:
> 
> config.h: _config.h
> echo "#ifndef PREFIX_CONFIG_H" > $(output)
> echo "#define PREFIX_CONFIG_H" >> $(output)
> sed -e 's/#define /#define PREFIX_/' \
> -e 's/#undef /#undef PREFIX_/' \
> -e 's/PREFIX_PREFIX_/PREFIX_/' < $(input) >> $(output)
> echo "#endif /* PREFIX_CONFIG_H */" >> $(output)
> 
> Pretty icky, but it does get the job done.
> 

yes, that was my first shot too ;-) - it seems to be the canonical
answer to the problem we have to handle and it seems to be a very
common approach to get the job done. 

just a hint: after doing some projects with the first attempt of
the prefix-macro, I noticed it would be better to prefix lowercase
entries in config.h with a lowercased prefix, so that for example
off_t maps to my_off_t and const maps to my_const. These examples
are also the reason for wrapping the generated my-config.h into another
my-conf.h that looks like:

#ifdef _MSC_VER
#include 
#else
#include 
#endif
#ifndef my_off_t
#define my_off_t off_t
#endif

after here, all the library headers include my-conf.h and use the
my_off_t entry for the function declarations.
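
the lowercase prefixing itself boils down to one more pair of sed
expressions - a sketch only, not the actual archive macro:

sed -e 's/#define \([A-Z]\)/#define MY_\1/' \
    -e 's/#undef \([A-Z]\)/#undef MY_\1/' \
    -e 's/#define \([a-z]\)/#define my_\1/' \
    -e 's/#undef \([a-z]\)/#undef my_\1/' < config.h > my-config.h

so that HAVE_UNISTD_H becomes MY_HAVE_UNISTD_H while off_t becomes
my_off_t.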

well, one might call these just cosmetics but I think it looks
better, or perhaps just *hmmm* more professional *giggle*.

have fun,
-- guido    http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: how to prefix definitions in config.h

2002-03-18 Thread Guido Draheim

Akim Demaille wrote:
> 
> > "Paul" == Paul Eggert <[EMAIL PROTECTED]> writes:
> 
> Paul> Sorry, I don't understand the question.  I went back and read
> Paul> the thread, and I still don't understand the question.
> 
> Sorry for being confuse.
> 
> Paul> I did understand Russ Allbery's point.  He wrote that if you
> Paul> need a config.h variant, then it should be easy enough to create
> Paul> the variant with a makefile rule that looks something like this:
> 
> Paul> my_config.h: config.h sed 's/#define /#define MY_/; s/#undef
> Paul> /#undef MY_/' $@
> 
> Paul> and once you do that, you don't nee Autoconf to generate
> Paul> my_config.h.
> 
> I agree your solution is the most appropriate for their issue, but the
> general question I was trying to address was that of hooking commands
> to AC_CONFIG_FILES, AC_CONFIG_HEADERS.  Some people want some commands
> to be ran when the header is created.  I was merely looking for the
> syntax to code into Autoconf.
> 
> But I'm now tempted to drop this, until someone comes with an actual
> need.

the underlying problem is the generation of an order between
generated files - no need to spend time on hooks if one can
ensure an order another way. It looks logical to put such
order-specs into the makefile but it raises other problems
with (a) automake not being good at expressing things like
aclocal^H^H^H^H^H^H^Hamlocal macros to be transferred into
rules and (b) not being good at expressing macro-args, and any of
the current expressions need heavy logic in the core 
automake perl base - which at least looks overdone for the
simple thing we would like to achieve. If autoconf could be
instructed to follow a certain order, that would well be
enough - nobody cares much if the generation lines end up 
in the makefile.in or in the config.status body.

cheers,
-- guido    http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: how to prefix definitions in config.h

2002-03-18 Thread Guido Draheim

Paul Eggert wrote:
> 
> > Date: Sun, 17 Mar 2002 13:48:30 +0100
> > From: Guido Draheim <[EMAIL PROTECTED]>
> 
> > your point is quite fine - why not let the lines
> > go into the makefile instead of config.status. I just wonder how to
> > do this - if it would be possible then it would be far better
> > since `make` can take care of the dependencies.
> 
> Since it's a make rule, I would put this sort of thing into Automake
> rather than into Autoconf.  You can look at how Automake does things,
> for ideas about how to implement it.

the syntax model of automake isn't that well-suited to the problem,
since the input file might be something different, nor is the output
file always the same, or the prefix that is to be used. Those are
the three optional arguments of the prefix-config macros. If you 
have a proposal for how to express that in automake terms, don't hide it...

-- guido    http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




BTW, mandrake 8.2 has autoconf-2.52d preinstalled

2002-03-26 Thread Guido Draheim


During the cooker developments of mandrake, I noticed they did create
a little script as /usr/bin/autoconf that would look for either
configure.in or configure.ac - if it finds the latter, it will
forward the call to /usr/bin/autoconf-2.5x, otherwise it will use
/usr/bin/autoconf-2.13, where of course each one is the renamed
binary of their respective releases.
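
from the behaviour it must be something along these lines (a guess at
the shape of it, not the actual mandrake script):

#!/bin/sh
# sketch of such a dispatcher
if test -f configure.ac ; then
   exec /usr/bin/autoconf-2.5x "$@"
else
   exec /usr/bin/autoconf-2.13 "$@"
fi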

While this is a good idea, after switching to 8.2 I wondered why 
some autom4te files did pop up as I did expect this to be bound 
to 2.53 - and indeed, a call to `autoconf --version` in a directory 
with configure.ac will reveal 2.52d - and not 2.52. Can it be that 
the mandrake guys were accidentally using a development version, as the 
usage of an alpha-suffix to the last release version makes one 
intuitively expect an update/fixed release?

Or was that intentional for some reason - maybe someone could
enlighten me whether the autom4te-based series and autoconf-2.13
could perhaps share the same datadir tree.

TIA,
-- guido   
http://freespace.sf.net/guidod




Re: BTW, mandrake 8.2 has autoconf-2.52d preinstalled

2002-03-26 Thread Guido Draheim

Akim Demaille wrote:
> 
> >>>>> "Guido" == Guido Draheim <[EMAIL PROTECTED]> writes:
> 
> Guido> Or was that intentional for some reason - maybe someone could
> Guido> enlighten me whether the autom4te-based series and autoconf-2.13
> Guido> could perhaps share the same datadir tree.
> 
> There should be no problems.

yepp, I did hope so (from what I know).
good to have a simple answer, I wish it'd be like that more often. ;-)

Thanks, Guido




Re: Configure/make files for cross compilers

2002-03-28 Thread Guido Draheim

> Andrew Kiggins wrote:
> 
> Folks,
> I need to port some UNIX based code to an embedded OS (VxWorks).
> 
> Autoconf/configure is the modus operandus for building the various 
> UNIX flavours. To keep things nice I'd like to try to
> follow this module.
> 
> Is it possible to generate configure files that will do the 
> right thing for the cross-compiler/include files/library files ?
> Can I tell autoconf to look elsewhere for the various header, if 
> so how do I set about verifying their operation in the native
> system?

not problematic - but for cross-compiling you have to avoid any 
autoconf macros that try to AC_TRY_RUN, and with vxworks I noticed
that one had better avoid AC_TRY_LINK as well. That limits the
choice of autoconf detections to check-headers and check-compiler
stuff. Most of the time it did help to create a config.site file
that answers the questions - it also helps to speed up the
configure run and decreases turn-around times; when autoconf is
used for a small range of target systems it is worth
adding a system-wide autoconf cache file.

> 
> The whole thing smacks of a Canadian Cross, but I'm sort of at a 
> loss to figure out whether this is even possible, or whether
> I just have to #ifdef the code and supply handcrafted Makefiles.
> 

well, autoconf is low-level enough that there is no benefit in 
using a set of handcrafted makefiles - better use a config.site
file and a few extra ac-macros for your environment that you
can add (it's not that problematic to create new autoconf macros,
it's documented, and more examples can be downloaded from
http://ac-archive.sf.net ) - most people just don't know about
config.site, so they miss the chance to pre-answer questions
that apply to all of your own projects in a closed setup.
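
a minimal config.site sketch of that pre-answering (the cache
variables below are just examples, every project and target has its
own set):

# $prefix/share/config.site - sourced by every configure run (sketch)
ac_cv_func_mmap_fixed_mapped=no     # pre-answer a test that would AC_TRY_RUN
ac_cv_sizeof_long=4                 # pre-answer a sizeof probe for the target
cache_file=/usr/local/share/vxworks.cache   # optionally share one cache file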

good luck,
-- guido    http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@>+++ y++ 5++X- (geekcode)




Re: RFC: ./configure or ./config.status --clean

2002-04-04 Thread Guido Draheim

Peter Eisentraut wrote:
> 
> Akim Demaille writes:
> 
> > What I'm doing now is buying my freedom.  The freedom to extend
> > Autoconf without 1. requiring from the rest of the world that they
> > adjust their distclean rules, 2. requiring that Automake folks release
> > a newer Automake etc., not to mention that it needs 1. writing
> > documentation, 2. telling the people to read it.
> 
> All of these are good goals, but there are at least three other ways to
> achieve them:
> 
> 1. _AC_EXTRA_CLEAN_FILES([configure.lineno autom4te.cache])
> 
>To be traced by automake.
> 
> 2. AC_SUBST(ac_extra_clean_files, [configure.lineno autom4te.cache])
> 
>To be added to DISTCLEANFILES in automake-generated makefiles.
> 

AC_SUBST(ac_extra_clean_files, [configure.lineno])
AM_ADD(DISTCLEANFILES,ac_extra_clean_files)

Makefile.am:
   (nothing special)
Makefile.in:
   DISTCLEANFILES = @ac_extra_clean_files@

good??




Re: Can the AC_DEFINE'd macros be prefixed?

2002-04-26 Thread Guido Draheim

Yves Arrouye wrote:
> 
> Hi,
> 
> I am running into a case where different Autoconf-based packages define the
> same macros, giving me plenty of warnings. Is there a way to prefix the
> AC_DEFINE'd names with some unique string (or maybe just the ones defined by
> Autoconf, such as the HAVE_, the PACKAGE and VERSION, ...)? If not,
> wouldn't that be useful?

widely used:

http://ac-archive.sf.net/Miscellaneous/ac_create_prefix_config_h.html




Re: Site Macro Directory

2002-05-16 Thread Guido Draheim

Es schrieb "Mark D. Roth":
> 
> On Thu May 16 05:59 2002 -0400, Thomas E. Dickey wrote:
> > aclocal has good intentions, poor design.
> 
> I don't really understand why everyone says that like there's nothing
> we can do about it.  If the design is poor and inconvenient, let's fix
> it!  That is why it's called "software", after all. :)
> 
> > there's nothing to stop - but my point: the given approach makes it less
> > likely that someone will be able to easily get the macros since they're
> > in a site-specific somewhere-else.  I've seen far too many crappy packages
> > built with the automake scheme where I cannot find the associated macros.
> 
> I'm not sure that I agree that having a site macro directory makes it
> any more likely for developers to screw things up.  However, even if
> that is the case, we can have autoconf cache any macros that get used
> in the aclocal.m4 file, just like aclocal does.  That way you still
> get a copy of all of the necessary macros as part of the distributed
> package, but it happens automatically and without the need to install
> a seperate package.  Would that address this objection?
> 

HINT:
there is an "acinclude" tool in the ac-archive.sf.net distribution,
it will create an acinclude.m4 file from a set of macro-holder dirs...

 :-)
-- guido    http://ac-archive.sf.net




Re: Site Macro Directory

2002-05-16 Thread Guido Draheim

Paul Eggert wrote:
> 
> > From: "Mark D. Roth" <[EMAIL PROTECTED]>
> > Date: Thu, 16 May 2002 12:28:27 -0500
> >
> > That's fine, as long as it gets invoked automatically when you invoke
> > autoconf.  It should all get done in one step.
> 
> Personally, I dislike site directories since
> they make it a hassle to move my stuff from one site to another.
> However, I wouldn't object to having a configure-time option for
> building autoconf, for people who like to have site directories.
> 
> Unfortunately, I've seen so many proposals in this area that I'm a bit
> lost as to what's being proposed.  For example, I don't understand
> where the stuff will get cached.  I believe you mentioned that it will
> get cached into aclocal.m4, but won't that collide with Automake's
> aclocal?

yes it would. For quite a while I let aclocal
read a directory of mine holding autoconf scripts, which I later
converted to bind itself to the ac-archive. It did work most of the
time, but note that "most" - it wasn't such a good idea to have
that automated somehow. After deriving "acinclude" from the code 
of "aclocal", I don't quite know how I could live without it 
before - whenever an outer macro changes, I can call "acinclude" 
and create a local `cache file` acinclude.m4 that automake's aclocal 
will see and use. By default, it also searches subdirectories of 
the current one since that has proven to be very convenient... 
and useful. I usually put this acinclude.m4 into the cvs of the
projects, so that co-workers don't need to install the tool on every
platform, which did prove to be a good idea - only the guys who
know how to program m4/autoconf will need the tool to update the
acinclude.m4 file with the changes they made. In general, this is
one or two persons even in BIG projects.

> 
> > It would also be nice if the autoconf installation let you override
> > the path of the site macro directory,
> 
> I think that's essential.
> 
> Also, I would say that for now there should not be a site macro
> directory unless the autoconf installer or autoconf user says so.  If
> the feature proves to be popular, we can always change this default.

no changing default - a new tool! ;-)
 --> http://ac-archive.sf.net




Re: PACKAGE_FOO variables

2002-06-04 Thread Guido Draheim

Stefan Seefeld wrote:
> 
> hi there,
> I'v got another question / bug report:
> 
> I'm using the AC_INIT() macro with three
> arguments, which (even though it is not documented,
> as I wrote in an earlier post) generates a set of
> variables PACKAGE_SOMETHING that get automatically
> inserted into my config header.
> That's a big problem, as the config header is meant
> to be used (indirectly) by other packages that include
> the current package's headers. Why are these PACKAGE
> variables put into the header anyways ?
> How can I disable them ?
> Well, a quick hack is certainly to #undef them
> in the AH_BOTTOM macro, but that's really ugly...
> 

hmm, seems like another variant of the installable-config.h FAQ...
http://ac-archive.sf.net/Miscellaneous/ac_create_prefix_config_h.html




OT Re: what's the effect of test in ac_define_dir.m4

2002-06-24 Thread Guido Draheim

Before things get messy:

for a newbie it might be best to think of every shell command line
as finally setting the integer variable $?, representing
the exit value of the command. Therefore, a line like the one written
below will expand to

$? = ( xxx == TRUE && yyy == TRUE )

and in a shell, the "&&" works like you are used to from C in that it
is short-circuiting - if the first execution (xxx == TRUE) returns
false, the second will never get executed. As homework, explain
the following line, and notice that the "test" command will only 
receive arguments up to but _not_ including "||".

test -d "/tmp/my" || mkdir "/tmp/my"

and note that shell assignments survive a "&&" as long as you do
not try to put them into a subshell with the innocent-looking
round parentheses, so that of the following snippets the first
one has a different result on the terminal than the latter two:

prefix=NONE
test "x$prefix" = xNONE && prefix="/usr/local"
echo $prefix

- vs. -

prefix=NONE
(test "x$prefix" = xNONE && prefix="/usr/local")
echo $prefix

- vs. -

prefix=NONE
(test "x$prefix" = xNONE) && (prefix="/usr/local")
echo $prefix



Ionutz Borcoman wrote:
> 
> Evrika :-)
> 
> Yes, you're right. But I've been for too long a C programmer (and never
> a shell one). I've thought '&&' was for test command. Something like in:
>if ( xxx == TRUE && yyy == TRUE ) {};
> 
> So the C++ equivalent of this bash line:
>test xxx="xxx" && xxx="zzz"
> is
>if( xxx == "xxx" ) { xxx  = "zzz"; }
> 
> Right ?
> 
> TIA,
> 
> Ionutz
> 
> Andreas Schwab wrote:
> >
> > ??? Of course, it is used, in a conditional.




ac-archive not known? Re: Detecting C compiler

2002-07-01 Thread Guido Draheim

http://ac-archive.sf.net/C_Support/ac_prog_cc_warnings.html
see also
http://ac-archive.sf.net/C_Support/ac_prog_cc_strict_prototypes.html
http://ac-archive.sf.net/C_Support/ac_prog_cc_no_writeable_strings.html

Philip Willoughby wrote:
> 
> Hi all,
> 
> I'm trying to write a pair of macros to enable strict ansi conformance mode
> and all vaguely useful warnings for at least my target platforms:
> 
> * AIX with Visualage for C++ C compiler (xlc_r)
> * Solaris with Sun's ANSI C compiler (cc)
> * HP-UX with HP's C compiler (cc)
> * Linux with GCC
> 
> I have used the feature of AC_PROG_CC which sets GCC to "yes" to check for
> gcc, and when it comes to AIX I can just test if CC is xlc or xlc_r (I hope
> -- this is untested).
> 
> The problem is what to do to differentiate between the Sun and HP cc's.
> 
> Any ideas are welcomed, as is any information (prefereably web addresses of
> manuals) for other unix compilers (I'm afraid I can't bring myself to care
> about windows)...
> 
> Ideas that have crossed my mind so far are:
> 
> * use `which' to detect the path to cc and grep that for clues.
> * parse the output of `cc -h' and hope for clues
> * get the user to tell me via --with-cc=[hp|sun|...]
> 
> I don't fancy which because there are no guarantees that the path will be
> the same or even share any common elements across platforms/releases.
> 
> I don't want to parse `cc -h' because there is too much scope for change.
> 
> The final option is most reliable, but just plain ugly.
> 
> I haven't even considered looking at uname and relying on the kernel
> architecture since that breaks the philosophy of autoconf.
> 
> Regards,
> 
> Philip Willoughby
> 
> Systems Programmer, Department of Computing, Imperial College, London, UK
> --
> echo [EMAIL PROTECTED] | tr "bizndfohces" "pwgd9ociaku"




Re: config variables in config.h

2002-09-05 Thread Guido Draheim

Viktor Pavlenko wrote:
> 
> Hello,
> 
> I would like my program to know where it has been installed, in
> particular, the location of $datadir. Looks like a natural way to do
> it is to have a #define in config.h, like this:
> 
> /*
>  * myprog data directory
>  */
> #define MYPROG_DATA_DIR "/usr/local/share/myprog"
> 
> I tried the following in configure.in:
> 
> AC_DEFINE_UNQUOTED(MYPROG_DATA_DIR,$datadir,[myprog data directory])
> 
> but it doesn't work ($prefix/share is inserted and not expanded). I
> would appreciate it if someone could tell me how to do this properly.
> 

http://ac-archive.sf.net/Miscellaneous/ac_define_dir.html

There are a lot of things that people have done often and 
which have been memorized in the ac-archive. If you make
up your own ac-macros, just send them to me :-)
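
the trick behind that macro, in a nutshell (a simplified sketch, not
the archive code verbatim - my_datadir is just an example name):

test "x$prefix" = xNONE && prefix=$ac_default_prefix
test "x$exec_prefix" = xNONE && exec_prefix='${prefix}'
eval my_datadir=\"$datadir\"       # ${prefix}/share may still contain ${prefix}
eval my_datadir=\"$my_datadir\"    # a second round resolves the rest
AC_DEFINE_UNQUOTED(MYPROG_DATA_DIR, "$my_datadir", [myprog data directory])

i.e. eval the value until the inner ${prefix}/${exec_prefix}
references are gone, and only then feed it to AC_DEFINE_UNQUOTED.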





Re: ../configure question

2002-09-06 Thread Guido Draheim

Troy Cauble wrote:
> 
> I am cleaning up some autoconf scripts to support
> multiple builds against the same source, as in
> 
>mkdir build_dir1
>cd build_dir1
>../configure
> 
> In the middle of this large autoconf based project
> there's a third party module that does not use autoconf.
> "./configure" style builds work because make just finds
> the stock Makefile in the source directory.  "../configure"
> style builds break because autoconf didn't create a
> directory and Makefile in the $(top_builddir) tree.
> 
> Is there a standard way to work around this other than
> adding Makefile.am's to the third party module?
> 
> I imagine that the workaround would involve copying the
> entire source module to the $(top_builddir) tree...
> 

hhmmm, there might be a real answer, and a real answer
for autoconf and automake, but I am just asking myself.

If you do a build with a normal Makefile and thereby
having builddir != sourcedir, then you surely must have 
some VPATH feature enabled. The automake files will do 
that too but they are able to simulate it as the vpath
feature is not available in make `make`s (and more
often available but broken).

Now that's the point - if you use a make that has a
good vpath, and the plain Makefile has it enabled,
then you could just copy that Makefile to the
builddir, and if vpath is not available, you need
to copy all sources (or symlink it, as it is done
in some projects I know of).

Now... how the hl should autoconf know about those
parts of vpath-enabled make and vpath-using subdir?
Perhaps that's part of the problem it is left out
for now, although there are people with historic
mind about autoconf/automake who might be able to
say more...





vpath / Re: ../configure question

2002-09-06 Thread Guido Draheim

Guido Draheim wrote:
> 
> Troy Cauble wrote:
> >
> > I am cleaning up some autoconf scripts to support
> > multiple builds against the same source, as in
> >
> >mkdir build_dir1
> >cd build_dir1
> >../configure
> >
> > In the middle of this large autoconf based project
> > there's a third party module that does not use autoconf.
> > "./configure" style builds work because make just finds
> > the stock Makefile in the source directory.  "../configure"
> > style builds break because autoconf didn't create a
> > directory and Makefile in the $(top_builddir) tree.
> >
> > Is there a standard way to work around this other than
> > adding Makefile.am's to the third party module?
> >
> > I imagine that the workaround would involve copying the
> > entire source module to the $(top_builddir) tree...
> >
> 
> hhmmm, there might be a real answer, and a real answer
> for autoconf and automake, but I am just asking myself.
> 
> If you do a build with a normal Makefile and thereby
> having builddir != sourcedir, then you surely must have
> some VPATH feature enabled. The automake files will do
> that too but they are able to simulate it as the vpath
> feature is not available in make `make`s (and more
> often available but broken).
correction: in many `make`s. 
(although gmake has set a standard and many, many corporate 
environments use gmake as make today. In some respects, the
macro capabilities of gmake can replace automake to a high degree).

> Now that's the point - if you use a make that has a
> good vpath, and the plain Makefile has it enabled,
> then you could just copy that Makefile to the
> builddir, and if vpath is not available, you need
> to copy all sources (or symlink it, as it is done
> in some projects I know of).
forgot: if you assume a good vpath (in a corporate-
controlled environment that is a valid assumption)
then you could have the copy-makefile-to-builddir
done by autoconf itself - just rename the Makefile 
to Makefile.in and include it in the configure output. 
autoconf makes a copy while trying to substitute
some at-vars where there are none - which still works.
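
in configure.ac that copy-by-substitution is just one extra line
(the directory name is made up for the example):

AC_CONFIG_FILES([thirdparty/Makefile])

with thirdparty/Makefile.in being the renamed plain Makefile;
config.status then recreates it under the builddir even though it
contains no @substitutions@ at all.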

> 
> Now... how the hl should autoconf know about those
> parts of vpath-enabled make and vpath-using subdir?
> Perhaps that's part of the problem it is left out
> for now, although there are people with historic
> mind about autoconf/automake who might be able to
> say more...
('cause I don't quite remember extra macro hints to
 declare extra assumptions in this area and perhaps
 they are not intended to exist, or not just perhaps).




