(no subject)

2003-10-10 Thread Robert
During the last 11 years we have been successfully presenting our training
seminars in the UK. We have now decided to let the rest of the world have a
chance! 
Listed here are the 3 seminars that we will be presenting in various
locations around the world between now and March 2004.
 For further info please send us your details and we will send our
information pack by return.


1 - Selling for Engineers - One-day Seminar

The Selling for Engineers seminar is a good introduction to effective sales
principles for people who are new to selling, and also a useful refresher
for 'old hands'.  It applies to selling both technical products and
intangible services.

Many Sales Engineers have been learning the technical skills of their job
for years but have had little formal training in selling.  This course
helps correct that imbalance.  
Typical job titles of delegates: Sales Engineer, Account Executive and
Business Development Manager.


2 - Telephone Sales Prospecting for Engineers - One-day Workshop

This event is a practical workshop teaching people in technical companies
how to find new customers on the phone.  It is applicable to business
development for both tangible products and intangible services.  The first
session addresses whom to target, what to say and how to handle problems. 
The remainder of the day consists of live sales calls with coaching from
Robert Seviour; the objective being to give delegates some positive
experiences of prospecting, make sales appointments and maybe sell
something!

Please note that this telephone sales prospecting event is restricted to a
maximum of six delegates to permit individual coaching.


3 - Closing Techniques Workshop - One day workshop

What if the customer says:
- 'It's too expensive'
- 'We're happy with our present supplier'
- 'I want to think about it'
 
Can you handle these common objections?
By far the most efficient way to be more profitable is to turn more of the
enquiries you receive into paid orders.  For this, the ability to resolve
objections is critical - either you close or you lose the sale.
And if you answer 'How much discount will you give me?' with: 'I'll ask my
boss', you waste profit, which could be yours with a better reply.
 
In only a day I will teach you techniques which overcome these objections
and more. You will be able to use them immediately to win profitable orders.
 
There is no need to lose business to your competitors or give big
discounts. 
 
 
If you have never had any formal sales training or need a refresher, don't
continue to work at a disadvantage. 
 
All courses run from 9am until 5pm and cost £300 per person per day.

Reservations and information
Please contact Sue on: 

Tel:  +44(0)1481 720 294 Fax: +44(0)1481 720 317

If sales training is not an issue for your company please reply to this
email with the word "DELETE" in the subject line. We will remove your
details promptly.





You are too expensive, can I have a discount?

2003-10-22 Thread Robert




Closing Techniques Workshop - One-day Workshop

What if the customer says:
- ‘It’s too expensive’
- ‘We’re happy with our present supplier’
- ‘I want to think about it’

Can you handle these common objections?
By far the most efficient way to be more profitable is to turn more of the
enquiries you receive into paid orders.  For this, the ability to resolve
objections is critical - either you close or you lose the sale.
And if you answer ‘How much discount will you give me?’ with: ‘I’ll ask my
boss’, you waste profit, which could be yours with a better reply.

In only a day I will teach you techniques which overcome these objections
and more. You will be able to use them immediately to win profitable orders.

There is no need to lose business to your competitors or give big
discounts.

Fee for this event is £300.


Selling for Engineers - One-day Seminar

The Selling for Engineers seminar is a good introduction to effective sales
principles for people who are new to selling, and also a useful refresher
for ‘old hands’.  It applies to selling both technical products and
intangible services.

Many Sales Engineers have been learning the technical skills of their job
for years but have had little formal training in selling.  This course
helps correct that imbalance.

Typical job titles of delegates: Sales Engineer, Account Executive and
Business Development Manager.

Fee for this event is £300.


Telephone Sales Prospecting for Engineers - One-day Workshop

This event is a practical workshop teaching people in technical companies
how to find new customers on the phone.  It is applicable to business
development for both tangible products and intangible services.  The first
session addresses whom to target, what to say and how to handle problems.
The remainder of the day consists of live sales calls with coaching from
Robert Seviour; the objective being to give delegates some positive
experiences of prospecting, make sales appointments and maybe sell
something!

Please note that this telephone sales prospecting event is restricted to a
maximum of six delegates to permit individual coaching.

Fee for this event is £300.


If you have never had any formal sales training or need a refresher, don’t
continue to work at a disadvantage.

Reservations and information
Please contact Sue on:

Tel:  +44 (0)1481 720 294  Fax:  +44 (0)1481 720 317

If sales training is not an issue for your company please reply to this
email with the word “DELETE” in the subject line. We will remove your
details promptly.
 


Free sales training for Engineering and Technical Companies

2003-11-28 Thread Robert




Is there someone in your company who 
could benefit from some professional sales-training?
 
Well, it’s yours for the 
asking.
 
Hello, I'm Robert Seviour, I run sales training seminars 
specific to engineering, technology and scientific companies. So that you 
can try it to see whether it could help your business, I’m offering free, 
40-minute, live sales-training sessions by 
telephone.
 
They will be on the first Tuesday of every month at 9:30 am UK time and
5:00 pm UK time. (If you are in the US or other countries, let me know and
we'll run events at times to suit you.)
 
It’s simple to participate: all you have to do is email us a message saying
you would like to try it and we’ll send you back joining instructions. Then
you phone into the conference if you want to.

 
There is no fee for the sales training. You have to pay for the call, of
course, but that is at a normal call rate; it is not a premium-rate number,
and I won’t send you an invoice.
 
 
The training is high quality and specific to professional-level sales, and
I’m experienced: I’ve delivered over 500 events in the last 10 years and
written 6 books on the subject. I’m doing it because we have found over
time that the more people become acquainted with sales training, the more
in-house training jobs I get. That’s where I earn my living.
 
I acknowledge that telephone training isn’t as good as face-to-face. But
the logistics and expenses involved with the latter mean that most people
in techie companies never receive any formal sales training. This
telephone-based course is a lot better than doing nothing.
And the 
tele-conference makes it possible to take one topic at a time, cover it 
thoroughly and then you have a month to apply the ideas before the next 
chunk of information is delivered to you. 
 
There are:
· no lost productive days 
· no travelling time 
· no costs
· no information overload 
 
Simply dial a number from your home, 
office - anywhere, enter a login code and join the 
tele-conference for live, professional sales-training, free of 
charge. 
 
The program for the next 6 months is:

· 'How to prospect for 
more business'
· 'How to make sales 
appointments'
· ‘How to make a 
powerful presentation'
· 'How to ask for the order and close the 
sale'
· 'How to deal with difficult 
objections'
· 'How to acquire repeat customers and build profitable 
relationships'.
 
The first 30 minutes of the event are a review by me of 
the main points you need to know about each of these topics. The final 
period is an opportunity for you to ask questions and get answers to any sales 
issues which have been causing you problems.
 
You’ll receive the seven-page course workbook beforehand 
and afterwards an audio CD of the event will be available to purchase, if you 
choose. Over the 6 months, the key areas of sales skills will be covered and you 
will have a library of support material for future 
reference.
 
 Try it for 
free, to see if this is for you. The first session will be Tuesday, 
December 2. 
Just reply with the subject: ‘I want to try the free tele-seminar’ and
we’ll send you the joining instructions.
 
I promise this course will be motivating and informative. 

 
Sincerely,
 
Robert 
Seviour
 
Robert Seviour
 
Sales Training Specifically for 
Engineers
 
Tel:   +44 (0)1481 720 
294
Fax:  +44 (0)1481 720 
317
email:   [EMAIL PROTECTED] 
 
This email is being sent only to companies in the engineering and
technology sector; if it has reached you incorrectly or if sales training
isn't relevant to your organisation, please reply with 'Delete' as the
subject and we'll take you off our list immediately.
 
 


We still have places left.

2003-12-10 Thread Robert




Hi, 
 
We still have spaces left on our 
worldwide training seminars for 2004.
 
Reply to this email for our 
information pack.
 
Robert Seviour
 
Sales Training Specifically for 
Engineers
 
Tel:   +44 (0)1481 720 
294
Fax:  +44 (0)1481 720 
317
email:   [EMAIL PROTECTED]
 
 
 PS Remember to ask for your 
discount
 
If sales training is not an issue for your company please 
reply to this email putting "Delete" in the subject box and we will remove you 
from our list. Sorry for the inconvenience.


ArtBarker.com: Winter Promotion

2003-02-05 Thread Robert Lash
ArtBarker.com 
Your #1 Art-Related Marketplace
(http://www.ArtBarker.com )


Just in case you were interested, we are currently running a 
Winter special of HALF OFF the listed advertising rates (for as 
many months as you wish to list!), so please contact us if you 
would like more details on advertising your business.

ArtBarker.com  is a way for anyone in the art-related industry to 
effectively bring traffic to their site for only pennies per click.

We continually advertise a new set of art-related keywords
across the web (including all the major pay-per-click search
engines such as WebCrawler, Yahoo, Overture, Lycos, AltaVista,
Excite, & Google).
This system places us in the top 3 positions on over 90% of the 
search engines for a specific set of keywords.  Soon we will be 
advertising tens of thousands of keywords maintaining the top 3 
spots for each.

Please visit our website (http://www.ArtBarker.com ) in order to
see firsthand the benefits that exposure can bring to
your art-related business.  Remember, our half-price advertising
promotion is for a limited time only.


Best regards,

Robert Lash
941-544-5422
[EMAIL PROTECTED] 





Re: Automake 1.7.2b uploaded (beta for 1.7.3)

2003-02-13 Thread Robert Boehne
Hello,

I have a problem with the beta: after bootstrapping a
fresh checkout of Libtool with Autoconf 2.57 and Automake
1.7.2b, I get a make distcheck error.  Apparently the
distcheck fails because files are left over in the
installation directory after "make uninstall" is run.
Below are the offending files:
 ${infodir}/dir
 ${prefix}/share/libtool:
install-sh  missing  mkinstalldirs

Is this a bug or misuse?

Robert Boehne


ERROR: files left after uninstall:
./share/libtool/install-sh
./share/libtool/missing
./share/libtool/mkinstalldirs
./info/dir
make[1]: *** [distuninstallcheck] Error 1
make[1]: Leaving directory `/net/libtool/libtool-1.4e/_build'
make: *** [distcheck] Error 2
[rbo@lucifer libtool]$ ls -R /net/libtool/libtool-1.4e/_inst
/net/libtool/libtool-1.4e/_inst:
bin  include  info  lib  share

/net/libtool/libtool-1.4e/_inst/bin:

/net/libtool/libtool-1.4e/_inst/include:

/net/libtool/libtool-1.4e/_inst/info:
dir

/net/libtool/libtool-1.4e/_inst/lib:

/net/libtool/libtool-1.4e/_inst/share:
aclocal  libtool

/net/libtool/libtool-1.4e/_inst/share/aclocal:

/net/libtool/libtool-1.4e/_inst/share/libtool:
install-sh  missing  mkinstalldirs
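For reference, the distuninstallcheck step that produced the error above can be approximated with a small shell sketch (the scratch paths and file names here are made up for illustration, this is not the real distcheck code):

```shell
# Simulate distuninstallcheck: install into a scratch prefix, pretend
# "make uninstall" ran, then complain about anything left behind.
inst=$(mktemp -d)
mkdir -p "$inst/share/libtool"
touch "$inst/share/libtool/install-sh"    # a file uninstall missed
leftovers=$(find "$inst" -type f)
if test -n "$leftovers"; then
  echo "ERROR: files left after uninstall:"
  echo "$leftovers"
fi
rm -rf "$inst"
```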


Alexandre Duret-Lutz wrote:
> 
> Hi people!
> 
> Here is a snapshot of the 1.7.x branch of Automake.
> 
>   ftp://alpha.gnu.org/gnu/automake/automake-1.7.2b.tar.gz
>   ftp://alpha.gnu.org/gnu/automake/automake-1.7.2b.tar.bz2
>   ftp://sources.redhat.com/pub/automake/automake-1.7.2b.tar.gz
>   ftp://sources.redhat.com/pub/automake/automake-1.7.2b.tar.bz2
> 
> This should be reasonably close to what 1.7.3 will be.
> Please test it and report any problem you have to <[EMAIL PROTECTED]>.
> I hope we can release 1.7.3 by the middle of next week.
> 
>   There is a known bug with VPATH handling of Texinfo and Lex rules
>   which isn't fixed here.  This affects all 1.7.x versions for Texinfo
>   rules.  I think older versions will also fail on Lex rules but I
>   haven't verified.  This is triggered during VPATH builds with (at
>   least) Tru64 make, FreeBSD (current) make, and OpenBSD (3.2) make.
>   In short, these make implementations will *not* perform a
>   VPATH search for dependencies which appear as targets in Makefiles.
>   As far as BSD make is concerned this seems to be a recent change in
>   its behavior, as I believe it worked fine in the past.
>   Fixing this requires too many changes for the 1.7.x branch.
> 
>   The following failures in the test suite can denote this bug:
> lex3.test, txinfo3.test, txinfo13.test, txinfo16.test, txinfo18.test
> 
>   Please report these failures anyway, with your system's version,
>   so we know more precisely where this happens.
> 
> Here is a list of changes since 1.7.1.  Those that worry me the most
> are the changes to elisp compilation, and the changes to depcomp.  If
> you can test these, by all means please do!
> 
> Bugs fixed in 1.7.2b:
> * Fix stamp files numbering (when using multiple AC_CONFIG_HEADERS).
> * Query distutils for `pythondir' and `pythonexecdir', instead of
>   using a hardcoded path.  This should allow builds on 64-bit
>   distributions that usually use lib64/ instead of lib/.
> * AM_PATH_PYTHON will also search for python2.3.
> * elisp files are now built all at once instead of one by one. Besides
>   incurring a speed-up, this is required to support interdependent elisp files.
> * Support for DJGPP:
>   - `make distcheck' will now work in `_inst/' and `_build' instead
> of `=inst/' and `=build/'
>   - use `_dirstamp' when the file-system doesn't support `.dirstamp'
>   - more changes that affect only the Automake package (not its output)
> * Fix some incompatibilities with upcoming perl-5.10.
> * Properly quote AC_PACKAGE_TARNAME and AC_PACKAGE_VERSION when defining
>   PACKAGE and VERSION.
> * depcomp fixes:
>   - dashmstdout and dashXmstdout modes: don't use `-o /dev/null', this
> is troublesome with gcc and Solaris compilers. (PR/385)
>   - makedepend mode: work with Libtool. (PR/385 too)
>   - support for ICC.
> * better support for unusual gettext setups, such as multiple po/ directories
>   (PR/381):
>   - Flag missing po/ and intl/ directories as warnings, not errors.
>   - Disable these warnings if po/ does not exist.
> * Noteworthy manual updates:
>   - New FAQ chapter.
>   - Document how AC_CONFIG_AUX_DIR interacts with missing files.
> (Debian Bug #39542)
>   - Document `AM_YFLAGS = -d'.  (PR/382)
> 
> --
> Alexandre Duret-Lutz





Re: Automake 1.7.2b uploaded (beta for 1.7.3)

2003-02-14 Thread Robert Boehne
Alexandre,

Ok, I have Automake 1.7.2b, Autoconf 2.57, Texinfo 4.2 (also tried 4.5)
Make 3.79.1 and bash 2.05a.0(1)-release.  I removed everything
in my libtool directory but the top-level "CVS" directory, then
did an update to get current cvs.  ./bootstrap && ./configure \
--prefix=/net/testme && make && make install
make uninstall
ls -alR /net/testme/

and I get exactly the same result, the same files left over.
So to answer your question directly, these files are installed
by "make install", which is run as a dependency of distcheck.
Any ideas why this is happening?  I began to look into this
problem because another maintainer has not been able to run
"make distcheck" since he abandoned a much older Automake.
Let me know if I can provide more information.

Thanks,

Robert


Alexandre Duret-Lutz wrote:
> 
> >>> "Robert" == Robert Boehne <[EMAIL PROTECTED]> writes:
> 
>  Robert> Hello,
>  Robert> I have a problem with the beta, after bootstrapping a
>  Robert> fresh checkout of Libtool with Autoconf 2.57 and Automake
>  Robert> 1.7.2b, I get a make distcheck error.  Apparently the
>  Robert> distcheck fails because files are left over in the
>  Robert> insallation directory after "make uninstall" is run.
> 
> I was luckier: CVS Libtool distchecks successfully here.
> 
> I'm using Debian unstable, with Autoconf 2.57, Automake 1.7.2b,
> Libtool 1.4e, Texinfo 4.3a, Make 3.80, Bash 2.05b.0(1)-release.
> 
> My install-info is that from Texinfo, not from Debian.
> (Maybe Debian's install-info creates the ${infodir}/dir file?)
> 
>  Robert> Below are the offending files
>  Robert> ${infodir}/dir
>  Robert> ${prefix}/share/libtool:
>  Robert> install-sh  missing  mkinstalldirs
> 
> Any idea when these files get installed?
> Are they installed by `make install'?  (i.e., outside distcheck')
> --
> Alexandre Duret-Lutz





Re: Automake 1.7.2b uploaded (beta for 1.7.3)

2003-02-16 Thread Robert Boehne
Alexandre:

The "dir" file in ${prefix}/info is created by this command:
install-info --info-dir=/net/testme/info /net/testme/info/libtool.info
It makes sense that this file still exists after "make uninstall"
because unless it is alone in the directory, it should still exist.

The other files were eventually tracked down to libltdl/Makefile.am's
local-install-files rule, checked in as revision 1.30 on May 22, 1999.
ChangeLog entry:
* libltdl/Makefile.am (local-install-files):  New rule to install
  libltdl without creating links or mode 777 directories.

The rule itself has a tell-tale "FIXME:" in it, which I would have found
if this rule were run by "make install" in the libltdl subdir; instead,
it is run by install-data-hook in the top-level Makefile.

## This allows us to install libltdl without using ln and without creating
## a world writeable directory.
## FIXME:  Removed this rule once automake can do this properly by itself.
local-install-files: $(DISTFILES)
	-rm -rf $(DESTDIR)$(datadir)/libtool/libltdl
	$(mkinstalldirs) $(DESTDIR)$(datadir)/libtool/libltdl
	@for file in $(DISTFILES); do \
	  d=$(srcdir); \
	  if test -d $$d/$$file; then \
	    cp -pr $$d/$$file $(DESTDIR)$(datadir)/libtool/libltdl/$$file; \
	  else \
	    test -f $(DESTDIR)$(datadir)/libtool/libltdl/$$file \
	    || cp -p $$d/$$file $(DESTDIR)$(datadir)/libtool/libltdl/$$file || :; \
	  fi; \
	done

And from the top-level Makefile.am, this rule to run the above:

# Create and install libltdl
install-data-hook:
	cd libltdl && $(MAKE) local-install-files

Because $(DISTFILES) contains ../config.guess, ../config.sub,
../install-sh, ../mkinstalldirs, ../ltmain.sh and ../missing, these files
are installed by the local-install-files rule as

 ${datadir}/libtool/libltdl/../config.guess

and so on.  They are then not cleaned by this rule from the top-level
Makefile.am, because they reside in the wrong directory:

# Uninstall libltdl
uninstall-local:
	-rm -rf $(DESTDIR)$(pkgdatadir)/libltdl

The simplest way I can get around this trouble would be to add the three
files that don't get cleaned by uninstall to the uninstall-local rule, but
is there a cleaner way for Automake to handle all this itself?  I could
also use "basename" in the local-install-files rule to transform ../foo to
foo, but is that portable enough to use?
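For what it's worth, the ../foo to foo transform can also be done without basename at all; a sed substitution is safe in old Bourne shells (this is a sketch of the idea, not code taken from libtool):

```shell
# Strip any leading directory part, e.g. ../config.guess -> config.guess.
file="../config.guess"
base=`echo "$file" | sed 's,.*/,,'`
echo "$base"
```

Both `basename` and this sed idiom are POSIX, but sed avoids spawning an extra utility that some minimal environments used to lack.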

Probably of more concern to the readers is the conclusion that this
problem wasn't Automake's fault.

Thanks,

Robert

Alexandre Duret-Lutz wrote:
> 
> >>> "Robert" == Robert Boehne <[EMAIL PROTECTED]> writes:
> 
> [...]
> 
>  Robert> these files are installed by "make install",
> 
> Could you send the output of `make install', so we see exactly
> when these files are installed?
> 
>  Robert> which is run as a dependency of distcheck.  Any ideas
>  Robert> why this is happening?  I began to look into this
>  Robert> problem because another maintainer has not been able to
>  Robert> run "make distcheck" since he abandoned a much older
>  Robert> Automake.
> 
> Which version worked last?
> 
> [...]
> 
> --
> Alexandre Duret-Lutz





race condition with subdir objects:

2003-07-17 Thread Robert Collins
The following will break on distclean with automake 1.7.5:

Makefile.am:

SUBDIRS=a
AUTOMAKE_OPTIONS = subdir-objects

bin_PROGRAMS=foo
foo_SOURCES=a/foo.cc

a/Makefile.am:
bin_PROGRAMS=bar
bar_SOURCES=bar.cc


The failure is because subdirs are distcleaned first, and a/.deps is rm
-rf'd before the top level makefile runs.

Possible solutions include:
- distclean prefix first, not postfix;
- change the distclean process to not rm -rf the .deps dir.

I don't have time to hack up a test case just yet... sorry.

Cheers,
Rob





Re: convenience binaries

2003-09-22 Thread Robert Collins
On Mon, 2003-09-22 at 19:56, Warren Turkal wrote:
> Is there any support in automake for building a binary that will only be
> used during the build process?

Yes:

noinst_PROGRAMS = convenience_binaries

Any rules that depend on one of the binaries should be written as:

thing: binary$(EXEEXT)
	./binary

Rob

-- 
GPG key available at: .





Re: convenience binaries

2003-09-22 Thread Robert Collins
On Mon, 2003-09-22 at 21:22, Warren Turkal wrote:
> Robert Collins wrote:
> > yes,
> > noinst_PROGRAMS = convenience_binaries
> 
> Can these convenience programs be built for the host arch in a
> cross-compiled environment?

Probably - you'll likely need to override the default build recipe,
though.  I haven't tried; perhaps someone else here has more details.
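One common pattern is sketched below (untested; the program name mkhdr is invented, and CC_FOR_BUILD is an assumed substitution you would arrange in configure yourself - automake does not provide it):

```make
# Sketch only: build a helper for the *build* machine during a cross
# compile, by overriding automake's default link rule for one program
# so the build compiler is used instead of the (cross) $(CC).
noinst_PROGRAMS = mkhdr
mkhdr_SOURCES = mkhdr.c

mkhdr$(EXEEXT): $(srcdir)/mkhdr.c
	$(CC_FOR_BUILD) -o $@ $(srcdir)/mkhdr.c
```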

Rob

-- 
GPG key available at: <http://members.aardvark.net.au/lifeless/keys.txt>.





Re: convenience binaries

2003-09-22 Thread Robert Collins
On Mon, 2003-09-22 at 22:31, Andrew Suffield wrote:
> On Mon, Sep 22, 2003 at 10:01:24PM +1000, Robert Collins wrote:
> > On Mon, 2003-09-22 at 21:22, Warren Turkal wrote:
> > > Robert Collins wrote:
> > > > yes,
> > > > noinst_PROGRAMS = convenience_binaries
> > > 
> > > Can these convenience programs be built for the host arch in a
> > > cross-compiled environment?
> > 
> > probably, you'll likely need to override the default build recipe
> > though.. I haven't tried, perhaps someone else here has more details.
> 
> Can it ever be correct for a noinst object to be built for the target
> environment? By definition, they should only exist on the build
> system.

Not necessarily - but certainly the common case is for them to be build
system only.

Rob

-- 
GPG key available at: <http://members.aardvark.net.au/lifeless/keys.txt>.





RE: Compiling 32-bit code on 64-bit HP-UX

2003-09-23 Thread Boehne, Robert

Martin,

Even on a 64-bit capable machine, aCC defaults to 32-bit libraries.
Having config.guess return hppa2.0w does not change the output that
is produced, aCC will produce whatever you tell it to (32-bit by default).
When you're running a configure script you want to set both CPPFLAGS and
CFLAGS (for C), or CXXCPPFLAGS and CXXFLAGS (for C++), whenever you need to
pass an option that changes the preprocessing, like +DA.  This is generally
a better option than setting CXX="aCC -DA2.0".  Try this and read
the Libtool web site regarding contributing if you want to send another patch.
http://www.gnu.org/software/libtool/contribute.html
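For instance, a configure run for 32-bit output might look like the following (illustrative only, not taken from the thread; adjust the +DA flags to your target):

```sh
./configure CC=cc CFLAGS="+DA2.0N" CPPFLAGS="+DA2.0N" \
            CXX=aCC CXXFLAGS="+DA2.0N" CXXCPPFLAGS="+DA2.0N"
```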

HTH,

Robert

-Original Message-
From: Martin Frydl [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 11, 2003 3:14 PM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: Compiling 32-bit code on 64-bit HP-UX


Hello,

   I'm trying to create 32-bit shared libraries on 64-bit PA-RISC 2.0 
(HP-UX 11) with aCC compiler. I have several libraries and executables 
in my project. When I run them from build directory, everything works 
fine. However, when I run make install and delete build directory, the 
executables and libraries stop working. I've looked into them via chatr 
utility and it shows that they reference build directory as search path, 
not install one.

   I've investigated the problem and it looks like it lies in 
config.guess. config.guess checks host type according to uname and 
getconf utility which results in "hppa2.0w" (64-bit PA-RISC 2.0). Then 
there is a check for __LP64__ define, which fails. Here the failure is 
caused by wrong use of compiler options - it runs:

   $CC_FOR_BUILD -E -

   which is not understood by aCC. It should be rewritten to use 
temporary file instead of "-" (stdin). Moreover, the check does not use 
compiler flags, so the result is not affected by switches determining the 
"bitness" of resulting code - +DA2.0W (64-bit), +DA2.0N (32-bit), +DA1.1 
(32-bit PA-RISC 1.1). However, this can be worked around by putting these 
directly into CC when running configure.

   The host type set after the check is not clear enough since it does 
not correspond to aCC options. When __LP64__ is defined, host type is 
hppa64, which seems correct if it stands for 64-bit. When the check 
fails, host type is set to hppa2.0w. However, I think this host type 
should mean 64-bit since option +DA2.0W to aCC generates 64-bit code.

   Unfortunately, libtool determines 32/64-bit hppa in a way "hppa64 is 
64-bit and anything else is 32-bit". There is also a problem in libtool.m4 
regarding options supplied via the CC variable. When they are supplied, 
libtool's compiler checks do not work, since cc_basename also contains 
those options. I've attached a simple patch which removes all characters 
after the first space.
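The idea can be sketched like this (my reconstruction of the intent, not the attached patch itself):

```shell
# Keep only the word before the first space, then strip any directory
# part, so CC="aCC +DA2.0W" yields plain "aCC" for cc_basename.
CC="aCC +DA2.0W"
cc_basename=`echo "$CC" | sed -e 's/ .*//' -e 's,.*/,,'`
echo "$cc_basename"
```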

   Also I don't understand the way config.guess should be copied to 
project. Currently it is copied by libtoolize from share/libtool, but 
this one is outdated and the current one is in share/automake-1.7. Should I 
copy it manually from automake?

   Now here are my conclusions:

- config.guess does not use CFLAGS when making compilation checks but 
this can be "fixed" by providing necessary flags directly in CC variable
- config.guess should be fixed so the __LP64__ check works correctly (patch 
attached)
- config.guess "generates" host types which do not correspond to aCC (HP 
native) namings but this cannot be changed
- there are two config.guess versions when both libtool and automake are 
installed. libtoolize takes one from libtool, not automake
- libtool needs to be patched to support options supplied in CC or CXX 
variables

   Martin




Re: Should -i mkdir?

2003-09-26 Thread Robert Collins
On Sat, 2003-09-27 at 02:20, Alexandre Duret-Lutz wrote:

>  adl> autopoint and libtoolize usually run before automake
>  adl> and put things into this directory too.  So if some tools has to
>  adl> create the directory, I think it should be autopoint.
> 
> Sorry, I meant "it should be autoreconf".

/if it's used/


Rob

-- 
GPG key available at: .





Re: precompiled header suggestion

2003-09-30 Thread Robert Collins
On Wed, 2003-10-01 at 04:30, Tom Tromey wrote:
> Recently gcc added precompiled header support.  This is mostly useful
> for C++, but C might benefit in some cases too.

Waay cool.

Are you planning on doing this, or just sketching the design and hoping
for volunteer contributions?

What might be a useful starting point is some manual test cases or
sample rules, to aim for.

Rob

-- 
GPG key available at: .





Re: Aborting automake?

2003-11-07 Thread Robert Collins
On Sat, 2003-11-08 at 11:22, Harlan Stenn wrote:
> I have a situation where I want every Makefile.am to 'include' one of
> several files.
> 
> If none of these files are 'include'd I want the automake run to abort.
> 
> I know how to cause the abort at runtime, but I'd rather catch this problem
> while automake is running.

If you want the automake include facility, then you'd need to extend
automake to allow some sort of policy check at the end of processing
each Makefile.am. Thinking out loud: something like
check_script=${top_srcdir}/link_Makefile.am.sh in AUTOMAKE_OPTIONS...
then have automake call out to that script, with a failure meaning abort
the automake run.
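Such a check script might look like this sketch (the fragment names and the grep-based match are invented for illustration):

```shell
# check_includes FILE FRAG...: succeed if FILE includes any of FRAG...
check_includes() {
  file=$1; shift
  for frag in "$@"; do
    grep "include .*$frag" "$file" >/dev/null 2>&1 && return 0
  done
  return 1
}

tmp=$(mktemp)
printf 'include $(srcdir)/rules-lib.am\n' > "$tmp"
check_includes "$tmp" rules-lib.am rules-prog.am && echo "OK"
rm -f "$tmp"
```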

Rob

-- 
GPG key available at: .





Re: Non-recursive make & intermediate objects

2003-11-19 Thread Robert Collins
On Thu, 2003-11-20 at 09:04, Bob Friesenhahn wrote:
> Using Automake 1.7.9, I am attempting to create a single Makefile.am
> which is capable of building all of the libraries used by the project.
> The source files to the project are located in subdirectories, and the
> output libraries should also be located in subdirectories.  The
> objective is to replace an existing recursive build.
> 
> Using a rule like:
> 
> noinst_LIBRARIES = libs/somedir/libfoo.a
> libs_somedir_libfoo_a_SOURCES=foo.cc
> 
> and then inpecting the output of 'make -n' (and the generated
> Makefile), I see evidence that the build will put all .Po files in a
> .deps subdirectory under the Makefile.am, and all the .o files in the
> same directory as Makefile.am.  This approach seems quite wrong to me
> since it is quite possible that libraries and applications may use
> similarly named source files.  Intermediate files should be placed
> either under the directory where the library is placed, or in the
> directory where the source files live.  An even better solution would
> allow the user to specify where intermediate files are placed on a
> per-library and per application basis.

subdir-objects in your automake options.

Problem is, there is a design headache that makes recursive clean fail
with this approach - I forget the bug #, but it's on my todo, waay down
there :p.

Rob








Re: Non-recursive make & intermediate objects

2003-11-21 Thread Robert Collins
On Thu, 2003-11-20 at 09:50, Bob Friesenhahn wrote:
> On Thu, 20 Nov 2003, Robert Collins wrote:
> 
> > subdir_objects in your automake options.
> >
> > Problem is, there is a design headache that makes recursive clean fail
> > with this approach - I forget the bug #, but it's on my todo, waay down
> > there :p.
> 
> Ahhh, subdir-objects.  Since this is so important to non-recursive
> makes, it would be useful if it was referenced in the "An Alternative
> Approach to Subdirectories" section of the documentation.
> 
> Please move the clean bug up in the priority level.  Automake has a
> non-recursive user now. :-)

We've had them for ages - I've been using it for 2 years now in
progressively increasing sizes. I keep meaning to get back to my
transforming include patch
(http://sources.redhat.com/ml/automake/2001-08/msg00112.html) to make
authoring them less unpleasant.

PR 373 is the bug I was referring to on clean - it's a general race
condition.
http://mail.gnu.org/archive/html/automake/2003-07/msg00064.html is a
relevant email in this list's archives.

> I suspect/believe that libtool will have some problems as well.

libtool has worked fine for me, with non recursive make for 2 years now.
Not to say there are not issues to find :}.

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.




Re: Non-recursive make & intermediate objects

2003-11-21 Thread Robert Collins
On Sat, 2003-11-22 at 07:12, Bob Friesenhahn wrote:

> So this bug is only present if SUBDIRS is used to cause the Makefile
> to also have a recursive aspect.

Yes - which projects that include other projects will need. Or for
things like test scripts, I find throwing them in a sandbox of sorts
much easier than a full recursive makefile - at least until I get back
to that proof of concept.

Rob
-- 
GPG key available at: .


signature.asc
Description: This is a digitally signed message part


failure building HEAD

2003-11-30 Thread Robert Collins
checking whether autoconf is installed... yes
checking whether autoconf works... no
configure: error: The installed version of autoconf does not work.
Please check config.log for error messages before this one.

I get the above configuring CVS automake.

from config.log:

configure:1819: eval autoconf --version
autoconf (GNU Autoconf) 2.58
Written by David J. MacKenzie and Akim Demaille.

Copyright (C) 2003 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
configure:1822: $? = 0 
configure:1830: result: yes
configure:1840: checking whether autoconf works
configure:1847: cd conftest && eval autoconf -o /dev/null conftest.ac
autom4te: cannot open /dev/null.tmp: Permission denied

Is this a 'need to use 2.59' thing? (If so, the error is misleading).

autoconf, invoked directly on other projects, works -just fine-.

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.





non recursive includes proof of concept #2

2003-12-01 Thread Robert Collins
Well, I finally snuck in a little time to update my proof of concept
for non-recursive includes.

Still, I don't code perl - and it shows ;).

How to use?

Grab CVS automake, apply the patch, drop the test files into the tests
subdir.

Have a look at the test cases to see how to use it.

What does it do?

It transforms macros and paths in an included file (called
Makefile.rules for now), to make them suitable for a non-recursive
build.

As shown by the test cases, this allows a couple of neat things:
1) A stub Makefile.am 
===
include \$(srcdir)/Makefile.rules
===
is all that is needed in a given subdirectory to generate a full
makefile. (Useful if you want to be able to cd to a given dir and
perform builds just in that dir).

2) File paths and canonical macro names are conveniently short - just
what Bob F has been (rightly, IMO) complaining about.

3) You don't end up with a huge Makefile.am to support; rather, each
part of the project has a small rules file.
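A sketch of the transformation being described, using hypothetical file names (the foo/bar pattern matches the example Rob gives when answering questions about the patch):

```makefile
# sub/Makefile.rules -- written as if it were local to sub/
bin_PROGRAMS = foo
foo_SOURCES = bar.c baz.c

# When pulled into the top-level Makefile.am via the transforming
# include, this behaves roughly as if it read:
#
#   bin_PROGRAMS = sub/foo
#   sub_foo_SOURCES = sub/bar.c sub/baz.c
```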

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.
Index: automake.in
===
RCS file: /cvs/automake/automake/automake.in,v
retrieving revision 1.1523
diff -u -p -r1.1523 automake.in
--- automake.in	30 Nov 2003 17:00:36 -	1.1523
+++ automake.in	1 Dec 2003 08:41:24 -
@@ -181,11 +181,27 @@ my $ELSE_PATTERN =
 my $ENDIF_PATTERN =
   '^endif(?:\s+(!?)\s*([A-Za-z][A-Za-z0-9_]*))?\s*(?:#.*)?' . "\$";
 my $PATH_PATTERN = '(\w|[/.-])+';
+my $INCLUDE_KEYWORD = 'include';
+my $SUBDIR_INCLUDE_KEYWORD = 'subdir_include';
 # This will pass through anything not of the prescribed form.
-my $INCLUDE_PATTERN = ('^include\s+'
+my $INCLUDE_PATTERN = ('^' . $INCLUDE_KEYWORD . '\s+'
 		   . '((\$\(top_srcdir\)/' . $PATH_PATTERN . ')'
 		   . '|(\$\(srcdir\)/' . $PATH_PATTERN . ')'
 		   . '|([^/\$]' . $PATH_PATTERN . '))\s*(#.*)?' . "\$");
+my $SUBDIR_INCLUDE_PATTERN = ('^' . $SUBDIR_INCLUDE_KEYWORD . '\s+'
+		   . '((\$\(top_srcdir\)/' . $PATH_PATTERN . ')'
+		   . '|(\$\(srcdir\)/' . $PATH_PATTERN . ')'
+		   . '|([^/\$]' . $PATH_PATTERN . '))\s*(#.*)?' . "\$");
+
+# Canonised variable suffixes
+my @canonised_macro_names =
+qw(SOURCES);
+# Canonised variable contents (foo->path/foo)
+my @canonised_macro_values = 
+qw(SOURCES);
+# Canonised macro lists (foo ->path_foo)
+my @canonised_macro_lists = 
+qw(PROGRAMS);
 
 # Match `-d' as a command-line argument in a string.
 my $DASH_D_PATTERN = "(^|\\s)-d(\\s|\$)";
@@ -216,7 +232,7 @@ my @common_files =
 	ansi2knr.1 ansi2knr.c compile config.guess config.rpath config.sub
 	configure configure.ac configure.in depcomp elisp-comp
 	install-sh libversion.in mdate-sh missing mkinstalldirs
-	py-compile texinfo.tex ylwrap),
+	py-compile texinfo.tex ylwrap Makefile.rules),
  @libtool_files, @libtool_sometimes);
 
 # Commonly used files we auto-include, but only sometimes.
@@ -1697,6 +1713,38 @@ sub handle_single_transform_list (@)
 return @result;
 }
 
+# $VALUE
+# transform_file_list ($PREPEND, @FILES)
+# 
+# insert $PREPEND before every file path that is not absolute
+#
+sub transform_file_list ($$)
+{
+  my ($prepend, $tmpfiles) = @_;
+  my $result = "";
+  my @files = ();
+  @files = split(/ /, $tmpfiles); 
+  while (scalar @files > 0)
+  {
+	$_ = shift @files;
+
+	if ($_ =~ s/^\$\(top_srcdir\)\///)
+	  {
+	$result .= " \$\(top_srcdir\)\/" . $_;
+	  }
+	  elsif ( $_ =~ s/^\$\(srcdir\)\///)
+	  {
+	$result .= " \$\(srcdir\)\/$prepend" . $_;
+	  }
+	  else
+	  {
+	$result .= " $prepend" . $_;
+	  }
+  }
+  verb "transformed value: '$result'\n";
+  return $result . "\n";
+}
+
 
 # $LINKER
 # define_objects_from_sources ($VAR, $OBJVAR, $NODEFINE, $ONE_FILE,
@@ -2145,7 +2193,7 @@ sub handle_programs
   # Canonicalize names and check for misspellings.
   my $xname = &check_canonical_spelling ($one_file, '_LDADD', '_LDFLAGS',
 	 '_SOURCES', '_OBJECTS',
-	 '_DEPENDENCIES');
+	 '_DEPENDENCIES', '_CFLAGS');
 
   $where->push_context ("while processing program `$one_file'");
   $where->set (INTERNAL->get);
@@ -2250,7 +2298,7 @@ sub handle_libraries
   # Canonicalize names and check for misspellings.
   my $xlib = &check_canonical_spelling ($onelib, '_LIBADD', '_SOURCES',
 	'_OBJECTS', '_DEPENDENCIES',
-	'_AR');
+	'_AR', '_CFLAGS');
 
   if (! var ($xlib . '_AR'))
 	{
@@ -2371,7 +2419,20 @@ sub handle_ltlibraries
   # Canonicalize names and check for misspellings.
   my $xlib = &check_canonical_spelling ($onelib, '_LIBADD', '_LDFLAGS',
 	'_SOURCES', '_OBJECTS',
-	'_DEPENDENCIES');
+	'_DEPENDENCIES', '_CFLAGS');
+
+# Tell the source code what library we are building
+#	my $tempvariable = '';
+#	if ( &variable_defined ($xlib . '_CFLAGS'))
+#	{
+#	# Define the lib_CFLAGS variable.
+#	$tempvariable .= &variable

Re: failure building HEAD

2003-12-01 Thread Robert Collins
On Mon, 2003-12-01 at 18:09, Alexandre Duret-Lutz wrote:
> >>> "Robert" == Robert Collins <[EMAIL PROTECTED]> writes:
> 
> [...]
> 
>  Robert> configure:1847: cd conftest && eval autoconf -o /dev/null conftest.ac
>  Robert> autom4te: cannot open /dev/null.tmp: Permission denied
> 
>  Robert> Is this a 'need to use 2.59' thing? (If so, the error is misleading).
> 
> It works fine with Autoconf 2.58.   I wonder if your version of Autoconf
> is patched, or if autoconf is a wrapper, or something.  I can't see where
> this `.tmp' suffix would come from.

I'm on debian unstable, so there is a wrapper - but I bypassed it with
the same results.

Two things combined got it going for me:
1) ./bootstrap
2) doing an in-source-tree configure.

I'm not sure which one fixed it.

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.





Re: Non-recursive make & maintenance issue

2003-12-01 Thread Robert Collins
On Fri, 2003-11-28 at 04:29, Bob Friesenhahn wrote:

> It is not a problem as long as Automake provides sufficient
> automatic translation capabilities.  There just needs to be a standard
> way to create definitions and refer to existing definitions, including
> those that Automake generates for its use.
> 
> In order to avoid confusion, Automake could adopt a GNU-make function
> style syntax which indicates where translations are required.
> 
> For example:
> 
>   $(xlate foo/bar++)_SOURCES
> 
> would be automatically translated by Automake to
> 
>   foo_bar___SOURCES
> 
> and substitution of existing Makefile defines (but not defines based
> on autoconf substituted values) should also work so the following
> should produce the same result:
> 
>   FOO_BAR=foo/bar++
>   $(xlate $(FOO_BAR))_SOURCES
> 
> This would be extremely useful since it would allow a package's
> directory organization to be re-arranged without excruciating pain.

Hmm, I'd prefer to do it via the include mechanism - see my crude, but
effective updated proof of concept - posted here a minute ago.

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: Non-recursive make & maintenance issue

2003-12-01 Thread Robert Collins
On Fri, 2003-11-28 at 03:49, Jirka Hanika wrote:


> My view is that these (and other) problems disappear if you use a
> per-directory Makefile.am; but I also see the benefits (esp. compilation
> speed) of a non-recursive Makefile.  So the solution could be to support
> generating a single Makefile from multiple Makefile.am's in
> subdirectories.  (Just kidding.  But interested in seeing the reasons
> why this is nearly impossible.)

It's completely possible - 2 years ago I did a proof of concept:

http://mail.gnu.org/archive/html/automake/2001-08/txt7.txt

And I've refreshed that today.

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


oops, regression - updated proof of concept - #2a

2003-12-01 Thread Robert Collins
A minor oversight led to a regression, which I caught when the test
cases finished running... here's a replacement patch. (Still use the
test cases from my previous email).

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.
Index: automake.in
===
RCS file: /cvs/automake/automake/automake.in,v
retrieving revision 1.1523
diff -u -p -r1.1523 automake.in
--- automake.in	30 Nov 2003 17:00:36 -	1.1523
+++ automake.in	1 Dec 2003 11:02:38 -
@@ -181,11 +181,27 @@ my $ELSE_PATTERN =
 my $ENDIF_PATTERN =
   '^endif(?:\s+(!?)\s*([A-Za-z][A-Za-z0-9_]*))?\s*(?:#.*)?' . "\$";
 my $PATH_PATTERN = '(\w|[/.-])+';
+my $INCLUDE_KEYWORD = 'include';
+my $SUBDIR_INCLUDE_KEYWORD = 'subdir_include';
 # This will pass through anything not of the prescribed form.
-my $INCLUDE_PATTERN = ('^include\s+'
+my $INCLUDE_PATTERN = ('^' . $INCLUDE_KEYWORD . '\s+'
 		   . '((\$\(top_srcdir\)/' . $PATH_PATTERN . ')'
 		   . '|(\$\(srcdir\)/' . $PATH_PATTERN . ')'
 		   . '|([^/\$]' . $PATH_PATTERN . '))\s*(#.*)?' . "\$");
+my $SUBDIR_INCLUDE_PATTERN = ('^' . $SUBDIR_INCLUDE_KEYWORD . '\s+'
+		   . '((\$\(top_srcdir\)/' . $PATH_PATTERN . ')'
+		   . '|(\$\(srcdir\)/' . $PATH_PATTERN . ')'
+		   . '|([^/\$]' . $PATH_PATTERN . '))\s*(#.*)?' . "\$");
+
+# Canonised variable suffixes
+my @canonised_macro_names =
+qw(SOURCES);
+# Canonised variable contents (foo->path/foo)
+my @canonised_macro_values = 
+qw(SOURCES);
+# Canonised macro lists (foo ->path_foo)
+my @canonised_macro_lists = 
+qw(PROGRAMS);
 
 # Match `-d' as a command-line argument in a string.
 my $DASH_D_PATTERN = "(^|\\s)-d(\\s|\$)";
@@ -216,7 +232,7 @@ my @common_files =
 	ansi2knr.1 ansi2knr.c compile config.guess config.rpath config.sub
 	configure configure.ac configure.in depcomp elisp-comp
 	install-sh libversion.in mdate-sh missing mkinstalldirs
-	py-compile texinfo.tex ylwrap),
+	py-compile texinfo.tex ylwrap Makefile.rules),
  @libtool_files, @libtool_sometimes);
 
 # Commonly used files we auto-include, but only sometimes.
@@ -1697,6 +1713,38 @@ sub handle_single_transform_list (@)
 return @result;
 }
 
+# $VALUE
+# transform_file_list ($PREPEND, @FILES)
+# 
+# insert $PREPEND before every file path that is not absolute
+#
+sub transform_file_list ($$)
+{
+  my ($prepend, $tmpfiles) = @_;
+  my $result = "";
+  my @files = ();
+  @files = split(/ /, $tmpfiles); 
+  while (scalar @files > 0)
+  {
+	$_ = shift @files;
+
+	if ($_ =~ s/^\$\(top_srcdir\)\///)
+	  {
+	$result .= " \$\(top_srcdir\)\/" . $_;
+	  }
+	  elsif ( $_ =~ s/^\$\(srcdir\)\///)
+	  {
+	$result .= " \$\(srcdir\)\/$prepend" . $_;
+	  }
+	  else
+	  {
+	$result .= " $prepend" . $_;
+	  }
+  }
+  verb "transformed value: '$result'\n";
+  return $result . "\n";
+}
+
 
 # $LINKER
 # define_objects_from_sources ($VAR, $OBJVAR, $NODEFINE, $ONE_FILE,
@@ -2145,7 +2193,7 @@ sub handle_programs
   # Canonicalize names and check for misspellings.
   my $xname = &check_canonical_spelling ($one_file, '_LDADD', '_LDFLAGS',
 	 '_SOURCES', '_OBJECTS',
-	 '_DEPENDENCIES');
+	 '_DEPENDENCIES', '_CFLAGS');
 
   $where->push_context ("while processing program `$one_file'");
   $where->set (INTERNAL->get);
@@ -2250,7 +2298,7 @@ sub handle_libraries
   # Canonicalize names and check for misspellings.
   my $xlib = &check_canonical_spelling ($onelib, '_LIBADD', '_SOURCES',
 	'_OBJECTS', '_DEPENDENCIES',
-	'_AR');
+	'_AR', '_CFLAGS');
 
   if (! var ($xlib . '_AR'))
 	{
@@ -2371,7 +2419,20 @@ sub handle_ltlibraries
   # Canonicalize names and check for misspellings.
   my $xlib = &check_canonical_spelling ($onelib, '_LIBADD', '_LDFLAGS',
 	'_SOURCES', '_OBJECTS',
-	'_DEPENDENCIES');
+	'_DEPENDENCIES', '_CFLAGS');
+
+# Tell the source code what library we are building
+#	my $tempvariable = '';
+#	if ( &variable_defined ($xlib . '_CFLAGS'))
+#	{
+#	# Define the lib_CFLAGS variable.
+#	$tempvariable .= &variable_value ($xlib . '_CFLAGS');
+#	&variable_delete ($xlib . '_CFLAGS');
+#	}
+#	my $libname_short = $xlib;
+#	$libname_short =~ s/_la$//  ;
+#	$libname_short = uc ($libname_short);
+#	&define_variable ($xlib . '_CFLAGS', ' -D' . $libname_short . '_COMPILATION ' . $tempvariable);
 
   # Check that the library fits the standard naming convention.
   my $libname_rx = "^lib.*\.la";
@@ -5413,19 +5474,62 @@ sub check_trailing_slash ($\$)
   return $$line =~ /\\$/;
 }
 
+# include ()
+# worker routine to include a file.
+#
+sub include(@)
+{
+  my ($path, $relative_dir, $where, $canonise, @include_stack) = @_;
+  my $prepend_path = "";
+
+  if ($path =~ s/^\$\(top_srcdir\)\///)
+{
+  push (@include_stack, "\$\(top_srcdir\)/$path");
+  error ("attempt to translate a top_sr

Re: Non-recursive make & maintenance issue

2003-12-01 Thread Robert Collins
On Tue, 2003-12-02 at 07:08, Bob Friesenhahn wrote:
> By 'read only', I mean that there is an existing source tree with no
> Makefile.am's (perhaps it uses some other build system) and you are
> not allowed to (or shouldn't) update it.  Since Automake supports
> subdirectories, the Makefile.am doesn't need to reside in the source
> tree and it doesn't care if files which would normally conflict with
> Automake already exist in the tree.
> 
> That was the case for my latest Automake expedition.

Ah, that makes more sense. If layered carefully the two approaches could
even be compatible.

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: Non-recursive make & maintenance issue

2003-12-01 Thread Robert Collins
On Tue, 2003-12-02 at 02:10, Bob Friesenhahn wrote:
> > Hmm, I'd prefer to do it via the include mechanism - see my crude, but
> > effective updated proof of concept - posted here a minute ago.
> 
> I like your include approach.  It helps convert existing recursive
> builds into non-recursive builds with minimum pain.  However, there
> are sufficient reasons to write only one Makefile (e.g. source tree is
> treated as "read only", or personal preference) that both mechanisms
> should be supported.

I'm not arguing against the single-big-file method... but I am curious:
how does a 'read only' source tree affect this? If there is a
Makefile.am in it that you want to use without alteration, you can just:
SUBDIRS = x y z   # not subdir
DIST_SUBDIRS = x y z subdir
subdir_include subdir/Makefile.am

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: failure building HEAD

2003-12-02 Thread Robert Collins
On Tue, 2003-12-02 at 21:44, Alexandre Duret-Lutz wrote:
> I think this is the problem.  Ben, you cannot write
> `$output.tmp' because when $output is /dev/null a user cannot
> create /dev/null.tmp.  This change breaks the configuration of
> all versions of Automake since 1.6 :(

Yah, so, the right way to do this would be to write to a safe temp file
in /tmp, and then move that to the destination?
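A minimal sketch of that suggestion, with hypothetical file names (this is not the actual configure code):

```shell
# Generate into a private temporary file, then move the finished
# result into place.  This avoids deriving a temp name from the
# destination path, which is what breaks when that path is /dev/null.
set -e
output=out.txt                      # hypothetical destination
tmp=$(mktemp) || exit 1
printf 'generated\n' > "$tmp"       # stand-in for the real autom4te run
mv -f "$tmp" "$output"
```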

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: Non-recursive make & maintenance issue

2003-12-09 Thread Robert Collins
On Wed, 2003-12-10 at 05:06, Tom Tromey wrote:

> It isn't impossible.  I once wrote up some ideas along these lines:
> 
> http://sources.redhat.com/ml/automake/2001-07/msg00248.html
> 
> Obviously I never got around to implementing this :-)

Have you looked at either of my proof-of-concepts?

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


non-recursive via includes

2003-12-17 Thread Robert Collins
Ok,
I plan to push this a little closer to completion (some feedback
from the maintainers would be greatly appreciated!)

I've created a branch for this in arch:

[EMAIL PROTECTED]/automake--nonrecursive--1.8

The arch repository is at http://people.initd.org/robertc/automake/

(GNU Arch is an RCS system - http://www.gnu.org/software/gnu-arch/)

I've also imported the CVS HEAD branch's entire history, to the branch
[EMAIL PROTECTED]/automake--HEAD--0. I'll be keeping HEAD
up to date, to keep my branch's delta minimised. If there's interest I'm
happy to generate a branch for CVS-1.8 and keep that up to date.

Now, the patch status is where it was before - it passes the two test
cases I created. I'll be adding a couple more complex test cases and
seeing what breaks tonight.

If anyone is interested in commit messages for this repository, drop me
a line.

Cheers,
Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


HEAD test suite - 6 failures.

2003-12-17 Thread Robert Collins
Are the following tests known to fail (on debian unstable):


FAIL: ccnoco.test
FAIL: gnits2.test
FAIL: gnits3.test
FAIL: pr300-lib.test
FAIL: pr300-prog.test
FAIL: python3.test

Cheers,
Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: HEAD test suite - 6 failures.

2003-12-19 Thread Robert Collins
On Thu, 2003-12-18 at 20:00, Alexandre Duret-Lutz wrote:
> >>> "Robert" == Robert Collins <[EMAIL PROTECTED]> writes:
> 
>  Robert> Are the following tests known to fail (on debian unstable):
> 
> Nein, no tests are known to fail.  What does VERBOSE=x say?

that the scripts in lib/ aren't chmodded correctly.

Perhaps 'make' should chmod them to the recommended mode?

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: HEAD test suite - 6 failures.

2003-12-19 Thread Robert Collins
On Sat, 2003-12-20 at 00:47, Alexandre Duret-Lutz wrote:

>  Robert> that the scripts in lib/ aren't chmodded correctly.
> 
> Why aren't they?  How did they lose their permissions?

Errm, that was my fault. An oversight in a cvs extracting tool, that I
wasn't aware of at the time.

>  Robert> Perhaps 'make' should chmod them to the recommended mode ?
> 
> I'd prefer we find (and possibly fix) the guilty tool that
> resets permissions.

CVS doesn't manage permissions - it only duplicates them at file
creation time. It seems fragile to me to expect this permission-ignorant
program to Do The Right Thing.

Still, my problem is fixed, and it won't bite me again.

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: non recursive includes proof of concept #2

2003-12-19 Thread Robert Collins
On Sat, 2003-12-20 at 00:41, Alexandre Duret-Lutz wrote:
> >>> "Robert" == Robert Collins <[EMAIL PROTECTED]> writes:
> 
> [...]
> 
>  Robert> It transforms macros and paths in an included file (called
>  Robert> Makefile.rules for now) , to make them suitable for a non-recursive
>  Robert> build.
> 
> I'm skeptical about whether this approach can be made to work
> intuitively.  More precisely I don't think it is generally
> possible to let users write a subdir Makefile.am fragment as if
> it would be run locally in the subdirectory, and translate
> *anything* so it actually works from another directory.  Let's
> mention user-defined rules referring rewritten variables, or
> flag variables including things such as -I.

Right. Having both in-dir and top-level Makefiles work may not be
possible. My key goal is the top-level Makefile assembled from the
included fragments, though - if the dual approach won't work, it won't.
That said, if the user can do it with only minor hoops to jump
through...

> Anyway you asked for comments on the patch so here are some.
> (I'm sorry I had to be brief in the end, because I have a train
> to catch in one hour.  I'll be away for one week.)
> 
> I don't see what in your changes requires the file to be named
> Makefile.rules.

It isn't required to be named that now - my original proof of concept,
2-odd years ago, started out with a fixed name, which I removed even
before posting back then. So - no requirement.

> This sounds neat.  But AFAICT no test case really cd into that
> given dir and perform the build.

I'll add a test case.

> [...]

> Judging from the comments, I understand that 
> bin_PROGRAMS = foo does not become path/foo?
> Or is it a typo in the example of @canonised_macro_lists?

Ah, it's:
bin_PROGRAMS = foo
foo_SOURCES = bar.c
->
bin_PROGRAMS = path/foo
path_foo_SOURCES = path/bar.c

> Since these variables will be used for membership checks, better
> use a hash

Ok, will try this out.


> Is this required?  `include' automatically distributes its
> argument, I presume you've preserved this for `subdir_include'.

Right, so it won't be required.

> Please use GNU-like 2-space indentation for new code (see HACKING).
> 
> The tail of all Perl files already sets up the indentation style
> for Emacs.  Maybe you can submit similar hints for your editor.

Hmm, I thought I had.

I'll stop replying here - I'm going to implement your suggestions as I
get time.

(an aside: some of the issues are holdovers from the 2 year old patch
I'd resurrected).

I'll drop a note here when I've done your suggestions.

Cheers
Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.





Re: Expressing dependencies

2004-01-03 Thread Robert Collins
On Sun, 2004-01-04 at 08:17, Laurence Finston wrote:

> The problem is that make makes certain assumptions that don't apply when CWEB
> is used.

I think that's an incorrect statement. It would be more accurate to say
that CWEB hasn't been built with any thought to the impact on make. Make
has only the file system data available to it to determine 'has X
changed more recently than Y'. config.status, for example, when it
regenerates config.h, will only alter the file if the contents have
changed - so that it preserves the timestamp. I think that most
pre-processors in this sense could benefit from a wrapper of some sort
that would equally not alter the file IF nothing had changed - and you
could use that wrapper directly in make rules for ctangle.
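A rough sketch of such a wrapper, in the spirit of the classic "move-if-change" helper scripts (file names here are illustrative):

```shell
# Replace the target only when the freshly generated content differs,
# so an unchanged target keeps its old timestamp and make does not
# rebuild everything that depends on it.
printf 'old\n' > target.h           # the existing header
printf 'old\n' > target.h.new       # freshly regenerated, but identical
if test -f target.h && cmp -s target.h.new target.h; then
  rm -f target.h.new                # unchanged: discard, keep timestamp
  echo "target.h unchanged"
else
  mv -f target.h.new target.h
  echo "target.h updated"
fi
```

Run as above, this prints `target.h unchanged`, since the two files are identical.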

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: Expressing dependencies

2004-01-04 Thread Robert Collins
On Mon, 2004-01-05 at 03:53, Laurence Finston wrote:

> This is essentially what I tried to do by using the auxiliary program
> `3DLDFcpl' in the rule for building the executable `3dldf' (roughly):
> 
> 3dldf: $(3DLDF_CWEBS)
>3DLDFcpl

That's not quite what I was suggesting.

> Not changing the timestamp of the .cxx file isn't enough. In fact, I change
> the name of the ctangle output from .c to .cxx based on
> whether the file has changed since the last version, and use the .cxx file in
> the rule for building the object files. This preserves the timestamp of the
> .cxx file, if the .c file doesn't contain any significant changes.
> 
> I actually had a problem with Automake assuming that 3DLDF was a C program
> rather than a C++ program because of the .c extension. Comparing the .c files
> to the .cxx files, renaming them, if appropriate, and using the .cxx files in
> the rules for building the .o files solves this problem. I think there's at
> least one other way of solving it, perhaps by means of an Autoconf variable,
> but I don't remember off-hand. 

Right, that's orthogonal though: if 3dldf.o is built from 3dldf.cxx
and 3dldf.h, and 3dldf.cxx and 3dldf.h are built from 3dldf.web, then we
can focus on the dependency issue - not the actual extensions.

> It's reasonable behavior for Automake to assume that the sources for a C++
> program are called .cc, .cxx, or .c++, but it is
> very restrictive to assume files need to be rebuilt based merely on the
> information that their prerequisites have a more recent timestamp. 

As I said, that's the /only/ information (cheaply) available to automake.
Alternatives include generating an md5 of the file and comparing that to
a calculated one, or other such has-content-changed tests. But make
isn't language-aware - it can't tell if a file change is 'meaningful'.
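A rough sketch of the checksum idea, with illustrative file names:

```shell
# Record a checksum of the generated file, and later compare content
# rather than timestamps to decide whether anything meaningful changed.
printf 'int x;\n' > data.h          # pretend this is the generated header
md5sum data.h > data.h.md5          # record its current checksum
# ... later, after data.h may have been regenerated ...
if md5sum --status -c data.h.md5; then
  echo "content unchanged"
else
  echo "content changed"
fi
```

Here it prints `content unchanged`, because nothing touched data.h between the two steps.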

> As Andrew
> Suffield pointed out in his posting, this problem affects Bison and Flex, too,
> which are probably used much more often for GNU software than CWEB.  I suspect
> there are other tools affected by this problem as well.

Yes - and the same problem applies - make assumes that the commands it
runs are for a single purpose - with no side effects.

> I don't think the problem lies with CWEB, or Bison and Flex, for that matter. 
> CWEB isn't GNU software, and I doubt whether the authors would appreciate
> being asked to "fix" it. We can't do anything about make, either. I think the
> problem should be solved within Automake. 

The problem lies in the concept of the header 'maybe changing'. If the
header is dependent on the .web source, then it can't be considered
correct if its timestamp is older than the .web source - because make
is meant to look at just modification dates. If you want to consider
file content changes, make has to generate some database (of sorts) to
track file content, and while we *could* do that in GNU Make, automake
targets POSIX make - so that would not be portable. So the rules that
will most likely work well for you are: no target to build the .h file
at all, and magic in a script to replace it if needed. Note that if the
header is replaced, you'll need to reinvoke make on the same dir to get
it to notice that reliably.

Rob


-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: pathnames containing spaces

2004-01-28 Thread Robert Collins
On Thu, 2004-01-29 at 00:08, Earnie Boyd wrote:
> Good luck with fixing the white space problems in every process that 
> reads arguments and uses white space as a delimiter of some sort.

Earnie has a very good point - GNU Arch faces the same problem with a
limited set of tools - patch, diff and tar (plus its own internals, of
course). We've come to the conclusion that whilst fixing
spaces-in-filenames, it would be sensible to address unicode glyphs at
the same time - so as to prevent auditing code twice (once to escape
spaces, once to escape unicode).

It's a big project no matter how you look at it... 

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: RFC: doc for `Handling Tools that Produce Many Outputs'

2004-01-31 Thread Robert Collins
On Sun, 2004-02-01 at 09:28, Alexandre Duret-Lutz wrote:
> This is a new section I'd like to add to the FAQ.  It has been
> discussed two or three times on the list.
> 
> I'm posting it here for comment.  (In fact I'm mainly hoping
> that some kind fellow will point out English mistakes...)

Cute. It reads fine to me.

Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part


Re: RFC: doc for `Handling Tools that Produce Many Outputs'

2004-02-04 Thread Robert Collins
On Thu, 2004-02-05 at 10:36, Eric Siegerman wrote:

> I believe this fails on the following corner case.  Suppose the
> date ordering is like this (with data.h being the oldest):
>   data.h   data.foo   data.c
> 
> data.h is out of date with respect to data.foo, so one wants to
> rebuild it, but I don't think that will happen:


Then data.c is not derived from data.foo. Or someone has manually edited
it - either of which is incorrect for this scenario. Try touching
Makefile.am, then editing Makefile.in...

Rob
-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.


signature.asc
Description: This is a digitally signed message part



Re: non-recursive make and tests

2004-08-30 Thread Robert Collins
On Mon, 2004-08-30 at 20:30 -0500, Bob Friesenhahn wrote:
> On Mon, 30 Aug 2004, Bob Friesenhahn wrote:
> >
> > It would be quite helpful if Automake offered a mode in which it 
> > automatically changed the working directory to the directory where the test 
> > program/script resides and set $srcdir to the relative position in the source 
> > tree to support VPATH builds.  This would emulate the operation of recursive 
> > builds.  Either a global Automake option could be used to enable this 
> > (subdir-tests), or a "RTESTS" (relative tests) mode would be provided.
> 
> It seems that this topic has not caught anyone's interest since there 
> have been no follow-up posts today.  Surely someone else has converted 
> their recursive project to a non-recursive project and noticed that it 
> is very difficult to get test suites working again?

I found it trivial. Mind you, all my tests are - by design -
self-contained.

Rob


signature.asc
Description: This is a digitally signed message part


Re: Relative path in CPPFLAGS and distcheck

2004-12-06 Thread Robert Lowe
Alexandre Duret-Lutz wrote:
> [Please reply to [EMAIL PROTECTED]]
>
> >>> "Robert" == Robert Lowe <[EMAIL PROTECTED]> writes:
>
>  Robert> Hi!
>  Robert> I have a set of common header files in includes/ and the following
>  Robert> line in configure.ac:
>  Robert> AC_SUBST(CPPFLAGS,[-I../includes])
>
> Should be
>   AC_SUBST([AM_CPPFLAGS], ['-I$(top_srcdir)/includes'])
> or
>   AC_SUBST([AM_CPPFLAGS], ['-I$(top_srcdir)/includes -I$(top_builddir)/includes'])
> if you have built headers.
>
> See the thread "RFC for new FAQ entry: Flag Variables Ordering"
> on the Automake lists to understand why redefining plain
> CPPFLAGS is wrong.

Thank you for the pointers!

>  Robert> ...since all source files are in parallel directories.
>
> (The above doesn't require this.)
Well, make distcheck is still failing, but not on every box I've tried
it on.  I thought I would distill everything down to a simple 'hello,
world' example (with a built header and a copy of stdio.h), which you
can find at:
  http://www.lawrence.edu/fast/lower/hello-0.5.tar.gz
  (it has to replicate there, but should be there by 20:30 GMT)
From that, configure.ac is:
#   -*- Autoconf -*-
# Process this file with autoconf to produce a configure script.
AC_INIT(hello,0.5)
AM_INIT_AUTOMAKE(hello,0.5)
AC_CANONICAL_HOST
# Checks for programs.
AC_PROG_CC
AC_PROG_INSTALL
AC_CREATE_STDINT_H([${srcdir}/includes/_stdint.h])
AC_SUBST(AM_CPPFLAGS,['-I$(top_srcdir)/includes -I$(top_builddir)/includes'])

AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([Makefile src/Makefile])
AC_OUTPUT
...and, Makefile.am:
SUBDIRS = src
EXTRA_DIST = reconf
noinst_HEADERS = includes/stdio.h
...and, src/Makefile.am:
bin_PROGRAMS = hello
hello_SOURCES = hello.c
hello_CFLAGS = -Wall -pedantic
A make distcheck with this example fails on my home system with FC3
running on an AMD Athlon, but *succeeds* on an Intel box running an
older Linux kernel, and older versions of autoconf/automake.  I know
that's a lot of variables.  Perhaps I've still not learned something
I should have, or am I running into some other problem?
*FAILS*:
Linux 2.6.9-1.667 #1 Tue Nov 2 14:50:10 EST 2004 x86_64 x86_64 x86_64 
GNU/Linux

$ autoconf --version
autoconf (GNU Autoconf) 2.59
...
$ automake --version
automake (GNU automake) 1.9.3
...
*SUCCEEDS*:
Linux 2.4.21-9.ELsmp #1 SMP Thu Jan 8 17:08:56 EST 2004 i686 i686 i386 
GNU/Linux

# autoconf --version
autoconf (GNU Autoconf) 2.57
...
# automake --version
automake (GNU automake) 1.6.3
...

[...]
 Robert> Also, is it perfectly legit to list these files in the top-level
 Robert> Makefile.am as EXTRA_DIST, rather than explicitly listing this
 Robert> subdirectory, adding a Makefile.am there, ... since there's
 Robert> nothing to build there?
Yes.  However you'd better use noinst_HEADERS instead of
EXTRA_DIST so that `make tags' and friends process the headers.
Again, thank you for the correction!
-Robert



Re: Automake and new tar

2004-12-28 Thread Robert Collins
On Thu, 2004-11-25 at 21:59 +0100, Christian Fredrik Kalager Schaller
wrote:
> Hi Automake hackers,
> 
> I am maintainer of a GNOME  module called gnome-themes-extras containing
> a set of metathemes for the GNOME desktop. After upgrading my distro I
> have been unable to 'make dist' gnome-themes-extras on Fedora due to tar
> complaining:
> tar: gnome-themes-extras-0.8.0/Gorilla/icons/scalable/mimetypes/gnome-
> mime-application-vnd.sun.xml.impress.svg: file name is too long (max
> 99); not dumped
> 
> I was told today that the problem is that the new tar actually follows
> the spec for the method that automake wants to use. So automake needs to
> be fixed to not use the -o option to tar. 
> 
> Is this a known problem, is there some bugzilla somewhere I can
> report this? Anyone know if/when a release which handles this is planed?

Apologies if I missed someone answering this, but it's hitting us
(squid-cache) as well; we can no longer do a make dist on up-to-date
systems.

Rob

-- 
GPG key available at: .
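
For the archive-format problem itself, Automake 1.9 and later let the package opt into a tar format without the 99-character name limit; a hedged configure.ac sketch (package name and version taken from the report above):

```
# configure.ac (sketch): the 99-character limit comes from the old v7 tar
# format; the tar-ustar option raises it to 255, and tar-pax removes it
# entirely (at the cost of requiring a pax-capable tar to extract).
AC_INIT([gnome-themes-extras], [0.8.0])
AM_INIT_AUTOMAKE([tar-ustar])
```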


signature.asc
Description: This is a digitally signed message part


Re: How to setup an example programs subdirectory?

2005-01-01 Thread Robert Collins
On Sat, 2005-01-01 at 20:24 -0500, Simon Perreault wrote:
> Hi,
> 
> I have a question for which I haven't been able to find an answer on my own, 
> using the usual resources (manual, google, etc).
> 
> My project uses automake and I want to have a directory containing example 
> programs. These programs should not be built on a simple "make", but could be 
> built on a "make examples" directive. How can I handle that?
> 
> Thanks!

IIRC - untested:
===
EXTRA_PROGRAMS=foo bar

examples: $(EXTRA_PROGRAMS)

===
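
Fleshing that snippet out, a sketch of a complete examples-directory Makefile.am along the same lines (program and source names are assumptions):

```makefile
# Makefile.am (sketch): EXTRA_PROGRAMS are built only on demand, so a
# plain "make" skips them and "make examples" builds them.
EXTRA_PROGRAMS = foo bar
foo_SOURCES = foo.c
bar_SOURCES = bar.c

examples: $(EXTRA_PROGRAMS)

# The manual suggests listing EXTRA_PROGRAMS in CLEANFILES so that
# "make clean" removes the built examples too.
CLEANFILES = $(EXTRA_PROGRAMS)
```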

Rob
-- 
GPG key available at: .




Re: Configuring automake says autoconf 2.58 or higher needed. Have au toconf 2.59 installed. What is/goes wrong?

2005-01-16 Thread Robert Collins
On Sun, 2005-01-16 at 07:01 -0500, Thomas Dickey wrote:
> On Sun, 16 Jan 2005, Ralf Corsepius wrote:
> 
> > On Sat, 2005-01-15 at 13:15 +0100, Alexandre Duret-Lutz wrote:
> >
> >> PS: I know this is not the first time, but I simply do not
> >> understand why you respond to bug reports without Cc: the
> >> reporter.
> > I normally respond CC:-ing the reporter on auto*.gnu.org lists, because
> > they tend to be unreliable. Not have done so in this case was just an
> > oversight.
> 
> otoh, when I do that, I usually get 2-3 complaints from people stating 
> that I shouldn't (ymmv).

They should configure their mail system appropriately so that they don't
see duplicates (for example, see 'man formail'). MS Exchange, for
instance, does this automatically.

From what I recall of the mailing list headers, it's not possible to tell
a priori who in the CC list is subscribed and who isn't, so to ensure
that all interested parties receive a copy, I always reply-to-all.

Rob

-- 
GPG key available at: .




Re: Configuring automake says autoconf 2.58 or higher needed. Have au toconf 2.59 installed. What is/goes wrong?

2005-01-17 Thread Robert Collins
On Mon, 2005-01-17 at 03:18 +, Andrew Suffield wrote:
> Only the
> sender can do anything better than this, because they're the only one
> with the necessary information.

It's not at all clear to me that they have sufficient information.

Rob

-- 
GPG key available at: .




python.m4

2005-01-28 Thread Robert White
I am new to this list and automake.  So, I apologize up front if I ask 
something stupid.

I have been studying AM_PATH_PYTHON, because I have been developing 
some macros to augment it since it does not provide the necessary 
information to compile python extensions or programs/libraries that 
embed python.

Within the python.m4 source, there is this statement: "Another macro is
required to find the appropriate flags to compile extension modules."
That macro is never defined, nor can I find any references to it.  I
have found a macro that is scattered about, called
AM_CHECK_PYTHON_HEADERS, which at first I thought came from this group;
however, it does not look like it.  Does anyone know which macro the
python.m4 documentation references and where it is located?

Thank you for your time.
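
(One hedged way to obtain extension-building flags without that missing macro is to ask the interpreter found by AM_PATH_PYTHON directly; a configure.ac sketch, with the output variable name being an assumption:)

```
AM_PATH_PYTHON([2.3])
# distutils.sysconfig is part of the Python standard library, so this
# works wherever $PYTHON does; it reports the C include directory needed
# to compile extension modules against that interpreter.
PYTHON_INCLUDES=`$PYTHON -c "import distutils.sysconfig as s; print s.get_python_inc()"`
AC_SUBST([PYTHON_INCLUDES])
```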



MCIS Website updated.

2000-08-18 Thread Robert Lim



Hi,
 
I'm writing to let you know that we have updated our website
recently (www.coal-ink.com). Available is the
latest news on the coal industry, market price information and a selection of
the latest import/export statistics. We will continue to improve and update this
site in the future. Please let me know of any comments or suggestions you have.
You may wish to add your website to our links page or place a link to our
website on your own page. Please feel free to forward this link to anyone who is
likely to be interested.
 
Kind regards,
 
Robert Lim
 
 
Email address: [EMAIL PROTECTED]
Tel: (+44) 01730 265095
Fax: (+44) 01730 260044
 
If you have received this email in error, we apologise for any 
inconvenience caused.


Directory navigation

2000-08-29 Thread Robert Boehne

How flexible is automake/autoconf about what
the directory structure of a project looks like?
I have a package that I am attempting to rewrite
to support automake.  I would like to keep its
directory structure though.
The package consists of 40 or so C++ shared libraries,
each one has a "source" directory that is basically
empty, each library has about 20 other directories
that contain the source for the shared object.
I would like to make the distribution tree look
like this:

project-root:
drv/  src/  inc/  make/  @host@/

drv:
# contains some of the source for each part
drv/SharedObject
drv/subpart1
drv/...
drv/subpartN

src:
# contains the rest of the source for each part
src/SharedObject
src/subpart1
src/...
src/subpartN

inc:
# contains all header files

make:
# contains top-level configure scripts and makefiles

@host@:
@host@/lib
# contains all shared objects
@host@/obj:
# these directories contain all the object files and
# auto-generated makefiles
@host@/obj/subpart1
...
@host@/obj/subpartN

Is it possible to have my top-level scripts and makefiles
in the above "make" directory?  Or is it absolutely
necessary to have them in the "project-root" directory?
Can I use a Makefile.am in one of the src/subpart
directories to create a Makefile in @host@/obj/subpart ?

I won't even mention the headaches that I will encounter
when I go to link the C++ shared libraries...   ;)

-- 
Robert Boehne Software Engineer
Ricardo Software   Chicago Technical Center
TEL: (630)789-0003 x. 238
FAX: (630)789-0127
email:  [EMAIL PROTECTED]




Re: [GSoC] Early design discussion for TAP/SubUnit support in automake.

2011-07-05 Thread Robert Collins
Very sorry for the slow response, been EBUSY with real-life.

On Sun, May 22, 2011 at 11:42 PM, Stefano Lattarini
 wrote:
> On Sunday 22 May 2011, Ralf Wildenhues wrote:
>> Hi Stefano, and sorry for the long delay,
>>
> No problem, you had warned me in due time about such possible delays this
> month; so there's really no need to apologize.
>
>> * Stefano Lattarini wrote on Fri, Apr 29, 2011 at 11:21:06AM CEST:
>> > Now that my GSoC application "automake - Interfacing with a test protocol
>> > like TAP or subunit" has been officially accepted, I'd like to start
>> > discussing with the community some early, high-level design and interface
>> > decisions.
>>
>>
>> >  1. Reuse parallel-tests "framework"
>> >  ---
>> >
>> >   The new TAP/SubUnit support should reuse as much of the current
>> >   parallel-tests implementation and semantics as possible.  In particular,
>> >   it should be able to run different test scripts in parallel, generate a
>> >   `.log' file for each test script and a "summarizing" `test-suite.log'
>> >   file, honour the make variables AM_TESTS_ENVIRONMENT, TESTS_ENVIRONMENT
>> >   and AM_COLOR_TESTS and the environment variable VERBOSE, and support
>> >   different extensions for the test scripts, with extension-specific "log
>> >   compilers" and flags (the stuff enabled by TESTS_EXTENSIONS,
>> >   _LOG_COMPILER, etc.).
>>
>> Sounds all sane.
>>
>> >   The XFAIL_TESTS variable might be still supported for the sake of
>> >   backward-compatibility (see below for the details), but it should be
>> >   deprecated, since TAP and SubUnit offer better and more granular ways
>> >   to express expected failures.
>>
>> OK.
>>
>> In another mail:
>> > Thinking again about this, it might be worth trying to be even more 
>> > consistent
>> > with the existing parallel-tests functionality, and use an 
>> > `ext_TEST_PROTOCOL'
>> > variable (or similar) instead of a global `tests-protocol' option.  With 
>> > some
>> > tweaking to the post-processing of `.log' files done in `lib/am/check.am' 
>> > (to
>> > generate `$(TEST_SUITE_LOG)'), this might allow greater code reuse and a 
>> > more
>> > consistent API.
>> >
>> > I've started experimenting with this idea, and I'm not seeing any obvious
>> > shortcoming right now.  I'm hoping I'll be able to post some experimental
>> > patches soon enough.
>>
>> Allowing to specify that per-test is a good idea for transitioning test
>> suites.
>>
> About this, in my first two "tentative" patches:
>  
> I've taken an even more general approach, allowing the developer to define
> and use his own program(s) to:
>  1. launch his test scripts,
>  2. interpreter their results,
>  3. display these results on screen, and
>  4. format and generate the log files.
> All of this is attainable simply by assigning a variable `LOG_WRAPPER'
> (and extension-specific counterparts of it), and, well, obviously
> providing a real "driver" script that obeys a minimal command-line
> interface (so that it can grasp the options the Automake-generated
> Makefiles pass to it).  Then we will hopefully be able to implement
> our TAP/SubUnit parsers on top of this feature (thus making it
> indirectly more tested, which is always good for a new feature).

If the subunit parser just gets all the output from the test script,
you might want to use the subunit parser itself: folk in the
bootstrap set couldn't (obviously), but most other environments can
use high-level languages freely. (I don't object to more parser
implementations existing; I'm just thinking about reuse where
possible.)
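
For reference, the streams such a per-script parser instance would consume are simple line protocols; a minimal TAP producer is just a plan line plus one result line per test. A sketch (test descriptions are invented):

```shell
#!/bin/sh
# Emit a minimal TAP stream: a plan line "1..N", then one "ok"/"not ok"
# line per test, optionally carrying a "# TODO" or "# SKIP" directive.
echo "1..3"
echo "ok 1 - parser accepts plan line"
echo "not ok 2 - known bug # TODO tracked upstream"
echo "ok 3 # SKIP not relevant on this platform"
```

A TAP consumer counts the result lines against the plan and treats TODO failures as expected, which is the finer-grained replacement for XFAIL_TESTS discussed above.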

>> I hope to look into the posted patches later today.
>>
> About this, please note that I might be AFK until this evening.  So have
> no haste.
>
>> >  2. New automake option `tests-protocol'
>> >  ---
>> >
>> >  The Tap/SubUnit support in the Automake-generated testsuite drivers
>> >  should be enabled by a new (argument-requiring) option `tests-protocol',
>> >  that will be used to specify the level of support for, detection of, and
>> >  enforcing of SubUnit/TAP streams.
>> >
>> >  The possible values for `tests-protocol' will be:
>> >   - tests-protocol=tap
>> >     All test scripts are expected to use the TAP protocol.
>> >   - tests-protocol=subunit
>> >     All test scripts are expected to use the SubUnit protocol.
>> >   - tests-protocol=adaptive
>>
>> The way you describe "adaptive", it sounds like it should rather be
>> named something like "detect" or "detected" or so.
>>
> I'd like to withdraw this proposal now that we can define per-extension test
> protocols.  Having our hypothetical "client developer" rename his test scripts
> as they get converted to use TAP/SubUnit is IMO better than us having to
> implement in Automake a probably non-trivial "metaparser" that could end up
> being scarcely used anyway.  WDYT?
>
>> >     Each test script is expect

Re: [GSoC] Early design discussion for TAP/SubUnit support in automake.

2011-07-06 Thread Robert Collins
On Thu, Jul 7, 2011 at 7:03 AM, Stefano Lattarini
 wrote:
> Hello Robert.
>> >>
>> > OTOH, I do believe this is a real concern, to be carefully addressed and
>> > tested for.  Thanks for bringing this up.
>>
>> For Both TAP and subunit the test script running needs to feed into a
>> single parser:
>>
> This is not possible with the current implementation/design of third-party
> test drivers in Automake.  Every test script listed in $(TESTS) is passed
> *separately* to dedicated instance of the proper test driver; this means
> that the output of *each* test script in $(TESTS) will be parsed by a
> different instance of the TAP (or SubUnit) parser.

Crossed wires: I meant each test script having a different instance: I
phrased it poorly. Put differently there should be a 1:1 mapping
between test script that runs and parser instances.

-Rob



*simple* example using Autotest

2012-02-07 Thread Robert Boehne

 All,

I'd like to start using Autotest in a project (that needs it badly) but 
the full-featured GNU M4 example is a bit hard to wrap my head around.
Can anyone suggest another project I could look at as an example, that 
has more basic/rudimentary Autotest use?


Thanks,

Robert



Re: *simple* example using Autotest

2012-02-08 Thread Robert Boehne

 On 02/07/12 16:46, Eric Blake wrote:

On 02/07/2012 03:08 PM, Robert Boehne wrote:

  All,

I'd like to start using Autotest in a project (that needs it badly) but
the full-featured GNU M4 example is a bit hard to wrap my head around.
Can anyone suggest another project I could look at as an example, that
has more basic/rudimentary Autotest use?

Automake is probably the wrong list to ask, since autotest is provided
by autoconf, and automake isn't using autotest.  But here are several
projects I know of that use autotest:

autoconf itself
m4
bison
libtool

some of them easier than others to figure out, but that should hopefully
help you get started on seeing how others have used it.


Right you are, this should have gone to the Autoconf list.  I was surprised
to see Libtool on this list, as back when I was actively developing it,
Autotest wasn't available (and hence, not used there).

Thanks Eric!



Re: should an empty "pkgdata_DATA" cause creation of $(pkgdatadir) by "make install"?

2012-03-13 Thread Robert Boehne

 On 03/13/12 07:30, Stefano Lattarini wrote:

[CC:ing Ralf, as I'd like to hear his opinion here]

Reference:
  <http://lists.gnu.org/archive/html/bug-gnulib/2012-03/msg00078.html>

On 03/13/2012 01:14 PM, Stefano Lattarini wrote:

Now that I think about it, I'm not sure whether it was done "by design" from
the beginning,


And I also missed Ralf's answer here:

  <http://comments.gmane.org/gmane.comp.gnu.gsasl.general/52>

So we're in a sort of a tie here: some users think that the current Automake
behaviour is a feature (and I lean toward that position), other ones (with
Ralf among them, apparently) believe it's a bug.  Hmmm.  What now?

Regards,
   Stefano


I would agree with the "feature" camp.  Users should be able to create
an empty $(pkgdatadir) - suppose that empty directory is populated by
other methods.  They should also be able to create no $(pkgdatadir) at
all, or a non-empty one.


IMHO - whether this was by design or by accident isn't important.  
What's more important is that the behavior is intuitive, and this 
behavior (to me) is intuitive.


Robert Boehne



Re: Dynamic package version numbers with Autoconf and Automake

2012-08-15 Thread Robert Boehne

On 08/15/12 08:45, Bob Friesenhahn wrote:

On Wed, 15 Aug 2012, Miles Bader wrote:


(3) The final version info is updated (using VCS info and/or autoconf
   version info) at make time using a script, and when it changes,
   only causes a source file (e.g., version.c) to change.

   This means that although some things are rebuilt after a commit
   (version.o, and relinking of any binaries that use it), the amount
   of rebuilding is relatively minor while still yielding accurate
   info.


Likewise, GraphicsMagick configures a "version.h" as well as a version 
file used for non-autotools builds under Windows.  With the currently 
used mechanism, only the few files depending on version.h need to be 
rebuilt but the whole project relinks.


If the project "config.h" was to be re-generated (seems to be 
necessary with new AC_INIT), then all of the source modules would need 
to be recompiled and relinked since everything depends on the 
configuration header.


Bob


I've had a similar complaint when using Autotest.  In my project, the 
test suite depends on an M4 input file that has the project version 
encoded in it.

 cat test_suite/package.m4
# Signature of the current package.
m4_define([AT_PACKAGE_NAME],  [my_server])
m4_define([AT_PACKAGE_TARNAME],  [my_server])
m4_define([AT_PACKAGE_MINOR_VERSION], 4.18)
m4_define([AT_PACKAGE_VERSION],  [4.18.5])
m4_define([AT_PACKAGE_STRING],  [my_server 4.18.5])

Which is created by a makefile rule as suggested in the autoconf 
documentation:


http://www.gnu.org/software/autoconf/manual/autoconf.html#Making-testsuite-Scripts

So when I change the version of the package, I have to autoreconf, then
make, then autoreconf again.
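
For that double-autoreconf annoyance, the Autoconf manual's rule can regenerate package.m4 at make time whenever configure.ac changes; a hedged sketch of such a Makefile.am fragment, adapted to the test_suite/ layout shown above (the MINOR_VERSION define would still need its own handling):

```makefile
# Makefile.am (sketch): rebuild package.m4 from configure's own
# substitutions, so a version bump needs only one autoreconf + make.
$(srcdir)/test_suite/package.m4: $(top_srcdir)/configure.ac
	:;{ \
	  echo '# Signature of the current package.' && \
	  echo 'm4_define([AT_PACKAGE_NAME], [$(PACKAGE_NAME)])' && \
	  echo 'm4_define([AT_PACKAGE_TARNAME], [$(PACKAGE_TARNAME)])' && \
	  echo 'm4_define([AT_PACKAGE_VERSION], [$(PACKAGE_VERSION)])' && \
	  echo 'm4_define([AT_PACKAGE_STRING], [$(PACKAGE_STRING)])'; \
	} >'$(srcdir)/test_suite/package.m4'
```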


Scenario #2 -

I have inherited a library versioning scheme that doesn't play nice with
Libtool (an absolute requirement) when the Automake name is not encoded
with the version:

  lib_LTLIBRARIES = libmy_server-4.18.la

So every time I change the version argument to AC_INIT, I have to search
around my makefiles for anything that references each library and make
the same changes there.

So I would be very interested in a solution to these issues.

Cheers,

Robert



Re: Cannot Create Executables (configure and -fPIC/-pic)

2012-08-21 Thread Robert Boehne

Jeffrey,

If you look at config.log it will show you the test program that it was 
attempting

to compile, and what errors that attempt generated.  It seems like your
compiler doesn't like "-fPIC -pic" and it has little to do with autotools.

HTH,

Robert

On 08/21/12 17:06, Jeffrey Walton wrote:

Hi All,

When I try and run configure with -fPIC -pic (as opposed to -fPIE
-pie), I receive "C compiler cannot create executables".

./configure CFLAGS="-Wall -Wextra -Wconversion -fPIC -pic
-Wno-unused-parameter -Wformat=2 -Wformat-security
-fstack-protector-all -Wstrict-overflow -Wl,-z,noexecstack
-Wl,-z,relro -Wl,-z,now"
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for gcc... gcc
checking whether the C compiler works... no
configure: error: in `/home/jeffrey/Desktop/sipwitch-1.3.1':
configure: error: C compiler cannot create executables
See `config.log' for more details

Because LDFLAGS cannot distinguish between executables and shared
objects, I was told to use -fPIC -pic: "using -fPIC instead of -fPIE
is always ok" [1].

Any ideas how to get automake and friends to build ASLR enabled
executables and shared objects in this manner?

Thanks in advance
Jeff

[1] Request: Add -aslr switch that invokes -fPIE/-pie or -fPIC/-shared
as appropriate, http://gcc.gnu.org/bugzilla/show_bug.cgi?id=52885





Re: problem with AM_PATH_PYTHON

2012-10-26 Thread Robert Boehne

On 10/26/12 09:04, Stefano Lattarini wrote:

Hi Václav, sorry for the delay.

On 10/15/2012 09:07 PM, Václav Zeman wrote:

Hi.

I am having a problem with AM_PATH_PYTHON. I have this in my configure.ac:


AS_IF([test "x$with_python" = "xyes"],
   [AM_PATH_PYTHON([2.3], [:], [:])
AX_SWIG_PYTHON
AC_CONFIG_FILES([swig/python/Makefile])])


The problem is that when I am cross compiling it ignores the if-block
and puts the code of AM_PATH_PYTHON outside. The configure process then
breaks like this:


configure: WARNING: using cross tools not prefixed with host triplet
checking pkg-config is at least version 0.9.0... yes
checking for python... /usr/bin/python
checking for a version of Python >= '2.1.0'... yes
checking for the distutils Python package... yes
checking for Python include path... -I/usr/include/python2.7
checking for Python library path... -L/usr/lib -lpython2.7
checking for Python site-packages path... /usr/lib/python2.7/dist-packages
checking python extra libraries... -lssl -lcrypto  -lssl -lcrypto \
   -L/usr/lib -lz -lpthread -ldl  -lutil
checking python extra linking flags... -Xlinker -export-dynamic -Wl,-O1 \
   -Wl,-Bsymbolic-functions
checking consistency of all components of python development environment... no
configure: error: in /home/wilx/log4cplus-bzr/work-trunk/objdir-mips-linux-gnu:
configure: error:
   Could not link test program to Python. Maybe the main Python library has been
   installed in some non-standard library path. If so, pass it to configure,
   via the LDFLAGS environment variable.
   Example: ./configure LDFLAGS="-L/usr/non-standard-path/python/lib"


This doesn't look like output from the Automake-provided AM_PATH_PYTHON
macro...  Could you please post your config.log, configure and aclocal.m4
files (all compressed, please), as well as the exact command you've used
to generate them?  Maybe we'll have a better chance to understand what is
going on ...

Thanks,
   Stefano


That is because it's coming from this line in AX_SWIG_PYTHON in the 
autoconf-archive.


AC_REQUIRE([AX_PYTHON_DEVEL])




Re: install-strip variant that strips then installs?

2013-05-13 Thread Robert Boehne
Library paths are hard coded at link time.  For that reason, on some platforms 
Libtool relinks binaries during install.  Because "prefix" et al. can be set 
at make time, stripping has to be done on the installed binary, which may 
not exist until install time.

So strip before install would not be portable.

HTH,

Robert Boehne


Gavin Smith  wrote:

>On Tue, May 7, 2013 at 5:12 PM, Rhys Ulerich 
>wrote:
>> I gather that 'make install-strip' installs and then strips binaries.
>> Is there some variant that reverses the order?  If not, any
>> recommendations for how to write one in an Automake-compliant manner?
>>
>> My unstripped binaries are absurdly large and my installation
>> directory is NFS-mounted.  So I get to pay lots of network overhead
>to
>> install what eventually becomes O(100MB) of binaries because the
>> unstripped copy is O(1.5GB).
>>
>> Thanks,
>> Rhys
>>
>
>This seems like a good idea to me. Is there any reason why the order
>couldn't be reversed?
>
>The only problem I can think of is that make install-strip isn't
>expected to modify the binaries in the build directory, and the user
>might conceivably be relying on them being unstripped (for some
>obscure reason). If that could be a problem, perhaps a solution is to
>have a separate "strip" rule which could be run.
>
>You could try writing a rule yourself in your Makefile.am to strip the
>binaries. You could use the bin_PROGRAMS make variable that is set in
>the output Makefile.

-- 
Sent from my Android phone with K-9 Mail. Please excuse my brevity.
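
For context on why the stock target behaves this way, and a sketch of Gavin's hand-written rule idea (the rule name strip-local is an assumption, not an Automake hook):

```makefile
# What "make install-strip" effectively runs, per automake's generated
# rules: the normal install, with an install program that strips as it
# copies, so stripping happens on the destination copy after any relink:
#   $(MAKE) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" install
#
# Gavin's suggestion as a hand-written Makefile.am rule (name assumed);
# note it strips the build-tree binaries in place, so only use it when
# you don't need unstripped copies for debugging:
strip-local:
	strip $(bin_PROGRAMS)
```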


Re: Help with static linking

2013-05-31 Thread Robert Boehne
Statically linking libc  is a recipe for disaster, so either read and 
understand why, or just take my word for it.

I don't quite understand why you think you need the rest linked statically, BUT 
the easiest way to do that would be to add LT_INIT to configure.ac to use 
Libtool, and add -static-libtool-libs to the target's LDFLAGS.

That will cause all of the Libtool libraries to be linked statically when 
possible.

If you are only targeting Linux desktop systems, png, gobject, gio, and glib 
should already be there, and in most cases already in memory, so you will 
benefit from zero additional memory use for the code pages.  This also goes for 
all the dependencies of these libraries.  I'm not familiar with zzip, so if it 
isn't a Libtool library you will have to make sure it is linked like this:
-Wl,-static -lzzip -Wl,-call_shared

I don't have a computer in front of me, so YMMV; you should man ld to make sure 
those flags are correct.

HTH,

Robert Boehne
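
(A sketch of the Makefile.am side of this advice, with the target name and library list assumed from the thread; -Bstatic/-Bdynamic are the GNU ld spellings of static/call_shared, so check man ld for your linker:)

```makefile
# Makefile.am (sketch): with LT_INIT in configure.ac, libtool libraries
# listed in LDADD are linked statically because of -static-libtool-libs.
bin_PROGRAMS = myproduct
myproduct_LDFLAGS = -static-libtool-libs
# For a plain (non-libtool) library such as zzip, toggle the linker's
# search mode around it; everything after -Wl,-Bdynamic is shared again.
myproduct_LDADD = -Wl,-Bstatic -lzzip -lpng -Wl,-Bdynamic
```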



Kip Warner  wrote:

>Hey lists,
>
>Sorry for posting on both autoconf and automake lists. I wasn't sure
>which one would be more appropriate for this problem.
>
>I know this has come up before, judging by the archives, but I cannot
>figure out the best way to have my executable statically link against
>certain dependencies. This is needed because it executes off of optical
>media and I cannot always guarantee that the user's runtime environment
>will have the needed dependencies and shipping them shared would be a
>maintenance nightmare.
>
>The dynamic dependencies, according to objdump, are the following...
>
>Dynamic Section:
>  NEEDED   libgio-2.0.so.0
>  NEEDED   libgobject-2.0.so.0
>  NEEDED   libglib-2.0.so.0
>  NEEDED   libzzip-0.so.13
>  NEEDED   libpng12.so.0
>  NEEDED   libstdc++.so.6
>  NEEDED   libm.so.6
>  NEEDED   libgcc_s.so.1
>  NEEDED   libpthread.so.0
>  NEEDED   libc.so.6
>
>libc, pthreads, the C++ runtime, etc., are safe to assume are
>available,
>but the rest I'd like to statically link against. Actually, I'd prefer
>to statically link against everything that I can if possible. But the
>ones for certain I know I should be able to statically link against are
>at least libzzip and libpng.
>
>I know there a number of different approaches to doing this, but from
>the pieces scattered in various places, it was difficult to determine
>the most reliable and recommended approach. For instance, I've tried
>'myproduct_LDADD = $(LIBINTL) -static', but objdump still reports all
>of
>the above dynamic dependencies, so maybe it's not doing what I thought
>it was suppose to do.
>
>This is my configure.ac:
>  <http://rod.gs/Jwo>
>
>This is my Makefile.am:
>  <http://rod.gs/Lwo>
>
>Any help appreciated.
>
>Respectfully,
>
>-- 
>Kip Warner -- Software Engineer
>OpenPGP encrypted/signed mail preferred
>http://www.thevertigo.com
>
>
>
>
>___
>Autoconf mailing list
>autoc...@gnu.org
>https://lists.gnu.org/mailman/listinfo/autoconf

-- 
Sent from my Android phone with K-9 Mail. Please excuse my brevity.


Re: libtool libraries requiring other libraries

2013-07-16 Thread Robert Boehne

Steffen,

I would suggest asking questions about Libtool on a libtool mailing list.

That said - It looks to me like you're not *using* libtool to do your 
linking.

Libtool's la files contain all the dependencies that a library needs.
This is a big help when using static archives because they need
to have everything listed in the proper order on the link line.
If you add

LT_INIT

to configure.ac your problem should go away.

HTH,

Robert Boehne
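
(The usual shape of the fix, sketched across both packages with the names from Steffen's mail; this assumes libfoo is built with libtool:)

```makefile
# libfoo's Makefile.am: record librt inside libfoo.la itself, so anything
# linked against libfoo pulls in -lrt automatically and in the right order.
lib_LTLIBRARIES = libfoo.la
libfoo_la_SOURCES = foo.c
libfoo_la_LIBADD = -lrt

# progbar's Makefile.am: libraries belong in LDADD rather than LDFLAGS,
# so they appear after the objects on the link line ($(FOO_LDFLAGS) here
# is the "-L... -lfoo" value substituted by configure in the thread).
bin_PROGRAMS = progbar
progbar_LDADD = $(FOO_LDFLAGS)
```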

On 07/16/13 09:56, Steffen Sledz wrote:

Sorry if this is a FAQ, but I didn't find a clear answer searching around.

Given situation: We have a library called e.g. libfoo which uses e.g. mq_open 
from librt. Then we have a program called e.g. progbar which requires libfoo 
(and therefore also librt).

Our current solution looks like this, but it does not work, because linking 
progbar is missing the -lrt option.

progbar/configure.ac:
-> snip <-
...
AC_CHECK_LIB([rt], [mq_open], [], [ AC_MSG_ERROR([*** Could not find librt 
***]) ])
...
if test -n "$with_libfoo_path" ; then
   FOO_LDFLAGS="-L$with_libfoo_path/lib -lfoo"
   FOO_CPPFLAGS="-I$with_libfoo_path/include"
else
   AC_CHECK_LIB([foo], [foo_init], [], [ AC_MSG_ERROR([*** Could not find 
libfoo, use --with-libfoo-path option ***]) ], [-lrt])
fi
AC_SUBST(FOO_LDFLAGS)
AC_SUBST(FOO_CPPFLAGS)
...
-> snap <-

progbar/src/Makefile.am:
-> snip <-
bin_PROGRAM = progbar
...
progbar_CPPFLAGS = $(FOO_CPPFLAGS)
progbar_LDFLAGS = $(FOO_LDFLAGS)
...
-> snap <-

Where do we have to make which checks?

Do we need a check for librt in the libfoo package?

Do we have to mention the librt link option somewhere in the progbar package?

Thx for any help,
Steffen

PS: FUP set to 





Re: passing flags to C compiler but not to C++ compiler

2013-07-27 Thread Robert Boehne


On 07/27/13 00:32, Vincent Torri wrote:

hello

in my sources, i have both C and C++ files. Something like

my_lib_la_SOURCES = foo.c foo.cpp

I want to pass, for example, -Wdeclaration-after-statement to gcc. Hence,
when compiling foo.cpp, i have the warning :

cc1plus.exe: warning: command line option '-Wdeclaration-after-statement'
is valid for C/ObjC but not for C++

How can I remove this warning when compiling c++ files ?

thank you

Vincent Torri


I misread - for C and not C++ you should set "CFLAGS"

./configure CFLAGS="-g -O2 -Wdeclaration-after-statement" CXXFLAGS="-g -O2"

for example.  If you want to pass something to the preprocessor,
it's used by both C and C++, use CPPFLAGS
CPPFLAGS="-DSYMBOL=value -I/path/to/headers"

You can set them in the environment, but if you pass them after
configure on the command line, those settings will be saved in config.log
so you can see what they were set to when you built.

HTH,

Robert Boehne
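
(The same per-language separation is also available to the package maintainer in Makefile.am; a sketch, with only the warning flag taken from the thread and the rest assumed:)

```makefile
# Makefile.am (sketch): AM_CFLAGS is added only to C compilations,
# AM_CXXFLAGS only to C++ compilations, and AM_CPPFLAGS to both
# (it goes to the preprocessor).
AM_CFLAGS = -Wdeclaration-after-statement
AM_CXXFLAGS = -Wall
AM_CPPFLAGS = -I$(top_srcdir)/include
```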



Re: passing flags to C compiler but not to C++ compiler

2013-07-27 Thread Robert Boehne
You add that flag to CXXFLAGS on the configure command line.

 configure CXXFLAGS=-Wdeclaration-after-statement

Along with any other options you need.


Vincent Torri  wrote:
>hello
>
>in my sources, i have both C and C++ files. Something like
>
>my_lib_la_SOURCES = foo.c foo.cpp
>
>I want to pass, for example, -Wdeclaration-after-statement to gcc.
>Hence,
>when compiling foo.cpp, i have the warning :
>
>cc1plus.exe: warning: command line option
>'-Wdeclaration-after-statement'
>is valid for C/ObjC but not for C++
>
>How can I remove this warning when compiling c++ files ?
>
>thank you
>
>Vincent Torri

-- 
Sent from my Android phone with K-9 Mail. Please excuse my brevity.


Re: passing flags to C compiler but not to C++ compiler

2013-07-29 Thread Robert Boehne

On 07/27/13 16:26, Vincent Torri wrote:

On Sat, Jul 27, 2013 at 10:24 PM, Robert Boehne  wrote:


On 07/27/13 00:32, Vincent Torri wrote:

hello

in my sources, i have both C and C++ files. Something like

my_lib_la_SOURCES = foo.c foo.cpp

I want to pass, for example, -Wdeclaration-after-statement to gcc. Hence,
when compiling foo.cpp, i have the warning :

cc1plus.exe: warning: command line option '-Wdeclaration-after-statement'
is valid for C/ObjC but not for C++

How can I remove this warning when compiling c++ files ?

thank you

Vincent Torri


I misread - for C and not C++ you should set "CFLAGS"

./configure CFLAGS="-g -O2 -Wdeclaration-after-statement" CXXFLAGS="-g -O2"

it will still not work as $(CFLAGS) will be passed to the compilation
of the c++ source file

Vincent Torri


Are you saying you've tried it?  That's not my understanding of how Automake
(or GNU Make) works.  Unless the project has overridden the default
use for these variables, that's how it's supposed to work, and
this problem is an example of why.

This is the relevant section of the manual:

http://www.gnu.org/software/automake/manual/automake.html#Standard-Configuration-Variables


for example.  If you want to pass something to the preprocessor,
it's used by both C and C++, use CPPFLAGS
CPPFLAGS="-DSYMBOL=value -I/path/to/headers"

You can set them in the environment, but if you use the syntax where it's after
the configure in the command, those settings will be saved in config.log
so you can see what they were set to when you built.

HTH,

Robert Boehne





Re: passing flags to C compiler but not to C++ compiler

2013-07-29 Thread Robert Boehne

On 07/29/13 10:55, Vincent Torri wrote:

On Mon, Jul 29, 2013 at 5:17 PM, Robert Boehne  wrote:

On 07/27/13 16:26, Vincent Torri wrote:

On Sat, Jul 27, 2013 at 10:24 PM, Robert Boehne 
wrote:


On 07/27/13 00:32, Vincent Torri wrote:

hello

in my sources, i have both C and C++ files. Something like

my_lib_la_SOURCES = foo.c foo.cpp

I want to pass, for example, -Wdeclaration-after-statement to gcc.
Hence,
when compiling foo.cpp, i have the warning :

cc1plus.exe: warning: command line option
'-Wdeclaration-after-statement'
is valid for C/ObjC but not for C++

How can I remove this warning when compiling c++ files ?

thank you

Vincent Torri


I misread - for C and not C++ you should set "CFLAGS"

./configure CFLAGS="-g -O2 -Wdeclaration-after-statement" CXXFLAGS="-g
-O2"

it will still not work as $(CFLAGS) will be passed to the compilation
of the c++ source file

Vincent Torri


Are you saying you've tried it?

1) yes i tried it

2) the fact that I have the warning proves that CFLAGS is passed to
the C++ compiler...

Vincent Torri



  That's not my understanding of how Automake
(or GNU Make) works.  Unless the project has overridden the default
use for these variables, that's how it's supposed to work, and
this problem is an example of why.

This is the relevant section of the manual:

http://www.gnu.org/software/automake/manual/automake.html#Standard-Configuration-Variables




Then that would be a bug, and you should post enough information
that it can be reproduced, like a tarball of your sources and which 
version you're using.





Generated Makefile fails to build

2014-09-19 Thread Robert Parker
>> make
make  all-am
make[1]: Entering directory `/home/bob/Documents/Programs/Pwordsaver'
gcc -Wall -Wextra -g -O2 -lmhash  -o pwordsaver pwordsaver.o fileutil.o
pwordsaver.o: In function `dohash':
/home/bob/Documents/Programs/Pwordsaver/pwordsaver.c:220: undefined
reference to `mhash_init'
/home/bob/Documents/Programs/Pwordsaver/pwordsaver.c:226: undefined
reference to `mhash'
/home/bob/Documents/Programs/Pwordsaver/pwordsaver.c:228: undefined
reference to `mhash_deinit'
collect2: error: ld returned 1 exit status

However when I alter the gcc line to:
gcc -Wall -Wextra -g -O2 -o pwordsaver pwordsaver.o fileutil.o -lmhash
and use that command by hand the compilation completes successfully.

Here is my Makefile.am
AM_CFLAGS=-Wall -Wextra

bin_PROGRAMS=pwordsaver
pwordsaver_SOURCES=pwordsaver.c fileutil.c fileutil.h

AM_LDFLAGS = -lmhash

man_MANS=pwordsaver.1
EXTRA_BUILD=pwordsaver.1

This problem seems to be a 'feature' of gcc because the same error happens
when compiling by manual commands.

What must I do please?


Re: Generated Makefile fails to build

2014-09-19 Thread Robert Parker
On Sat, Sep 20, 2014 at 6:16 AM, Bob Friesenhahn <
bfrie...@simple.dallas.tx.us> wrote:


> A library is not a linker flag so it does not belong in LDFLAGS. Look into
> using a 'LIBADD' type option instead.
>
Thanks for pointing me in the right direction.
The bad line in my Makefile.am has now been replaced with:

pwordsaver_LDADD=-lmhash

and everything just works.

My searching came up with: use LIBADD when building libraries and LDADD
when building programs.
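
That distinction can be sketched in Makefile.am roughly like this (only the pwordsaver lines correspond to the thread; the libfoo lines are an invented illustration):

```makefile
# Programs link extra libraries via <target>_LDADD:
bin_PROGRAMS = pwordsaver
pwordsaver_SOURCES = pwordsaver.c fileutil.c fileutil.h
pwordsaver_LDADD = -lmhash

# Libraries pull in extra objects/libraries via <target>_LIBADD:
lib_LTLIBRARIES = libfoo.la
libfoo_la_SOURCES = foo.c
libfoo_la_LIBADD = -lmhash
```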

Bob


--


Re: Generated Makefile fails to build

2014-09-19 Thread Robert Parker
On Sat, Sep 20, 2014 at 6:46 AM, Warren Young  wrote:

> Just stepping back a bit, some linkers are more tolerant than others about
> the order of flags and such.  Libraries that land in standard locations
> like /usr/lib also obscure this issue, by rendering irrelevant an
> improperly-placed or -constructed -L flag.
>
> Because of this, you will find a fair number of bad Makefile.am examples
> out in the wild.  You don't find out that you've been emulating one of
> these bad examples until you try your package on a system with a stricter
> linker or one that places libraries you need in odd places.
>

True enough. I did get my bad Makefile text by conscientious searching.
However the problem is now solved.
Thanks,
Bob


What is minimum set of Automake work files needed for distribution on github?

2015-09-28 Thread Robert Parker
I need to meet the requirements of 2 sets of users: the ordinary user who
is only interested in `./configure; make; make install`, and the power
users who want to start with `autoreconf`.

So far google search on the topic has only increased my confusion.

-- 
The Bundys, Cliven, Ted and Al. Great guys to look up to.


Re: What is minimum set of Automake work files needed for distribution on github?

2015-09-28 Thread Robert Parker
Thanks, that does make sense.

On Mon, Sep 28, 2015 at 10:22 PM, Eric Blake  wrote:

> On 09/28/2015 04:20 AM, Robert Parker wrote:
> > I need to meet the requirements of 2 sets  of users, the ordinary user
> who
> > is only interested `./configure; make; make install` and the power users
> > who want to start with `autoreconf`.
> >
> > So far google search on the topic has only increased my confusion.
>
> The most common solution: don't store anything but configure.ac and
> Makefile.am in git.
>
> The power user checks out git, and runs 'autoreconf' to bootstrap the
> project, then runs 'make dist' to create a self-contained tarball.
>
> The ordinary user takes a tarball, unpacks it, and runs './configure;
> make; make install' without needing any autotools installed.
>
> Ordinary users therefore do NOT use direct git checkouts.  Working with
> git is reserved for power users.
>
> --
> Eric Blake   eblake redhat com+1-919-301-3266
> Libvirt virtualization library http://libvirt.org
>
>
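
A repository following this advice typically ignores the generated files; a sketch of a .gitignore for a simple Autotools project (the file list is the usual autoreconf/configure output, adjust to your project):

```
# generated by autoreconf
configure
aclocal.m4
autom4te.cache/
Makefile.in
install-sh
missing
depcomp
compile
# generated by configure
Makefile
config.log
config.status
config.h
```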




Re: What is minimum set of Automake work files needed for distribution on github?

2015-09-30 Thread Robert Parker
Thanks guys for your well considered replies.
You gave me plenty to think about.
Presently my github projects have everything that make dist generates.

Clearly the consensus is strongly against doing that, so in time I will
reduce my github repos to your recommendations.

I still want to distribute tarballs for those who want to work that way. I
have a web site doing just about nothing right now so I think I'll upload
the tarballs there.

Bob

On Tue, Sep 29, 2015 at 12:20 PM, Russ Allbery  wrote:

> Bob Friesenhahn  writes:
>
> > The main problem with this is that not all users wanting to build from
> > sources in the repository have the correct autotools versions.  Even if
> > they do have the correct autotools versions, the sources may have been
> > hacked by an OS Autotools package maintainer to work differently.
>
> This doesn't work for all projects, but I've had a lot of success in the
> past in automating the process I use to generate a tarball release (which
> might be as simple as make dist), and just run it daily or weekly and
> publish the resulting tarballs somewhere.  That lets people who really
> want to run source repository snapshots still have a distribution-ready
> tree that doesn't require autotools.
>
> I intensely dislike committing autotools output to the repository.  It's a
> matter of taste, honestly, and some people just do that, but I hate large,
> generated files in version control since they're often not mergable and
> cause all sorts of irritating issues on branching, commits from systems
> with slightly different tool versions, etc.
>
> --
> Russ Allbery (ea...@eyrie.org)  
>
>




Automake and bitcode files

2015-10-21 Thread Robert Szewczyk
Hi,

I have a project that brings in several conflicting options to
automake/autotools and I'm looking for the best way to resolve the problem:

* I am attempting to emit LLVM bitcode.  The configure script helpfully
determines that the OBJEXT for the files in question is .bc

* the Makefile.am files specify per-object compilation flags, which causes
automake to emit rules of the form:

foo_foo.o: foo.c
   #compile the object etc.

* when running the resulting makefile, the system complains that there is
no rule to build foo_foo.bc

This problem results from automake hardcoding the extension for the
transformed object files: in automake, within the handle_single_transform
subroutine, there is a comment:

 # Only use $this_obj_ext in the derived
 # source case because in the other case we
 # *don't* want $(OBJEXT) to appear here.
 ($derived_source ? $this_obj_ext : '.o'),

I also note that automake emits a hardcoded rule for the .obj extension.

I could see several ways to solve my problem:

* edit automake code itself to the effect of:

   ($derived_source ? $this_obj_ext : $obj),
  that would emit rules of the form:
  foo_foo.$(OBJEXT): foo.c
  #compile the object etc.

  but clearly when the patch was introduced (commit 081f2d51, back in
2001) there was a good reason to hardcode .o here.  Does this reason
still exist?
* Follow a mechanism similar to emitting .obj rules to emit .bc rules
* override OBJEXT for bitcode files and pretend that those .bc files should
really be called .o

Any other suggestions?
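
For concreteness, the per-object flags that trigger the renamed-object rule look like this in a Makefile.am (a sketch; the target name and flag are illustrative):

```makefile
noinst_PROGRAMS = foo
foo_SOURCES = foo.c
# Any per-target flags make Automake rename the objects (foo_foo.o above)
# and emit the rule with a literal .o rather than $(OBJEXT).
foo_CFLAGS = -emit-llvm
```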

Best regards,

rob


tags fail with dummy source for libtool

2019-12-02 Thread Robert Sachunsky
Hi all,

I am using software that specifies a nodist_EXTRA_*_SOURCES dummy source
file in its top-level Makefile.am, as recommended by the Automake manual
to get the C++ linker to combine several libtool convenience libraries
for subdirectories:

https://www.gnu.org/software/automake/manual/html_node/Libtool-Convenience-Libraries.html

https://github.com/tesseract-ocr/tesseract/blob/14c6e387239a679ac5b58f126ab96e851a5e6624/Makefile.am#L137


However, this also causes the `tags`, `ctags` and `cscope` targets to fail:

--->8---
Making tags in .
make[1]: Entering directory 'tesseract/build'
make[1]: *** No rule to make target 'dummy.cxx', needed by 'tags-am'.  Stop.
make[1]: Leaving directory 'tesseract/build'
Makefile:4392: recipe for target 'tags-recursive' failed
---8<---


Does anyone know how to repair this (i.e. get both the desired linking
_and_ recursive tag generation rules)?
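
For readers finding this thread later, the manual's pattern in use looks roughly like this (a sketch based on the linked section, with invented names); it is the nodist_EXTRA line that the tags targets then try, and fail, to resolve:

```makefile
noinst_LTLIBRARIES = sub/libsub.la
lib_LTLIBRARIES = libtop.la
libtop_la_SOURCES =
libtop_la_LIBADD = sub/libsub.la
# Dummy C++ source: never actually built, but it tells Automake to link
# with the C++ linker.  tags/ctags/cscope still list it as a dependency.
nodist_EXTRA_libtop_la_SOURCES = dummy.cxx
```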


Regards,
Robert



Fine-grained install control

2006-05-03 Thread Robert Lowe

Hi,

I have a small project that needs to install executables
under /usr/local//, but the manpages under
/usr/local/man.  If I use AC_PREFIX_DEFAULT in configure.ac
to point to /usr/local/, how can I override it in
a Makefile.am for manpages?  I'd like the default behaviour
not to require command line options to configure.  I'm
thinking there is a right way and a wrong way to do this, and
nothing jumped out at me from the manual.
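
One possible shape (a sketch, not necessarily the "right way" being asked about): keep the default prefix and steer each kind of file with its own directory variable in Makefile.am, letting manpages fall through to the standard $(mandir):

```makefile
# Install programs under a package-specific subdir of $(prefix),
# but manpages under the standard man tree (names are illustrative).
mybindir = $(prefix)/mypackage
mybin_PROGRAMS = mytool
man_MANS = mytool.1        # goes to $(mandir), e.g. /usr/local/man
```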

TIA,
Robert




Disable implicit RCS rule of GNU Make

2006-06-19 Thread Robert Homann
re a better solution that works with all Make
implementations without the help of Autoconf, which is what I would prefer
(because we include Automake snippets in Makefile.am files for this in
multiple projects and wouldn't need to touch our configure.ac files then)?

Sorry for putting short questions into long words... :)

bye, Robert

-- 
Windows is not the answer.
Windows is the question.
The answer is "No".




Re: Weird problem with LIBADD

2006-10-06 Thread Robert Boehne

   Ralf Wildenhues wrote:

* Sylvestre Ledru wrote on Fri, Oct 06, 2006 at 04:24:14PM CEST:
  

Show how you create this library:
  $(top_builddir)/modules/signal_processing/libsignal_processing.la
(i.e., the command line used to create it).  It looks like it depends on
libcore.la (which would be a circular dependency).
  

Pff, OK, you are right. It was my fault. There are many cyclic deps in
the libraries of the software that I am building and it can be quite
tiresome...
I guess it is not very easy for you to detect this kind of thing, but if
you could it would be great (the error message is not explicit).


Sounds like a good idea to try this at least.  I added this to Libtool
TODO list: [1]http://wiki.azazil.net/GnuLibtoolProject/RoadMap

Cheers,
Ralf

The best thing to do with circular dependencies is to untangle them
so they aren't circular anymore.  There are major problems
with trying to maintain a set of libraries that have circular dependencies.
Portability is one issue; another is the fact that you'll get problems
like the one in your original post.
I would think that most software engineers
(as opposed to physicists - *wink*) would liken supporting
circular library dependencies to designing an automobile that still
ran even when large amounts of tar were put in the gas tank.
It can be done (at great expense), but wouldn't it be better
not to put the tar in the gas tank in the first place?
The best way to fix this is to organize your software
into separate libraries that each have a particular purpose,
with a "pyramid" structure: more low-level libraries at the
bottom, and fewer, higher-level libraries at the top.
If you can't get the time to do it right, you can cheat by
merging all the libraries in a circular dependency loop
into a single library.  Another potential method of cheating
is to wait until runtime to load them (i.e. use dlopen).
HTH,
Robert Boehne

References

   1. http://wiki.azazil.net/GnuLibtoolProject/RoadMap


Re: Automake violations of the gnu coding conventions

2007-06-18 Thread Robert Collins
On Mon, 2007-06-18 at 17:27 -0700, K. Richard Pixley wrote:
> 
> My question today is... is there any hope of bringing automake
> generated 
> Makefiles back into line with the GNU coding standards so that these 
> applications will work once again? 

Use AM_MAINTAINER_MODE in your package; this will disable the rules to
rebuild Makefile.in etc unless --enable-maintainer-mode is supplied by
the user.

-Rob
-- 
GPG key available at: .


signature.asc
Description: This is a digitally signed message part


Re: Strictness

2007-08-12 Thread Robert Collins
On Sat, 2007-08-11 at 22:06 +0200, Carl Fürstenberg wrote:
> On 8/11/07, Noah Slater <[EMAIL PROTECTED]> wrote:
> > > I think you misunderstanding me, it's the generation if the changelog
> > > that will take too long time.
> >
> > Well, yes - what else could I have understood from:
> >
> > > That not an optimal option, as it's illogical to store those files in the 
> > > svn.
> >
> > How long does it take to generate?
> >
> 
> it all depends on subjects connection and computer power, but for a
> normal person, perhaps 10 min.
> 
> I believe the ChangeLog file is a relic from the past when the
> developers made one commit per week generally, it doesn't really fit
> when there can be hundreds of commits per day.

I disagree. In a centralised VCS, sure, you can scale to 100s of commits
a day. But in a distributed VCS (e.g. bzr, git, hg, monotone) you tend
to get 100s of commits on branches, and a much smaller number of branch
merges occurring, branch merges being the point at which code review is
done, regression tests run, etc. And it's entirely appropriate to create
a human-meaningful summary of the aggregate work done on that branch.

-Rob



Re: Strictness

2007-08-12 Thread Robert Collins
On Sun, 2007-08-12 at 23:40 +0100, Noah Slater wrote:
> > I disagree. In a centralised VCS sure, you can scale to 100's of commits
> > a day - but in a distributed VCS - e.g. bzr, git, hg, monotone ... you
> > tend to get 100's of commits on branches, and a much smaller number of
> > branch merges occuring - branch merges being the point at which code
> > review is done, regression tests run etc. And its entirely appropriate
> > to create a human meaningful summary of the aggregate work done on that
> > branch.
> 
> Actually - this argument doesn't support your conclusion.
> 
> In Subversion (as an example) if you were merging branch A to the
> trunk after 10 commits you would have one revision entry for the merge
> - not the aggregate of all 10. So in this instance the commit message
> would be the description of the merge and the ChangeLog would (like I
> previously stated) make sense copying this exactly...

Well, in other VCSs the merged revisions are preserved, so 'log' shows
you the merge revision and the log for the other revisions. There's a
certain amount of repetition if you pull everything across, so what I've
seen a number of projects do is write a fairly trivial commit message
and have the ChangeLog updated by the merge.

> ... which once again begs the question - why bother?

I think this boils down to 'do you think there is any disconnect between
commits to the VCS and telling humans what's happened from release to
release'.

I think there is.
-Rob



Re: Automatically Handling Tools that Produce Many Outputs

2007-12-10 Thread Robert Collins

On Mon, 2007-12-10 at 21:51 +, Olly Betts wrote:
> 
> I can't think of a way to easily dig out statistically useful data
> from a VCS or Google code search on how often it happens either to me
> or
> generally.  But I mainly offered it as a more concrete example of the
> sort of issues I had in mind.

One particularly common multiple-output case is generating the
dependencies used during compilation:

gcc -MD -c foo.c -o foo.o
-> foo.o
-> foo.d

:)

-Rob


Re: Modifying CFLAGS for 'make distcheck'

2008-02-11 Thread Robert Collins

On Sat, 2008-02-09 at 14:52 -0600, Bob Friesenhahn wrote:
> On Sat, 9 Feb 2008, Ralf Wildenhues wrote:
> >
> > If *that* were still a concern for a compression tool (as opposed to
> > various vendor `tar' programs), then heck it should not be promoted at
> > all for wider use.  No, I don't think each package using Automake should
> > be turned into a regression test suite for lzma.  That's more fear,
> 
> Most of the compressors offer an option to test the compressed file. 
> For lzma, gunzip, and bunzip2, that is provided by the -t option, and 
> for zip, it is the -T option.  It is probably not a bad idea to at 
> least verify that the same tool thinks that the package can be 
> properly decompressed.

might be simpler to layer this:

dist-tar to get a tar
dist-$compressor builds a compressed version from that tar.

You can test each compressor has compressed correctly by decompressing
to a temp file and checking cmp $temptar $origtar
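
The layered check can be sketched as a few shell commands (gzip stands in for any compressor; the file names are illustrative, and a one-line file stands in for the real tarball):

```shell
# Build a stand-in tarball, compress it, then verify the compressor's
# output round-trips by decompressing and comparing with the original.
set -e
tmpdir=$(mktemp -d)
printf 'pretend tarball contents\n' > "$tmpdir/dist.tar"
gzip -c "$tmpdir/dist.tar" > "$tmpdir/dist.tar.gz"
gunzip -c "$tmpdir/dist.tar.gz" > "$tmpdir/roundtrip.tar"
# cmp is silent and returns 0 when the two files are byte-identical
if cmp -s "$tmpdir/dist.tar" "$tmpdir/roundtrip.tar"; then
  result=ok
else
  result=corrupt
fi
rm -rf "$tmpdir"
```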

-Rob


Using ylwrap in parallel builds fails sometimes

2009-02-18 Thread Robert Homann
Hi all!

I have some trouble building a yacc parser with Automake (using
ylwrap) in a parallel build. Usually, make -j runs fine on my project
with no errors, but it fails occasionally when trying to build the
parser. Here is a stripped down test case that demonstrates what I'm
doing (using Autoconf 2.63, Automake 1.10.2):

===
configure.ac:
---
AC_PREREQ([2.63])
AC_INIT([testing],[0.1],[d...@not.exist])
AM_INIT_AUTOMAKE([-Wall foreign])
AC_CONFIG_SRCDIR([parser.y])
AC_CONFIG_HEADERS([config.h])
AC_PROG_CC
AM_PROG_LEX
AC_PROG_YACC
AC_PROG_RANLIB
YFLAGS="${YFLAGS} -d"
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
===

===
Makefile.am:
---
noinst_LIBRARIES=libparser.a
libparser_a_SOURCES=parser.y scanner.l
===

===
parser.y
---
%{
#ifdef HAVE_CONFIG_H
#include <config.h>
#endif /* HAVE_CONFIG_H */
int yylex (void);
void yyerror (char const *);
%}
%token NUM
%token STRING
%token OP
%%
input: NUM
 ;
===

===
scanner.l
---
%{
#ifdef HAVE_CONFIG_H
#include <config.h>
#endif /* HAVE_CONFIG_H */
#include <stdio.h>
#include "parser.h"
%}
%%
[0-9]+  return NUM;
%%
int yyerror(const char *err) { fprintf(stderr,"Error.\n"); return -1;
}
===


With only these four files in some directory, execute the following
commands:

$ autoreconf -i
configure.ac:3: installing `./install-sh'
configure.ac:3: installing `./missing'
Makefile.am: installing `./depcomp'
configure.ac: installing `./ylwrap'
$ ./configure
...
$ STOP=0; while test $STOP -eq 0; do make clean && rm -f parser.[ch] scanner.c 
&& make -j || STOP=1; done

The last line cleans up the project and performs a parallel build
again and again until it fails -- and it does fail after a couple of
runs, sometimes straight after the first one, sometimes after 10 or
even more, like this:

[...]
/bin/bash ./ylwrap parser.y y.tab.c parser.c y.tab.h parser.h y.output 
parser.output -- bison -y -d
/bin/bash ./ylwrap scanner.l lex.yy.c scanner.c -- flex
gcc -DHAVE_CONFIG_H -I. -g -O2 -MT scanner.o -MD -MP -MF .deps/scanner.Tpo 
-c -o scanner.o scanner.c
scanner.l:6:20: error: parser.h: No such file or directory
updating parser.h
gcc -DHAVE_CONFIG_H -I. -g -O2 -MT parser.o -MD -MP -MF .deps/parser.Tpo -c 
-o parser.o parser.c
scanner.l: In function `yylex':
scanner.l:9: error: `NUM' undeclared (first use in this function)
scanner.l:9: error: (Each undeclared identifier is reported only once
scanner.l:9: error: for each function it appears in.)
make[1]: *** [scanner.o] Error 1
[...]

The file parser.h should have been generated by ylwrap/bison before
compiling scanner.c, but apparently this is not always the case.

I can reproduce this on Linux/amd64 (Ubuntu 8.10, dual core) and on
Solaris 10/i386 (dual core), using GNU Make 3.81 on both machines. The
problem seems to go away, however, when using pmake -j 4 (NetBSD make)
on my Linux machine.

Now, did I forget to add some additional rules to Makefile.am? Or do
you think I have hit a bug in ylwrap or in GNU Make?

Best regards,
Robert Homann





Re: Using ylwrap in parallel builds fails sometimes

2009-02-23 Thread Robert Homann
On Thu, 19 Feb 2009, Robert Homann wrote:

Hi again!

> I have some trouble building a yacc parser with Automake (using
> ylwrap) in a parallel build. Usually, make -j runs fine on my project

Replying to myself, adding the dependency

scanner.c: parser.c

to Makefile.am works fine (thanks to Ruediger Ranft for the suggestion).


As a side note, I looked through the Automake manual once again and found
another suggestion there (section 8.8, Yacc and Lex support), which told
me to add

BUILT_SOURCES = parser.h

to Makefile.am. This, however, did not work for me (make doesn't know how
to build parser.h). Putting parser.c into BUILT_SOURCES instead was fine
again.
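
Putting the two observations together, the Makefile.am that survived make -j looked essentially like this (a sketch of the fixes described above; either line alone also worked here):

```makefile
noinst_LIBRARIES = libparser.a
libparser_a_SOURCES = parser.y scanner.l
# parser.c (not parser.h) is what make knows how to build via ylwrap
BUILT_SOURCES = parser.c
# explicit ordering so bison output exists before flex output is compiled
scanner.c: parser.c
```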

Best regards,
Robert Homann





Dealing with .asm files

2009-04-01 Thread Robert Homann
Hi!

Recently, I worked on a not-yet-autoconfiscated project that contains
files with .asm extensions. These files should be treated just like .s
files by Automake, but Automake doesn't know the .asm extension, urging me
to duplicate the rules for .s files in this project (the maintainer would
like to keep the file names as they are).

Duplicating rules seemed all wrong to me, so I tried adding '.asm' right
after '.s' to the corresponding register_language() call in automake
instead, and it just worked then.

Now, would it hurt to add this extension to regular Automake?

Best regards,
Robert Homann




Re: rebuilding following a change in prefix?

2009-05-07 Thread Robert Collins
On Fri, 2009-05-08 at 06:52 +0200, Jan Engelhardt wrote:


> Well, automake (unfortunately?) does not currently issue a recompile
> when the compiler command changed.
> It would be really cool to have that, though.

Write the compiler command to a file (stamp-compiler). Make things
depend on that file ;).

When determining compiler, do the temp, if_not_changed_leave_alone
dance.
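
The stamp idea can be sketched in plain GNU make terms (illustrative only, not something Automake generates; this is the classic "update the stamp only if the command text changed" dance):

```makefile
# stamp-compiler is remade every run, but its mtime changes only when the
# compiler command text changes, so objects rebuild exactly then.
stamp-compiler: FORCE
	@echo '$(CC) $(CFLAGS)' > stamp-compiler.tmp
	@if cmp -s stamp-compiler.tmp stamp-compiler; then \
	  rm -f stamp-compiler.tmp; \
	else \
	  mv stamp-compiler.tmp stamp-compiler; \
	fi
FORCE:

%.o: %.c stamp-compiler
	$(CC) $(CFLAGS) -c -o $@ $<
```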

-Rob





Re: invoke pkg-config with --static

2009-05-16 Thread Robert Collins
On Sat, 2009-05-16 at 19:04 -0500, Bob Friesenhahn wrote:
> On Sat, 16 May 2009, Lorenzo Bettini wrote:
> 
> > when ./configure is run with --disable-shared, is there a way to invoke the 
> > pkg-config macro with --static (so that it does not select private 
> > libraries 
> > in the .pc file)?
> 
> It seems that LT_INIT has been executed, the shell environment 
> variables enable_shared and enable_static are set to 'yes' if 
> shared and/or static libraries will be built.  You can test these 
> environment variables to determine the parameter to pass to 
> pkg-config.

But note that private libraries are likely needed to link against when
creating a static build; it's only for shared libraries that the runtime
linker can resolve missing dependencies transitively for you [and that
only on some platforms, like glibc Linux].

-Rob




Re: My project can't use `silent-rules'

2009-05-17 Thread Robert Collins
On Sun, 2009-05-17 at 15:43 -0500, Bob Friesenhahn wrote:

> The reason why my package can not use AC_INIT is that the package 
> version information is (often) computed by shell script code based on 
> the last entry in the project ChangeLog or other information.  It is 
> (apparently) not possible for user-provided script code to be executed 
> prior to AC_INIT and it is not clear if AC_INIT would allow passing a 
> shell variable in order to obtain the package version value.

This isn't necessarily an answer, but it's a step towards one. I've
recently started using the following consistently in my projects - I got
tired of repeating version numbers as hard-coded strings in multiple
places.

AC_DEFUN([FOO_MAJOR_VERSION], [0])
AC_DEFUN([FOO_MINOR_VERSION], [0])
AC_DEFUN([FOO_MICRO_VERSION], [1])
AC_DEFUN([FOO_VERSION],
[FOO_MAJOR_VERSION.FOO_MINOR_VERSION.FOO_MICRO_VERSION])
AC_PREREQ([2.59])
AC_INIT([libfoo], [FOO_VERSION], [liboo-de...@example.com])
AC_CONFIG_SRCDIR([foo/foo.h.in])
AM_INIT_AUTOMAKE([-Wall -Werror foreign subdir-objects])
AC_CONFIG_MACRO_DIR([m4])
[FOO_MAJOR_VERSION]=FOO_MAJOR_VERSION
[FOO_MINOR_VERSION]=FOO_MINOR_VERSION
[FOO_MICRO_VERSION]=FOO_MICRO_VERSION
[FOO_VERSION]=FOO_VERSION
AC_SUBST([FOO_MAJOR_VERSION])
AC_SUBST([FOO_MINOR_VERSION])
AC_SUBST([FOO_MICRO_VERSION])
AC_SUBST([FOO_VERSION])

It's pretty verbose; when I get some more time to fiddle with build
systems rather than writing code, I'm going to look at reducing the
redundancy - it should be one line for each FOO_VARIABLE.

Anyhow, the key thing is that defining FOO_VARIABLE as a function lets
you do pretty much what you want.

-Rob



Re: invoke pkg-config with --static

2009-05-23 Thread Robert Collins
On Sat, 2009-05-23 at 18:18 +0200, Lorenzo Bettini wrote:
> Ralf Wildenhues wrote:


> > Of course, as soon as you propose your software for packaging at
> > debian.org, they will count not using .Private as bug ... ;-)
> 
> uh!  Good to know that!  Thanks :-)

This is because when you link against something that isn't directly
used, a library reference is made. This causes difficulty on upgrades to
the library-that-isn't-directly-used; and it can be very annoying to
maintainers of distributions.

-Rob





distcheck and uninstall

2009-09-18 Thread Robert Collins
It would be nice if there was an option to tell automake not to (do
'uninstall' as part of distcheck | require that uninstall leaves no
files behind)

distcheck is very useful, it catches many distribution related bugs like
missing EXTRA_DIST and so on.

However, uninstall as a target is much less valuable - most users use
external packaging systems like rpm/deb. Further to that it can be time
consuming when integrating with non automake build systems such as
MakeMaker (via a Makefile.PL and -local & -hook rules such as those
appended) to get uninstall working at all - when they neither support
uninstall nor trivially expose appropriate hooks to implement it for
them.

As it stands, making distcheck work in cross-language packages where
external build tool integration is required, is a real nuisance because
of the requirement for uninstall to both work & not leave anything
behind.

There are two projects I'm involved in that this is causing some
nuisance; one is squid, where our uninstall target deliberately leaves
behind the users configuration file - uninstall vs purge, in debian
packaging concepts. The second is subunit, where a contributor has
provided some perl bindings and tools, but MakeMaker doesn't generate an
uninstall target - and even if it did files would likely be [correctly]
left behind, because a perl config file is edited as part of install.

[Please don't let the specific examples become a bikeshed :)]

-Rob
===
all-local: perl/Makefile
	$(MAKE) -C perl all

check-local: perl/Makefile
	$(MAKE) -C perl check

clean-local:
	rm -f perl/Makefile

# Remove perl dir for VPATH builds.
distclean-local:
	-rmdir perl > /dev/null
	-rm perl/Makefile.PL > /dev/null

install-exec-local: perl/Makefile
	$(MAKE) -C perl install

mostlyclean-local:
	rm -rf perl/blib
	rm -rf perl/pm_to_blib

uninstall-local:
	-# there is no uninstall target! $(MAKE) -C perl uninstall

perl/Makefile: perl/Makefile.PL
	mkdir -p perl
	cd perl && perl Makefile.PL
	-rm perl/Makefile.old > /dev/null





Re: distcheck and uninstall

2009-09-18 Thread Robert Collins
On Sat, 2009-09-19 at 08:24 +0200, Ralf Wildenhues wrote:
> Hello Robert,
> 
> * Robert Collins wrote on Sat, Sep 19, 2009 at 06:16:25AM CEST:
> > It would be nice if there was an option to tell automake not to (do
> > 'uninstall' as part of distcheck | require that uninstall leaves no
> > files behind)
> 
> You can do the latter by overriding either the distuninstallcheck rule
> or the distuninstallcheck_listfiles command; see the lower half of
> <http://www.gnu.org/software/automake/manual/html_node/Checking-the-Distribution.html>.
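
For the archives, an override of the kind pointed at above looks roughly like this in the top-level Makefile.am (the grep pattern is illustrative; see that manual section for the exact variables):

```makefile
# Ignore the deliberately-kept config file when distcheck verifies
# that uninstall leaves nothing behind.
distuninstallcheck_listfiles = find . -type f -print | grep -v 'etc/squid\.conf'
```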

Sweet, thanks. Do you recall when that target was added? [So I can make
sure that that version of automake is my required minimum..]

-Rob




Re: distcheck and uninstall

2009-09-18 Thread Robert Collins
On Sat, 2009-09-19 at 08:33 +0200, Ralf Wildenhues wrote:

> No, I don't, but automake/NEWS indicates that it should've been around
> 1.7, and 'git show Release-1-7:lib/am/distdir.am' looks good, too.

Thanks again,
Rob




--gnits, AC_INIT and VERSION

2009-09-23 Thread Robert Collins

There was discussion about getting version numbers from VCS recently;
I've done a slightly different thing for a while now:
AC_DEFUN([SUBUNIT_MAJOR_VERSION], [0])
AC_DEFUN([SUBUNIT_MINOR_VERSION], [0])
AC_DEFUN([SUBUNIT_MICRO_VERSION], [2])
AC_DEFUN([SUBUNIT_VERSION],
[SUBUNIT_MAJOR_VERSION.SUBUNIT_MINOR_VERSION.SUBUNIT_MICRO_VERSION])
AC_PREREQ([2.59])
AC_INIT([subunit], [SUBUNIT_VERSION], [contact...])
AC_SUBST([SUBUNIT_MAJOR_VERSION]) ...

This works well, until gnits is added to AM_INIT_AUTOMAKE.

It looks like it expects a literal, not a macro, in AC_INIT?

It seems to me that for gnits to work with dynamic (whether from VCS or
otherwise) version numbers, it would need to actually get the expanded
form, or perhaps more sensibly provide a configure time check that the
version calculation *results* in a good version.

Something like:

all: am-version-check
.PHONY: am-version-check

am-version-check::


-Rob




Re: place automake files separately from source files

2009-09-26 Thread Robert Collins


> This way people can build using the GNU automake system if they so desires
> and I do not overwrite the original non-automake Makefiles.   Then how can I
> specify the sources files in source1,c, etc.  Keep in mind that the original
> source tree may be complex with deep subdirectories of its own.

You can do this using relative paths to the sources in your Makefile.am
and configure.ac.

However, I suggest you either adopt automake or don't. Keeping two build
systems means you will have twice as many places to record changes to
build dependencies. Why do you want folk to be able to use the old build
system?

-Rob




Re: how to detect broken install-sh?

2009-09-27 Thread Robert Collins

Brian Gough wrote:
> Hi,
> 
> I'd like to hear thoughts about the best way to detect a broken install-sh.
..
> Maybe it would be good to have a check for problems with install-sh.

I think that is a waste of cycles for every project except Automake :).

> I can see a couple of ways this could be done:
> 
>   - AC_PROG_INSTALL could confirm that the install program it finds
> works in the way it will be used in "make install" and give an
> error otherwise.

Perhaps it already does for the system install; if so, extending that to
the bundled one isn't a great stretch.

> What would be the best way?  Do you think this might cause other
> problems?

I suggest dropping install-sh completely except for the coreutils
package. coreutils is very portable, so it's not unreasonable to require
that it is installed in order to locally build and install other packages.
coreutils of course cannot depend on itself being installed. A more
conservative fix would be to keep install-sh for the transitive closure
of coreutils' build dependencies (but given that one can cross-compile, I
think this is also unnecessary).

-Rob




Re: how to detect broken install-sh?

2009-09-27 Thread Robert Collins

Ralf Wildenhues wrote:
>>> What would be the best way?  Do you think this might cause other
>>> problems?
>> I suggest dropping install-sh completely except for the coreutils
>> package.
> 
> Expecting GNU coreutils to be installed on each system is unreasonable.
> Other systems have quite well-functioning tools, too.  Autotools
> generally strive to produce packages that work well on all kinds of
> Posix and almost-Posix systems.
> 
> Cheers,
> Ralf

So I don't expect coreutils to be installed; I'm saying *packages other
than coreutils* should *depend on a working /usr/bin/install*.

That's quite a different thing :)

-Rob




Re: how to detect broken install-sh?

2009-09-27 Thread Robert Collins
On Sun, 2009-09-27 at 16:00 -0500, Bob Friesenhahn wrote:
> On Sun, 27 Sep 2009, Robert Collins wrote:
> >
> > I suggest dropping install-sh completely except for the coreutils
> > package. coreutils is very portable, so it's not unreasonable to require
> > that it is installed to locally build and install other packages.
> > coreutils of course cannot depend on itself being installed. A more
> 
> This seems like a pretty unreasonable requirement to me.  The 
> install-sh strategy has been working for quite a long time with hardly 
> any complaint until today.

The landscape has changed though, and I suspect that if we gather stats
about this we'll see that install-sh is dead weight for most packages
nearly all of the time.

It's true that it is not a lot of dead weight, but at some point we
should be raising the bar - ever so slightly - on what we bundle into
the tarball. At one point we never required a Make implementation that
does includes; now we do [for dependency tracking] - and sure, we degrade
gracefully.
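The "Make that does includes" requirement comes from how generated
dependency fragments are pulled in. Roughly, it looks like this (a
simplified sketch of the pattern, not a verbatim excerpt of an
automake-generated Makefile):

```makefile
# Each object's dependency list lives in a fragment that the compiler
# writes as a side effect of compilation; the Makefile must be able to
# 'include' those fragments for dependency tracking to work.
DEPDIR = .deps
include $(DEPDIR)/main.Po

main.o: main.c
	$(CC) -MT $@ -MD -MP -MF $(DEPDIR)/main.Po -c -o $@ main.c
```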

All I'm suggesting is that the time has come to let folk on the small
proportion of machines without a sufficiently useful install, build it -
exactly as they have to build any other dependency they are lacking.

BTW, on Solaris, /usr/ucb/install is apparently the right thing to use,
and has been there since SunOS 5.10 ('Last Revised 14 Sep 1992', per its
man page) :).

-Rob




Re: how to detect broken install-sh?

2009-09-27 Thread Robert Collins
On Sun, 2009-09-27 at 18:59 -0500, Bob Friesenhahn wrote:
> On Mon, 28 Sep 2009, Robert Collins wrote:
> >
> > The landscape has changed though, and I suspect that if we gather stats
> > about this we'll see that install-sh is dead weight for most packages
> > nearly all of the time.
> 
> Maybe the landscape has changed for you, but not necessarily for 
> everyone.  Installing "coreutils" could be quite a burden and the 
> tools might conflict with the OS-provided equivalents.

I'm not a strong enough believer in the Copenhagen school to think that
I'm in a different universe. I'll agree that the distribution of OSs is
different for each open source project, but we'd need data for either of
us to reason effectively about this. As for conflicting, there are
multiple well-established places to install things that won't
conflict - /opt, /usr/local, ~/local - plus you can just make one up and
put it in your path.

> > Its true that it is not a lot of dead weight, but at some point we
> > should be raising the bar - ever so slightly - on what we bundle into
> > the tarball. At one point we never required a Make implementation that
> > does includes, now we do [for dependency tracking] - and sure we degrade
> > well.
> 
> The make implementation that does includes is only for developers of 
> the package.  It is not necessary to have a fancy make to build the 
> software.

It is if you want dependency tracking [and yes, one-time builds
shouldn't need that, unless they ship with an unsettled graph]. What
fraction of your users do all of the following:
 - build their own binaries
 - do so with /no/ modifications to the code
 - on a platform with no suitable install program

That's the key number - the amount of benefit that install-sh gives you.

> > All I'm suggesting is that the time has come to let folk on the small
> > proportion of machines without a sufficiently useful install, build it -
> > exactly as they have to build any other dependency they are lacking.
> 
> What other dependency might they be lacking?  My own package is quite 
> large but all of the dependencies are optional.

Let's start at the ridiculous and propose that they are missing a C
compiler.

-Rob




Re: how to detect broken install-sh?

2009-09-27 Thread Robert Collins
On Sun, 2009-09-27 at 20:38 -0500, Bob Friesenhahn wrote:

> > Thats the key number - the amount of benefit that install-sh gives you.
> 
> > This violates a core principle of GNU in that "benefits" should be for 
> > the benefit of the recipients of the software rather than for the 
> > developers of it.  GNU is a communistic/Marxist type model rather 
> than a capitalistic model.  In the old days, the benefits were for the 
> developers and the users had to muddle through a difficult procedure 
> for every package that they installed.

I meant the benefit to the community, or even to the folk that end up
needing install-sh. I think portability is a great thing, but I also
think repeatedly solving the same problem isn't, particularly when bug
fixes exist :). Anyhow, we're way off the original topic here, and I've
achieved my goal - to put my toe in the water about this sort of
change ;).

> To be sure, I will be quite supportive of a build framework if it is 
> based on a small package which is easily installed, and the build no 
> longer needs to be cobbled together with a mishmash of Unix utilities. 
> Of course this build environment needs to be self-contained, well 
> supported, and would probably take five or seven years to fully 
> develop.  There have been a number of independent attempts in this 
> direction but it seems that none has come close to the popularity of 
> autotools.

All the ones I've seen have been 90% (or less) solutions and have often
[but not always] decided to replace Make with something less powerful:
a mistake, IMO. I'm fairly sure I know what it would take to do a 100%
solution, but it's daunting ;). I'm thinking primarily of cmake, waf and
scons here, with cook, bake and others coming in as less well known
stabs in the same direction.

-Rob




Re: how to detect broken install-sh?

2009-09-28 Thread Robert Collins
On Mon, 2009-09-28 at 08:56 +0200, Ralf Wildenhues wrote:
> 
> You're much better off arguing that packages update to Autoconf 2.64,
> in many cases the configure script will shrink by more than 15K over
> the one generated by 2.63 (and it'll be a bit faster, too). 

Nice! - and I think they should - the two positions are very compatible.

-Rob




Re: Non-recursive automake

2009-10-17 Thread Robert Collins
On Sat, 2009-10-17 at 20:09 -0500, Bob Friesenhahn wrote:
> 
> I complained about this perhaps five years ago since it is the most 
> annoying issue related to non-recursive build.  There was some 
> discussion on this list at that time but nothing was done to make 
> things better.
> 
> It seems that a problem is that much of the Makefile.am file is simply
> copied to the output Makefile.in and so these parts would need to be 
> re-written rather than copied.  The good news is that perl is good at 
> re-writing text. 

The way I tackled this in my proof of concept in 2001 was via a
rewriting include:

http://sources.redhat.com/ml/automake/2001-08/msg00112.html

This added a new directive, 'subdir_include', which does an include but
rewrites the paths in the make/automake rules of the included fragment
so they are relative to the including Makefile.am.

e.g. subdir_include foo/Makefile.am
would prepend 'foo' to the paths in foo/Makefile.am.
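To illustrate with a made-up fragment (the file names and rule here are
invented; the rewriting shown is just the path-prepending described
above, not an excerpt from the patch):

```makefile
# foo/Makefile.am, as written:
#   noinst_DATA = generated.txt
#   generated.txt: input.txt
#   	cp input.txt generated.txt
#
# What the top-level Makefile.am effectively sees after
# 'subdir_include foo/Makefile.am' (paths prefixed with 'foo/'):
noinst_DATA = foo/generated.txt
foo/generated.txt: foo/input.txt
	cp foo/input.txt foo/generated.txt
```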

Automake's core has probably changed so much that the patch is not worth
even reading, but the concept worked tolerably well ;).

-Rob




Re: Non-recursive automake

2009-10-18 Thread Robert Collins
On Sun, 2009-10-18 at 08:39 +0200, Ralf Wildenhues wrote:

> > http://sources.redhat.com/ml/automake/2001-08/msg00112.html
> > 
> > This added a new directive 'subdir_include' which does an include but
> > adjusts all the paths in the make/automake rules in the included
> > fragment to the relative path to the included rules.
> 
> The devil is in the details.  What about -I paths in *_CPPFLAGS?  What
> with substituted variables?  What about rewritten variable names, such
> as: libfoo_la_SOURCES becomes sub_libfoo_la_SOURCES, and what if the
> user references $(libfoo_la_SOURCES) elsewhere, say, in
> libbar_la_SOURCES?
> 
> No.  Search for several prior discussions on the Automake lists for why
> this cannot be done safely without highly altering the set of allowed
> semantics, and things the user can expect.

I'll take it on faith; I must have missed those discussions (there was a
period when I wasn't receiving forwarded mail from my old cygwin address,
before I resubscribed). Regardless, if something usable is added, +1.

-Rob



