Bruno Haible <[email protected]> writes:

>> What is the status of the Python gnulib tool?  I'm not sure how far
>> behind it is compared to the shell script but it seems like it would
>> be much faster.  I would say more maintainable but I might just be
>> bad at writing shell scripts. :)
>
> Yes, it's the hope that it will be faster that is the main motivation
> behind the Python rewrite.
Orthogonal to a rewrite in python: is it possible to design a reliable
caching mechanism? Something similar to CONFIG_SITE for autoconf?
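For reference, CONFIG_SITE works by having every autoconf-generated
configure script source a user-supplied shell fragment, which can preset
ac_cv_* cache variables so the corresponding tests are skipped entirely.
A minimal sketch (file name and variables chosen for illustration):

```shell
# ~/config.site -- sourced by ./configure when CONFIG_SITE points at it.
# Presetting ac_cv_* variables skips the corresponding configure tests.
ac_cv_header_stdlib_h=yes
ac_cv_func_malloc_0_nonnull=yes
```

used as:

```shell
CONFIG_SITE=$HOME/config.site ./configure
```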
I find that ./gnulib-tool takes a long time, and 95% of the time I use
it, it ends up doing exactly the same thing as it did the last time I
ran it: copying a set of possibly patched files out of the gnulib
directory.
How about logic like this:
. "$GNULIB_SITE"
if test -d "$gnulib_cache_dir"; then
  rsync -av "$gnulib_cache_dir"/ .
elif test -n "$gnulib_cache_dir"; then
  savedir=`mktemp -d`
  rsync -av . "$savedir"
  # do whatever gnulib normally is doing
  # compare . with $savedir, saving a copy of each modified
  # file into $gnulib_cache_dir
fi
then I could put something like this into a $GNULIB_SITE script:
if test -z "$gnulib_cache_dir"; then
  hash=`echo "$PWD" | md5sum | cut -d' ' -f1`
  my_cache_dir=$HOME/.cache/gnulib.site
  gnulib_cache_dir=$my_cache_dir/cache.`basename "$PWD"`.$hash
  test -d "$gnulib_cache_dir" || mkdir -p "$gnulib_cache_dir"
fi
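The per-project key above is just an md5 of the working directory's
path, so distinct checkouts with the same basename still get distinct
cache directories. A quick way to see what the scheme produces (the path
here is only an example):

```shell
#!/bin/sh
# Derive the same cache-directory name the site script above would use
# for a given project directory; /home/user/myproject is illustrative.
dir=/home/user/myproject
hash=`echo "$dir" | md5sum | cut -d' ' -f1`
echo "cache.`basename "$dir"`.$hash"
```

The output is `cache.<basename>.<32-hex-digit-hash>`, stable across runs
for the same path.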
/Simon