On Tue, 25 May 2004, Christopher Faylor wrote:
> On Tue, May 25, 2004 at 06:55:35PM -0400, Igor Pechtchanski wrote:
> >Cygwin has a 32k command-line length limit.
>
> Cygwin doesn't, AFAIK, have any command-line length limit other than the
> amount of memory available to store the command line. Windows has a
> limit but Cygwin shouldn't impose one.
[ Accidentally sent the same thing directly to Bruce. Sorry Bruce. ]
In addition to the other replies: in general, I've always used the rule of
thumb that it's a bad idea in these sorts of cases to do command-line
globbing at all. It doesn't seem to matter much whether it's echo, ls, or
some other command:
/c> for F in /c/DW/out/NIAID/* ; do ls $F ; done | wc
934 934 36055
/c> echo /c/DW/out/NIAID/* | xargs ls | wc
934 934 36055
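The comparison above can be reproduced on any POSIX system. A minimal,
self-contained sketch (the temp directory and file names are stand-ins for
the real image tree, not from the thread) showing that a per-file loop and a
find | xargs pipeline see the same files, while xargs additionally splits
the argument list into OS-sized batches:

```shell
#!/bin/sh
# Illustrative stand-in for a directory full of image files.
dir=$(mktemp -d)
for i in $(seq 1 200); do : > "$dir/img$i.dat"; done

# Per-file loop: one ls per file; immune to argument-length limits but slow.
loop_count=$(for f in "$dir"/*; do ls "$f"; done | wc -l)

# find | xargs: find streams the names, xargs batches them up to the OS
# limit; -print0/-0 also survives whitespace in filenames.
xargs_count=$(find "$dir" -type f -print0 | xargs -0 ls | wc -l)

echo "$loop_count $xargs_count"
rm -rf "$dir"
```

Both counts come out equal, which is the point of the wc comparison above.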
-----Original Message-----
From: Christopher Faylor
Sent: Tuesday, May 25, 2004 7:10 PM
To: [EMAIL PROTECTED]
Subject: Re: shell cmds crapping
On Tue, May 25, 2004 at 06:55:35PM -0400, Igor Pechtchanski wrote:
>Cygwin has a 32k command-line length limit.
Cygwin doesn't, AFAIK, have any command-line length limit other than the
amount of memory available to store the command line. Windows has a
limit but Cygwin shouldn't impose one.
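For reference (not part of the original exchange), POSIX systems expose the
exec-time argument-space limit directly, so you can check what the local
system actually allows before blaming the shell:

```shell
# Print the argument-space limit that exec() enforces on this system.
# Per Christopher's point above, on Cygwin this reflects the limit the
# underlying Windows side allows rather than one Cygwin imposes itself.
getconf ARG_MAX
```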
On Tue, 25 May 2004, Bruce Dobrin wrote:
> {uname -a
> CYGWIN_NT-5.1 THEODOLITE 1.5.9(0.112/4/2) 2004-03-18 23:05 i686 unknown
> unknown Cygwin
> }
>
> I need to process very large numbers (up to 100,000) of imagefiles. I
> noticed my foreach loops start crapping out when the number of files grows
> near 1500. It feels like a 32bit memory addressing
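The ~1500-file ceiling described above is consistent with glob expansion
overflowing the argument space before the loop body ever runs. A sketch of
the usual workaround (directory name, file count, and the use of ls as the
per-batch command are all illustrative stand-ins): stream the names from
find instead of globbing, and let xargs invoke the command in fixed-size
batches:

```shell
#!/bin/sh
# Stand-in for a large image directory; names are illustrative.
dir=$(mktemp -d)
for i in $(seq 1 3000); do : > "$dir/frame$i.img"; done

# Never expand $dir/* on a command line; stream the names instead.
# -n 500 caps each invocation at 500 arguments, well below any OS limit,
# so this scales to 100,000 files where a foreach over a glob fails.
processed=$(find "$dir" -name '*.img' -print0 | xargs -0 -n 500 ls | wc -l)
echo "processed $processed files"
rm -rf "$dir"
```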