Howdy,

> Hello everyone,
>
> We use the Barracuda Spam appliance (barracudanetworks.com) to filter
> our spam, and their web-based interface is written in Perl. They have
> a form that allows the user to search messages for key words.
> Evidently it stores each message in a file in a directory, and when
> trying to search several hundred thousand messages for a word the
> response back is:
>
>     egrep: argument list too long
>
> It looks like they're using grep via a system command, or the grep
> function in Perl.
>
> Now, to my question: how do others get around the limitations of
> sending stuff to grep?
>
> I know they're probably not aware of it yet, as we just got the
> firmware update the other day. I'm a Perl programmer, so I thought I'd
> try to figure out the solution and send it to them to incorporate into
> the firmware.
>
> Any ideas?
>
> Thanks,
> Kevin
> --
> Kevin Old <[EMAIL PROTECTED]>
>
> --
> To unsubscribe, e-mail: [EMAIL PROTECTED]
> For additional commands, e-mail: [EMAIL PROTECTED]

If I were trying to grep a zillion files at once and the shell wouldn't
let me, I'd probably grep them one at a time. For instance, via backtick
execution you might have:

    # a really bad way to do this, but for example's sake...
    my @matchedfiles = qx(cat `ls /files/` | grep $string);

You could instead do:

    my @matchedfiles;
    for my $file (glob '/files/*') {
        push @matchedfiles, $file if `cat "$file" | grep "$string"`;
    }

(Note the use of glob() rather than `ls /files/`: the backtick version
hands you bare filenames with trailing newlines and no directory prefix,
so the cat would fail.) Then you are only grepping one file at a time
instead of handing grep a list that is too long. Of course, what would
be better still is to use readdir() to list the files and an
open()/grep() combination to search the contents in pure Perl, but the
same principle applies.

Just make sure the Barracuda folks say thanks for fixing their problem :)

HTH,
DMuey
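For the record, the error itself comes from the kernel's limit on the
total size of a command's argument list (ARG_MAX), which the shell hits
when it expands a huge glob onto one grep command line. A common
shell-level fix is to feed the file list to grep through xargs, which
re-invokes grep in batches that fit under the limit. A minimal sketch —
the directory `/tmp/msgdemo` and the sample files here are made up for
illustration, standing in for the appliance's message store:

```shell
# Set up a stand-in message directory (assumption: real messages live
# one-per-file, as Kevin describes).
mkdir -p /tmp/msgdemo
printf 'cheap watches offer\n' > /tmp/msgdemo/0001.msg
printf 'meeting notes\n'       > /tmp/msgdemo/0002.msg

# find streams the file list over a pipe instead of a command line, and
# xargs -0 splits it into grep invocations that stay under ARG_MAX, so
# the number of files no longer matters. -print0/-0 keep odd filenames
# (spaces, newlines) intact; grep -l prints only matching filenames.
find /tmp/msgdemo -type f -print0 | xargs -0 grep -l 'watches'
# prints: /tmp/msgdemo/0001.msg
```

The same idea works from Perl by building the pipeline with system() or
backticks instead of interpolating the whole file list.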