On Wed, Nov 28, 2001 at 11:48:30AM -0500, KEVIN ZEMBOWER wrote:
> find -iname "*.*htm*" -o -iname "*.stm" | xargs egrep -l \
>   "centernet\.jhuccp\.org/cgi-bin/mail2friend|cgi\.jhuccp\.org/cgi-bin/mail2friend" | xargs \
>   perl -pi~ -e"s?http://.*\.jhuccp\.org(/cgi-bin/mail2friend)?\1?g;"

Someone suggested you replace the egrep with a Perl equivalent; you could
also collapse the two -iname tests into a single -path or -regex test in
find.
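
For example, a rough sketch assuming GNU find (whose -iregex matches the
whole path and, by default, uses emacs-style regex syntax):

    find . -type f -iregex '.*\(\..*htm.*\|\.stm\)'

And a Perl stand-in for the egrep -l stage might look something like this:

    find . -iname "*.*htm*" -o -iname "*.stm" | xargs perl -lne \
      'if (m!(centernet|cgi)\.jhuccp\.org/cgi-bin/mail2friend!) { print $ARGV; close ARGV }'

(print $ARGV names the file currently being read; close ARGV skips ahead
to the next file so each name is printed at most once.)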


> It seems to run fine and changes many files, but when I go searching
> for the string that was supposed to be changed, I keep finding more
> files. Many were changed correctly, but some were not.

Are you certain it's not because they were filtered out by your egrep? 
Check all of the files output by running just the find and egrep.  If those
have all been properly changed then your problem lies in the filter; if not,
then it may be an argument limit imposed by your OS (see below).
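
For example (the file name matched.txt here is just an illustration):

    find . -iname "*.*htm*" -o -iname "*.stm" | xargs egrep -l \
      "centernet\.jhuccp\.org/cgi-bin/mail2friend|cgi\.jhuccp\.org/cgi-bin/mail2friend" \
      > matched.txt

Then spot-check the files listed in matched.txt against the ones that
still hold the old URL.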

 
> It strikes me that maybe perl can't take too many arguments at once.
> There are options to the xargs command that allow no more than so many
> arguments at a time to be passed. Is this what's wrong? What should I
> set the number of arguments to?

perl itself can take as many arguments as there is memory to store them.
The real ceiling is your OS's limit on the size of the argument list that
can be passed to any program (ARG_MAX on most Unix systems); I would
suggest using the xargs option you mentioned (-n) to stay under it.
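
For example, capping each perl invocation at 200 file names (200 is an
arbitrary figure; anything comfortably under your system's limit will do):

    find . -iname "*.*htm*" -o -iname "*.stm" | xargs egrep -l \
      "centernet\.jhuccp\.org/cgi-bin/mail2friend|cgi\.jhuccp\.org/cgi-bin/mail2friend" \
      | xargs -n 200 perl -pi~ -e"s?http://.*\.jhuccp\.org(/cgi-bin/mail2friend)?\1?g;"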

 
Michael
--
Administrator                      www.shoebox.net
Programmer, System Administrator   www.gallanttech.com
--
