Just to clarify the command:

find -iname "*.*htm*" -o -iname "*.stm" | \
xargs egrep -l \
"centernet\.jhuccp\.org/cgi-bin/mail2friend|cgi\.jhuccp\.org/cgi-bin/mail2friend" | \
xargs perl -pi~ -e"s?http://.*\.jhuccp\.org(/cgi-bin/mail2friend)?\1?g;"
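
For reference, the switches on that last stage do the following (this is
just standard perlrun behavior):

# -p    : wrap the code in a while (<>) loop and print each line after it runs
# -i~   : edit the files in place, saving each original as file~
# s?...?...?g : uses ? as the delimiter so the slashes in the URL need no escaping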

---------------------

First off, why are you including the egrep statement? If it is to
restrict the edit to files containing one of these lines:

centernet.jhuccp.org/cgi-bin/mail2friend
or
cgi.jhuccp.org/cgi-bin/mail2friend

why not do it in your Perl script and leave out the egrep entirely:

perl -pi~ -e's?http://(?:centernet|cgi)\.jhuccp\.org/cgi-bin/mail2friend?/cgi-bin/mail2friend?g;'
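
With that, the whole job collapses to one pipeline. A sketch, assuming
GNU find/xargs (the -print0/-0 pair keeps filenames with spaces from
confusing xargs):

find . \( -iname "*.*htm*" -o -iname "*.stm" \) -print0 | \
xargs -0 perl -pi~ -e's?http://(?:centernet|cgi)\.jhuccp\.org/cgi-bin/mail2friend?/cgi-bin/mail2friend?g;'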

Also, for Perl one-liners it's usually best to enclose the code in
single quotes, and to use q() or qq() inside the one-liner where you
need single or double quotes, respectively.

Finally, the backreference is unnecessary, since what you're capturing
is just a literal string; you can write the replacement out literally.
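
In other words, these two substitutions are equivalent, and the second
is easier to read:

s?http://.*\.jhuccp\.org(/cgi-bin/mail2friend)?\1?g;                  # capture + backreference
s?http://.*\.jhuccp\.org/cgi-bin/mail2friend?/cgi-bin/mail2friend?g;  # literal replacement

(Strictly speaking, the replacement side wants $1 rather than \1; with
warnings on, perl will tell you \1 is better written as $1 there.)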

Let me know if this helps with your problem.
Luke

> It seems to run fine and changes many files, but when I go searching
> for the string that was supposed to be changed, I keep finding more
> files. Many were changed correctly, but some were not.
>
> It strikes me that maybe perl can't take too many arguments at once.
> There are options to the xargs command that allow no more than so many
> arguments at a time to be passed. Is this what's wrong? What should I
> set the number of arguments to?
>
> Thanks for trying to help me with this.
>
> -Kevin Zembower
>
> -----
> E. Kevin Zembower
> Unix Administrator
> Johns Hopkins University/Center for Communications Programs
> 111 Market Place, Suite 310
> Baltimore, MD  21202
> 410-659-6139
>

