I'm not quite sure what you're trying to do with this... but from what
you've provided, the regular expression you wrote will only remove text
matching "....sub html_base...." within a single line -- and not even the
newline at the end. That's because '-p' reads the currently open file
one line at a time and runs the substitution against each line
separately ('-i' just writes the edits back in place).

It sounds like you've got a subroutine you're trying to remove from some
files:
 sub html_base {
     ...
 }
yes? If so, the fix is to read each entire file into a single string
and run your regexp against that, instead of line by line. One detail:
the character class [\s\S] already matches newlines, so adding the /s
modifier isn't what does the trick -- /s only changes what a bare '.'
matches. The quickest route is Perl's -0777 switch, which slurps each
file whole before the substitution runs:
 perl -0777 -pi -e 's/sub html_base[\s\S]*//' *cgi
Alternatively, a quick-and-dirty perl script can take the fileglob
(*cgi), read each file into one string, and apply the same regexp.
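If you want to convince yourself before touching the real CGI files,
here's a small throwaway check (the demo file name is just for
illustration):

```shell
# Make a throwaway file that ends with the subroutine we want gone.
cat > demo.cgi <<'EOF'
#!/usr/bin/perl
print "kept line\n";
sub html_base {
    print "unwanted\n";
}
EOF

# -0777 slurps the whole file into $_ before the substitution runs,
# so [\s\S]* can eat everything from "sub html_base" to end of file.
perl -0777 -pi -e 's/sub html_base[\s\S]*//' demo.cgi

cat demo.cgi   # only the lines before the subroutine remain
```

The same one-liner with *cgi in place of demo.cgi handles the whole
batch in one shot.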

--Matthew

On Thu, 25 Apr 2002 04:10:58 -0400, Alain wrote:
> Hello all,
> 
> I've the following problem:
> I need to erase about 150 lines (always the same lines) at the end of a
> series of files.
> What I have done in the shell is:
> #perl -pi -e "s/sub html_base[\s\S]*//" *cgi
> 
> But this command only erases one line at a time, and I want to erase
> all the lines in one go. Is there anybody who can help?
> 
> Thanks a lot
> 
> Alain Scieur

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
