re: Unix

 From the command-line, I'm currently running a find command piped to
xargs grep:

 find . -type f -print | xargs egrep "some string to look for"

There is an occasional requirement to run this over a hierarchy of
directories and files numbering in the many thousands.

Run from the command line, it takes a long, long time.  I even re-niced
the command to -10 as root, but it still takes hours.
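
I've also thought about tightening the pipeline itself.  An untested
sketch, assuming GNU find/xargs/grep:

 # -print0/-0 keep filenames with spaces intact, -F does a plain
 # fixed-string match instead of egrep's regex engine, and /dev/null
 # makes grep print filenames even if xargs hands it a single file.
 find . -type f -print0 \
     | xargs -0 grep -F "some string to look for" /dev/null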

Would perl be able to do the search and grep any better than what I am
currently doing?
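
Something like this untested File::Find sketch is roughly what I have
in mind (the pattern and starting directory are just placeholders):

 #!/usr/bin/perl -w
 # Walk the tree in a single process, hoping to skip the startup
 # cost of spawning grep over and over.
 use strict;
 use File::Find;

 find(sub {
     return unless -f $_;               # plain files only
     open my $fh, '<', $_ or return;    # skip unreadable files
     while (<$fh>) {
         print "$File::Find::name:$_" if /some string to look for/;
     }
     close $fh;
 }, '.');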

Ideas, jokes and rants are appreciated...

:-)

deb


-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
          There are 010 types of people in the world:
      those that understand binary, and those that don't.
τΏτ
