>>>>> "JG" == Jim Green <student.northwest...@gmail.com> writes:
JG> this is very nice, don't need to call perl in perl anymore..
JG> perl -pi -e '$_ = "" if /foo/' filename
JG> but what if the file is very large? slurping the file into memory
JG> will be ok?

in the slurp distro there is an article i wrote for perl.com that
covers that issue:

http://search.cpan.org/~uri/File-Slurp-9999.18/extras/slurp_article.pod

but look around at all the files you see - text, source, config, html,
control, etc. almost all of them are under 1MB, and that is nothing in
today's ram of 1GB and up. the issue comes to the fore with large files
like logs, genetics data, database dumps, etc. - you should know when
you are dealing with that type of file and not slurp it. but for maybe
99% of the files out there, you can slurp with no ill effects. the
maximum size you can slurp is really up to you and your ram size - i
can see slurping even tens of MB without too much pain on a decent box.
(a slurp version of your one-liner is sketched at the end of this note.)

JG> or are there any other alternatives you are aware of? or ones you
JG> would dismiss?

for the very large files, you need to stick to line by line or buffered
block reads. not much else you can do. for log files, if you want to
read the lines from the end, File::ReadBackwards (another of my
modules) is the way to go, and it is very popular. (short sketches of
both approaches are below.)
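to make the slurp case concrete, here is a minimal sketch of your
one-liner done with File::Slurp. read_file and write_file are the real
calls; the file name is just an example. (recent File::Slurp releases
also offer edit_file_lines, which wraps this same pattern.)

  #!/usr/bin/perl
  use strict;
  use warnings;
  use File::Slurp qw( read_file write_file );

  my $file = 'filename';    # example name - use your own

  # read every line into memory at once, then write back only the
  # lines that don't match /foo/ - same effect as the -pi one-liner
  my @lines = read_file( $file );
  write_file( $file, grep { !/foo/ } @lines );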
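and for the files too big to slurp, a plain line by line filter keeps
only one line in memory at a time, so it works on files of any size.
again a sketch, with made-up file names:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $in_name  = 'huge.log';        # example names - use your own
  my $out_name = 'huge.log.new';

  open my $in,  '<', $in_name  or die "can't read $in_name: $!";
  open my $out, '>', $out_name or die "can't write $out_name: $!";

  # copy everything except the lines matching /foo/
  while ( my $line = <$in> ) {
      print {$out} $line unless $line =~ /foo/;
  }

  close $in;
  close $out;

  # swap the filtered copy into place
  rename $out_name, $in_name or die "can't rename $out_name: $!";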
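and here is what File::ReadBackwards looks like in use - once more a
sketch, with a made-up log name. readline() hands you lines starting
from the end of the file, so grabbing a tail is cheap even on a huge
log:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use File::ReadBackwards;

  my $log = 'server.log';    # example name - use your own

  my $bw = File::ReadBackwards->new( $log )
      or die "can't open $log: $!";

  # collect the last 10 lines, restoring the original order
  my @tail;
  while ( defined( my $line = $bw->readline ) ) {
      unshift @tail, $line;
      last if @tail == 10;
  }
  $bw->close;

  print @tail;

uri

-- 
Uri Guttman  ------  u...@stemsystems.com  --------  http://www.sysarch.com --
-----  Perl Code Review , Architecture, Development, Training, Support ------
---------  Gourmet Hot Cocoa Mix  ----  http://bestfriendscocoa.com ---------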