Re: opening a big file

2008-04-21 Thread Chas. Owens
On Mon, Apr 21, 2008 at 11:18 AM, Gunnar Hjalmarsson <[EMAIL PROTECTED]> wrote: > Mr. Shawn H. Corey wrote: > > > The fastest way to do this is to read every line into Perl and disregard > everything not relevant. > > > > Don't think so. > > I did a benchmark on a text file with 100,000 lines, wh

Re: opening a big file

2008-04-21 Thread Gunnar Hjalmarsson
Richard Lee wrote: Gunnar Hjalmarsson wrote: Richard Lee wrote: I don't have root access.. and then compiler which is gcc on sun machine messed up my installation. I tried to install expect and didn't work out. Will gather more information Start here: perldoc -q "own module" thanks..
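The FAQ entry Gunnar points at (perldoc -q "own module") boils down to unpacking a module under a directory you own and telling Perl to search it. A minimal sketch; the directory name is hypothetical:

```perl
use strict;
use warnings;

# Add a user-owned directory to @INC at compile time. Any module
# unpacked under it (e.g. File::ReadBackwards) can then be loaded
# with a plain "use" despite having no root access.
use lib "$ENV{HOME}/perl5/lib";

# Confirm the directory is now on the module search path.
print "ok\n" if grep { $_ eq "$ENV{HOME}/perl5/lib" } @INC;
```

Setting PERL5LIB in the shell achieves the same thing without touching the script.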

Re: opening a big file

2008-04-21 Thread Richard Lee
Gunnar Hjalmarsson wrote: Richard Lee wrote: Gunnar Hjalmarsson wrote: Richard Lee wrote: Unfortunately however, the system I am on, I cannot install any modules other than standard modules that already come with the perl. Assuming you have at least FTP access, you are wrong. Which are the

Re: opening a big file

2008-04-21 Thread Gunnar Hjalmarsson
Richard Lee wrote: Gunnar Hjalmarsson wrote: Richard Lee wrote: Unfortunately however, the system I am on, I cannot install any modules other than standard modules that already come with the perl. Assuming you have at least FTP access, you are wrong. Which are the restrictions? I guess ev

Re: opening a big file

2008-04-21 Thread Richard Lee
Gunnar Hjalmarsson wrote: Richard Lee wrote: Unfortunately however, the system I am on, I cannot install any modules other than standard modules that already come with the perl. Assuming you have at least FTP access, you are wrong. Which are the restrictions? I guess even that, I should lo

Re: opening a big file

2008-04-21 Thread Gunnar Hjalmarsson
Mr. Shawn H. Corey wrote: The fastest way to do this is to read every line into Perl and disregard everything not relevant. Don't think so. I did a benchmark on a text file with 100,000 lines, where I'm actually only interested in the 5 last lines. Except for Tie::File, which proved to be aw

Re: opening a big file

2008-04-21 Thread Richard Lee
beyhan wrote: The key is: use Tie::File "Tie::File" represents a regular text file as a Perl array. Each element in the array corresponds to a record in the file. The first line of the file is element 0 of the array; the second line is element 1, and so on. The

Re: opening a big file

2008-04-21 Thread Gunnar Hjalmarsson
Richard Lee wrote: Unfortunately however, the system I am on, I cannot install any modules other than standard modules that already come with the perl. Assuming you have at least FTP access, you are wrong. Which are the restrictions? -- Gunnar Hjalmarsson Email: http://www.gunnar.cc/cgi-bin/

Re: opening a big file

2008-04-21 Thread beyhan
The key is: use Tie::File "Tie::File" represents a regular text file as a Perl array. Each element in the array corresponds to a record in the file. The first line of the file is element 0 of the array; the second line is element 1, and so on. The file is not loa
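A self-contained sketch of the Tie::File approach beyhan describes; the path and line count are illustrative, and note the later benchmark in this thread found Tie::File slow for this job:

```perl
use strict;
use warnings;
use Tie::File;

# Build a sample file so the sketch runs on its own.
my $path = '/tmp/tie_demo.txt';
open my $out, '>', $path or die "cannot write $path: $!";
print $out "line $_\n" for 1 .. 100_000;
close $out;

# Tie the file to an array; records are fetched lazily rather than
# slurped, though Tie::File still scans for record offsets.
tie my @lines, 'Tie::File', $path or die "cannot tie $path: $!";

# The last five records (Tie::File strips the record separator).
print "$_\n" for @lines[-5 .. -1];

untie @lines;
```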

Re: opening a big file

2008-04-21 Thread Mr. Shawn H. Corey
On Sun, 2008-04-20 at 20:22 -0400, Chas. Owens wrote: > No, you obviously don't know how it is implemented. It seeks to the > end of the file and reads it into a buffer where it searches for line > endings. It does not read the entire file until you reach the first > line. > That's not the poin

Re: opening a big file

2008-04-21 Thread Mr. Shawn H. Corey
On Sun, 2008-04-20 at 18:10 -0400, Richard Lee wrote: > There is no way to read say last 10 MB of the file or something? It's > very surprising why no such thing exists.. > No, it's not. It is a text file, not a fixed-sized record file. There is no way to compute where the lines of text start.
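Shawn's point is that line boundaries cannot be computed in advance; what tail-like tools actually do is seek near the end and throw away the first, probably partial, line. A sketch of that idea, with made-up paths and sizes, assuming lines are much shorter than the byte window:

```perl
use strict;
use warnings;

# Return the complete lines found in roughly the last $bytes of a file.
sub tail_bytes {
    my ($path, $bytes) = @_;
    open my $fh, '<', $path or die "cannot open $path: $!";
    my $size  = -s $fh;
    my $start = $size > $bytes ? $size - $bytes : 0;
    seek $fh, $start, 0 or die "seek failed: $!";
    my @lines = <$fh>;
    shift @lines if $start > 0;   # first line is probably partial
    close $fh;
    return @lines;
}

# Demo on a generated file.
my $path = '/tmp/seek_demo.txt';
open my $out, '>', $path or die "cannot write $path: $!";
print $out "record $_\n" for 1 .. 1000;
close $out;
print tail_bytes($path, 100);
```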

Re: opening a big file

2008-04-21 Thread Mr. Shawn H. Corey
On Sun, 2008-04-20 at 17:02 -0400, Richard Lee wrote: > Chas. Owens wrote: > > On Sun, Apr 20, 2008 at 1:49 PM, Richard Lee <[EMAIL PROTECTED]> wrote: > > snip > > > >> can this be optimized in anyway? > >> open (my $source, '-|', "tail -10 /server/server.log") > >> > >> is this the best

Re: opening a big file

2008-04-21 Thread Jenda Krynicky
From: Richard Lee <[EMAIL PROTECTED]> > Mr. Shawn H. Corey wrote: > > It still has to go through the entire file and mark the offsets to the > > start of every line. > > > > The best way to do this is just to bite the bullet and do it. > > There is no way to read say last 10 MB of the file or some

Re: opening a big file

2008-04-20 Thread Chas. Owens
On Sun, Apr 20, 2008 at 8:55 PM, Mr. Shawn H. Corey <[EMAIL PROTECTED]> wrote: > On Sun, 2008-04-20 at 20:22 -0400, Chas. Owens wrote: > > No, you obviously don't know how it is implemented. It seeks to the > > end of the file and reads it into a buffer where it searches for line > > endings.

Re: opening a big file

2008-04-20 Thread Chas. Owens
On Sun, Apr 20, 2008 at 5:12 PM, David Moreno <[EMAIL PROTECTED]> wrote: > Excerpts from Richard Lee's message of Sun Apr 20 17:02:58 -0400 2008: > > > This looks very useful. > > > > Unfortunately however, the system I am on, I cannot install any modules > > other than standard modules that alr

Re: opening a big file

2008-04-20 Thread Chas. Owens
On Sun, Apr 20, 2008 at 5:55 PM, Mr. Shawn H. Corey <[EMAIL PROTECTED]> wrote: snip > Sadly, even ReadBackwards is no magic bullet. (And BTW, it should be > ReadBackward.) snip No, it is File::ReadBackwards. If you are complaining about the regionalism "backwards", well I bet I can find you us
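For reference, a self-contained sketch of the File::ReadBackwards approach under discussion; the module is on CPAN, not in core, and the path and counts here are made up:

```perl
use strict;
use warnings;
use File::ReadBackwards;   # CPAN module, not core

# Sample file so the sketch runs on its own.
my $path = '/tmp/rb_demo.txt';
open my $out, '>', $path or die "cannot write $path: $!";
print $out "entry $_\n" for 1 .. 1000;
close $out;

# File::ReadBackwards seeks to the end and reads blocks backward,
# so only the tail of a large file is ever touched.
my $bw = File::ReadBackwards->new($path) or die "cannot open $path: $!";
my @last5;
while (defined(my $line = $bw->readline)) {
    unshift @last5, $line;      # keep file order
    last if @last5 == 5;
}
print @last5;
```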

Re: opening a big file

2008-04-20 Thread John W. Krahn
Richard Lee wrote: Mr. Shawn H. Corey wrote: Sadly, even ReadBackwards is no magic bullet. (And BTW, it should be ReadBackward.) It still has to go through the entire file and mark the offsets to the start of every line. The best way to do this is just to bite the bullet and do it. There i

Re: opening a big file

2008-04-20 Thread David Moreno
Excerpts from Richard Lee's message of Sun Apr 20 17:02:58 -0400 2008: > This looks very useful. > > Unfortunately however, the system I am on, I cannot install any modules > other than standard modules that already come with the perl. > But I will try this at my own system. Well, take a deeper

Re: opening a big file

2008-04-20 Thread Richard Lee
Mr. Shawn H. Corey wrote: On Sun, 2008-04-20 at 17:02 -0400, Richard Lee wrote: Chas. Owens wrote: On Sun, Apr 20, 2008 at 1:49 PM, Richard Lee <[EMAIL PROTECTED]> wrote: snip can this be optimized in anyway? open (my $source, '-|', "tail -10 /server/server.log") is t

Re: opening a big file

2008-04-20 Thread Mr. Shawn H. Corey
On Sun, 2008-04-20 at 13:49 -0400, Richard Lee wrote: > can this be optimized in anyway? > open (my $source, '-|', "tail -10 /server/server.log") > > is this the best way to get large portion(well file itself is over 20 > times) of the file into file handle? > This will not optimize process

Re: opening a big file

2008-04-20 Thread Richard Lee
Chas. Owens wrote: On Sun, Apr 20, 2008 at 1:49 PM, Richard Lee <[EMAIL PROTECTED]> wrote: snip can this be optimized in anyway? open (my $source, '-|', "tail -10 /server/server.log") is this the best way to get large portion(well file itself is over 20 times) of the file into file ha

Re: opening a big file

2008-04-20 Thread Chas. Owens
On Sun, Apr 20, 2008 at 1:49 PM, Richard Lee <[EMAIL PROTECTED]> wrote: snip > can this be optimized in anyway? > open (my $source, '-|', "tail -10 /server/server.log") > > is this the best way to get large portion(well file itself is over 20 > times) of the file into file handle? snip Depe
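The pipe-open idiom from the original post can also be written with the list form of open, which skips the shell entirely; a self-contained sketch with a stand-in for /server/server.log:

```perl
use strict;
use warnings;

# Generate a stand-in log so the sketch runs on its own.
my $log = '/tmp/pipe_demo.log';
open my $out, '>', $log or die "cannot write $log: $!";
print $out "req $_\n" for 1 .. 500;
close $out;

# tail(1) does the backward seek; Perl reads only the lines it emits.
# The list form of open avoids shell quoting issues.
open my $source, '-|', 'tail', '-10', $log
    or die "cannot start tail: $!";
while (my $line = <$source>) {
    print $line;              # per-line processing goes here
}
close $source or warn "tail exited with status $?";
```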

Re: opening a big file

2008-04-20 Thread Richard Lee
Mr. Shawn H. Corey wrote: On Sun, 2008-04-06 at 22:36 -0400, Richard Lee wrote: I am trying to open a big file and go through line by line while limiting the resource on the system. What is the best way to do it? Does below read the entire file and store them in memory(not good if that's t

Re: [PHP] opening a big file

2008-04-08 Thread Steve Bertrand
def a typo.. sorry about that No problem at all. Just checking in case the PHP question was missed or something. I know all about typos. I keep typing "funeral" instead of "wedding" for June 28th. LOL. That was a coffee on the monitor minute. Congrats and good luck! Thanks for the l

Re: [PHP] opening a big file

2008-04-07 Thread Daniel Brown
On Mon, Apr 7, 2008 at 11:44 AM, Richard Lee <[EMAIL PROTECTED]> wrote: > Daniel Brown wrote: > > > >Was there a reason this was sent to the PHP list as well? Maybe > > just a typo? > > > def a typo.. sorry about that > No problem at all. Just checking in case the PHP question was miss

Re: [PHP] opening a big file

2008-04-07 Thread Daniel Brown
On Sun, Apr 6, 2008 at 10:36 PM, Richard Lee <[EMAIL PROTECTED]> wrote: > I am trying to open a big file and go through line by line while limiting > the resource on the system. > What is the best way to do it? > > Does below read the entire file and store them in memory(not good if that's > the

Re: [PHP] opening a big file

2008-04-07 Thread Richard Lee
Daniel Brown wrote: On Sun, Apr 6, 2008 at 10:36 PM, Richard Lee <[EMAIL PROTECTED]> wrote: I am trying to open a big file and go through line by line while limiting the resource on the system. What is the best way to do it? Does below read the entire file and store them in memory(not good

Re: opening a big file

2008-04-06 Thread Mr. Shawn H. Corey
On Sun, 2008-04-06 at 22:36 -0400, Richard Lee wrote: > I am trying to open a big file and go through line by line while > limiting the resource on the system. > What is the best way to do it? > > Does below read the entire file and store them in memory(not good if > that's the case).. > > open

Re: opening a big file

2008-04-06 Thread Chas. Owens
On Sun, Apr 6, 2008 at 10:36 PM, Richard Lee <[EMAIL PROTECTED]> wrote: > I am trying to open a big file and go through line by line while limiting > the resource on the system. > What is the best way to do it? > > Does below read the entire file and store them in memory(not good if that's > the

Re: opening a big file

2008-04-06 Thread Richard Lee
Richard Lee wrote: I am trying to open a big file and go through line by line while limiting the resource on the system. What is the best way to do it? Does below read the entire file and store them in memory(not good if that's the case).. open(SOURCE, "/tmp/file") || die "not there: $!\n";

opening a big file

2008-04-06 Thread Richard Lee
I am trying to open a big file and go through line by line while limiting the resource on the system. What is the best way to do it? Does below read the entire file and store them in memory(not good if that's the case).. open(SOURCE, "/tmp/file") || die "not there: $!\n"; while (<SOURCE>) { ## do som
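As the replies above explain, the while loop in the post does not slurp the file: the readline operator returns one line per iteration, so memory stays bounded no matter how large the file is. A self-contained sketch with a generated stand-in for /tmp/file and a lexical filehandle:

```perl
use strict;
use warnings;

# Generate a sample file standing in for /tmp/file.
my $path = '/tmp/bigfile_demo.txt';
open my $out, '>', $path or die "cannot write $path: $!";
print $out "data $_\n" for 1 .. 1000;
close $out;

# One line per iteration; memory use is flat regardless of file size.
open my $source, '<', $path or die "not there: $!\n";
my $count = 0;
while (my $line = <$source>) {
    chomp $line;
    $count++;                # do something with $line here
}
close $source;
print "processed $count lines\n";
```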