Can you use some sort of 'tail' command to get the last lines?
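If the log is reachable as a mounted path, the File::ReadBackwards module from
CPAN does roughly that: it reads the file line by line starting from the end, so
you never touch the old lines. A rough, untested sketch -- the path and the date
format are just placeholders, and whether it really avoids pulling the whole file
across the LAN depends on how the share is mounted:

    use strict;
    use warnings;
    use File::ReadBackwards;   # CPAN module: reads a file line by line, end first

    # Placeholder path and date format -- adjust them to the real log.
    my $log   = '/mnt/server/logs/app.log';
    my @t     = localtime;
    my $today = sprintf '%04d-%02d-%02d', $t[5] + 1900, $t[4] + 1, $t[3];

    my $bw = File::ReadBackwards->new($log)
        or die "Can't open $log: $!";

    my @lines;
    while (defined(my $line = $bw->readline)) {
        last unless $line =~ /^\Q$today\E/;   # stop at the first line from an earlier day
        unshift @lines, $line;                # keep the lines in their original order
    }
    print @lines;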

DA

On 7/16/05, Wiggins d'Anconia <[EMAIL PROTECTED]> wrote:
> Octavian Rasnita wrote:
> > Hi,
> >
> > I need to create a program which reads a file from the LAN. That file is
> > continuously updated (a kind of log), so it keeps growing and can become
> > very big.
> >
> > On every run, I need to read just the lines for the current day (the date
> > is given in each line) and skip the lines from previous days.
> >
> > The problem is that if I just open() the file and check every line to see
> > whether it is one I need, it takes a very long time, and I need a faster
> > solution.
> >
> > Checking each line is slow by itself, but I think it also means downloading
> > the whole content of the file to my computer, which wastes even more time.
> >
> > Do you have any idea what I can do to make the program run faster?
> >
> > I am thinking of making the program so that, after the first run, it stores
> > the line number of the first line from the current day, or the byte offset
> > where that line starts, and then on later runs it jumps directly to that
> > line somehow (?), or directly to that offset (using seek). But I don't know
> > whether, in that case, the whole content of the file would still be
> > downloaded to my computer.
> >
> > Do you have any idea how I can download just a part of a file from the
> > network?
> >
> 
> If my understanding is correct, that would have to be something built into
> whatever network protocol you are using. Your seek idea should be correct.
> I know, for instance, that modern FTP servers can resume a transfer from an
> offset within a file, in which case the server only sends the remaining
> portion of the file on a get (this is how pause/resume downloaders are
> implemented), and some HTTP servers have the same capability (range
> requests). You should check the protocol docs to see whether it supports
> this. Assuming the protocol is advanced enough, it should respect a seek and
> not send the whole file. The best thing to do in a case like this is to set
> up some network monitoring and just try it: open a large file, seek to a
> location near the end, and see whether the network gets slammed or only a
> small portion of the file is transferred.
> 
> perldoc -f seek
> 
> For more info.
> 
> > Help!
> >
> > Thank you.
> >
> > Teddy
> >
> 
> Good luck,
> 
> http://danconia.org
> 
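For the save-the-offset-and-seek idea discussed above, a rough, untested sketch.
It assumes the log is reachable as a locally mounted path (e.g. an SMB or NFS
share); the paths are just placeholders:

    use strict;
    use warnings;

    my $log   = '/mnt/server/logs/app.log';   # placeholder: the shared log file
    my $state = "$ENV{HOME}/.logpos";          # placeholder: where the offset is remembered

    # Byte offset saved by the previous run (0 on the first run).
    my $offset = 0;
    if (open my $st, '<', $state) {
        my $saved = <$st>;
        close $st;
        if (defined $saved) {
            chomp $saved;
            $offset = $saved;
        }
    }

    open my $fh, '<', $log or die "Can't open $log: $!";
    seek $fh, $offset, 0 or die "Can't seek to $offset: $!";   # 0 means SEEK_SET

    while (my $line = <$fh>) {
        # Only the lines added since the last run arrive here;
        # check the date field and process each line as needed.
        print $line;
    }

    # Remember where we stopped so the next run can pick up from here.
    my $pos = tell $fh;
    close $fh;

    open my $out, '>', $state or die "Can't write $state: $!";
    print {$out} "$pos\n";
    close $out;

Whether this actually avoids transferring the whole file depends on the
filesystem or protocol behind the mount, which is exactly what the
network-monitoring test described above would tell you.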
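If the log happens to be served over HTTP rather than a file share, the same
"only fetch what's new" trick can be done with a Range request: a server that
supports ranges replies with 206 Partial Content and only the bytes past your
saved offset. A rough sketch with LWP; the URL and offset are placeholders:

    use strict;
    use warnings;
    use LWP::UserAgent;

    my $url    = 'http://server.example.com/logs/app.log';   # placeholder URL
    my $offset = 1_234_567;                                   # byte offset saved from the last run

    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get($url, 'Range' => "bytes=$offset-");

    if ($res->code == 206) {           # 206 Partial Content: the server honoured the Range header
        print $res->content;           # only the new part of the file
    } elsif ($res->is_success) {
        warn "Server ignored the Range header and sent the whole file\n";
        print $res->content;
    } else {
        die 'Request failed: ', $res->status_line, "\n";
    }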

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>

