Magnus Lycka wrote:
> To read the last x bytes of a file, you could do:
>
> >>> import os
> >>> x = 2000 # or whatever...
> >>> f=open('my_big_file')
> >>> l=os.fstat(f.fileno()).st_size
> >>> f.seek(l-x)
> >>> f.read()
You don't need fstat/st_size; you can ask seek to move to an offset
relative to the end of the file.
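A minimal sketch of that variant (the helper name is illustrative; the file must hold at least x bytes, and it must be opened in binary mode, since seeking relative to the end is not allowed on text-mode files in Python 3):

```python
import os

def tail_bytes(path, x):
    """Return the last x bytes of the file at path."""
    with open(path, 'rb') as f:
        # whence=os.SEEK_END makes the offset relative to end-of-file,
        # so the file size never needs to be looked up with fstat().
        f.seek(-x, os.SEEK_END)
        return f.read()
```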
On Thu, 08 Dec 2005 02:09:58 -0500, Mike Meyer <[EMAIL PROTECTED]> wrote:
> [...]
Gerald Klix <[EMAIL PROTECTED]> wrote:
> As long as memory mapped files are available, the fastest
> method is to map the whole file into memory and use the
> mappings rfind method to search for an end of line.
Actually mmap doesn't appear to have an rfind method :-(
Here is a tested solution
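(The tested solution itself is cut off in this archive. For what it's worth, mmap did gain an rfind method in Python 2.6, so on a modern interpreter the mmap approach works directly. A sketch, with an illustrative function name, assuming a non-empty file:)

```python
import mmap

def last_lines(path, nlines):
    """Return the last nlines of a non-empty file, using mmap.rfind."""
    with open(path, 'rb') as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            end = len(mm)
            if mm[end - 1:end] == b'\n':      # ignore a trailing newline
                end -= 1
            pos = end
            # Walk backwards one newline at a time.
            for _ in range(nlines):
                pos = mm.rfind(b'\n', 0, pos)
                if pos < 0:                   # fewer than nlines in the file
                    return mm[:end].decode()
            return mm[pos + 1:end].decode()
```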
Gerald Klix <[EMAIL PROTECTED]> wrote:
> As long as memory mapped files are available, the fastest
> method is to map the whole file into memory and use the
> mappings rfind method to search for an end of line.
Excellent idea.
It'll blow up for files larger than 2 GB on a 32-bit OS though, since the whole file has to fit into the process's address space.
--
Nick C
[EMAIL PROTECTED] writes:
> [...]
As long as memory mapped files are available, the fastest
method is to map the whole file into memory and use the
mapping's rfind method to search for an end of line.
The following code snippet may be useful:

import os, mmap

reportFile = open( filename )
length = os.fstat( reportFile.fileno() ).st_size
mapping = mmap.mmap( reportFile.fileno(), length, access=mmap.ACCESS_READ )
Mike Meyer wrote:
> It would probably be more efficient to read blocks backwards and paste
> them together, but I'm not going to get into that.
>
That actually is a pretty good idea: just reverse the buffer and do a
split; the last line becomes the first line and so on. The logic then
would be no
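A sketch of that block-wise approach (the helper name and block size are illustrative; blocks are read from the end and pasted together until enough newlines have been collected):

```python
import os

def tail(path, nlines, blocksize=8192):
    """Return the last nlines of the file at path as a list of strings."""
    with open(path, 'rb') as f:
        f.seek(0, os.SEEK_END)
        pos = f.tell()
        data = b''
        # Read blocks backwards from the end, prepending each one,
        # until the accumulated data contains more than nlines newlines.
        while pos > 0 and data.count(b'\n') <= nlines:
            step = min(blocksize, pos)
            pos -= step
            f.seek(pos)
            data = f.read(step) + data
        return [line.decode() for line in data.splitlines()[-nlines:]]
```

Only roughly nlines worth of blocks are ever read, so a 200 MB file costs no more than a small one.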
hi
I have a file which is very large eg over 200Mb , and i am going to use
python to code a "tail"
command to get the last few lines of the file. What is a good algorithm
for this type of task in python for very big files?
Initially, i thought of reading everything into an array from the file
and