On Tuesday, January 14, 2014 7:33:08 PM UTC+5:30, Chris Angelico wrote:
> On Wed, Jan 15, 2014 at 12:50 AM, Ayushi Dalmia
> <ayushidalmia2...@gmail.com> wrote:
> > I need to write into a file for a project which will be evaluated on
> > the basis of time. What is the fastest way to write 200 MB of data,
> > accumulated as a list, into a file?
> >
> > Presently I am using this:
> >
> > with open('index.txt','w') as f:
> >     f.write("".join(data))
> >     f.close()
>
> with open('index.txt','w') as f:
>     for hunk in data:
>         f.write(hunk)
>
> You don't need to f.close() - that's what the 'with' block guarantees.
> Iterating over data and writing each block separately means you don't
> have to first build up a 200 MB string. After that, your performance is
> going to be mainly tied to the speed of your disk, not anything that
> Python can affect.
>
> ChrisA

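[The loop above can also be written with the standard library shorthand f.writelines(); a minimal equivalent sketch, assuming data is a list of strings:

with open('index.txt', 'w') as f:
    # Writes each element of the iterable in turn; no big intermediate
    # string is built. Despite the name, no newlines are added.
    f.writelines(data)
]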

Thanks for the tip about closing the file. I did not know that the 'with'
block ensures the file is closed once the block exits.

Which is faster: building the whole 200 MB string and dumping it into the
file in one go, or splitting it into chunks and writing those chunks one by
one? Won't writing the chunks trigger more I/O operations?
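
[One way to settle this empirically is to time both approaches on data of the same size. A rough sketch, assuming Python 3.3+ for time.perf_counter; the synthetic data and output file names here are made up for illustration:

import time

# Synthetic stand-in for the real data: 200 chunks of 1 MB each.
data = ["x" * (1024 * 1024)] * 200

start = time.perf_counter()
with open('joined.txt', 'w') as f:
    f.write("".join(data))   # build one 200 MB string, then a single write call
print("join then write: %.2f s" % (time.perf_counter() - start))

start = time.perf_counter()
with open('chunked.txt', 'w') as f:
    for hunk in data:
        f.write(hunk)        # many write calls, no big intermediate string
print("write per chunk: %.2f s" % (time.perf_counter() - start))

As for the I/O worry: Python file objects are buffered, so f.write() calls do not map one-to-one onto system calls; chunks accumulate in an internal buffer and are flushed to disk in larger blocks. The join version instead pays for a temporary ~200 MB string on top of the list, roughly doubling peak memory, so the chunked loop is usually no slower and uses far less memory.]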