I use Python to analyze my data, which is in text form. The script is fairly simple: it reads a line from the input file, computes what it must compute, and then writes the result to a buffer/list. When the whole input file has been processed (essentially all lines), the algorithm goes ahead and writes the results one by one to the output file. It works fine, but because of the continuous I/O it takes a lot of time to execute.

I think the output phase is more or less optimized (a loop that reads the solutions list sequentially and puts "\n" at the appropriate intervals). Do you know a way to load my data in a more "batch-like" way, so that I avoid the constant line-by-line reading? I guess I could read and store the whole text in a list, with each cell being a line, and then process each line one by one again, but I don't really think that would offer a significant time gain.

Thanks in advance for taking the time to read this.

Pantelis
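For reference, the batch-style approach described above could be sketched roughly like this; the file names and the process_line() function are placeholders for the real computation, not part of the original script:

```python
def process_line(line):
    # Placeholder for the real per-line computation.
    return line.upper()

def run(in_path="input.txt", out_path="output.txt"):
    # Read the whole file into memory in one call instead of line by line.
    with open(in_path) as f:
        lines = f.read().splitlines()

    # Process every line from the in-memory list.
    results = [process_line(line) for line in lines]

    # Write all results in a single call, joining with newlines,
    # instead of writing them one by one.
    with open(out_path, "w") as f:
        f.write("\n".join(results) + "\n")

if __name__ == "__main__":
    run()
```

Whether this saves much time depends on where the bottleneck really is; Python's file objects already buffer reads internally, so the per-line loop is often not as expensive as it looks.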
-- http://mail.python.org/mailman/listinfo/python-list