psaff...@googlemail.com wrote:
> I'm building a pipeline involving a number of shell tools. In each
> case, I create a temporary file using tempfile.mkstemp() and invoke a
> command ("cmd < /tmp/tmpfile") on it using subprocess.Popen.
>
> At the end of each section, I call close() on the file handle
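A minimal sketch of that setup, assuming the temp file is fully written and
closed before the command reads it (the command name "sort" and the data
written are placeholders, not the poster's actual pipeline):

import os
import subprocess
import tempfile

# mkstemp() returns an OS-level file descriptor and the path of the new
# temporary file; wrap the descriptor so it can be written and closed
# like a normal file object.
fd, path = tempfile.mkstemp()
try:
    with os.fdopen(fd, "w") as tmp:
        tmp.write("intermediate data for the next stage\n")

    # Equivalent of: cmd < /tmp/tmpfile
    with open(path, "rb") as stdin:
        subprocess.Popen(["sort"], stdin=stdin).wait()
finally:
    os.remove(path)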
Thank you everyone,
I ended up using a solution similar to what Gary Herron suggested:
caching the output to a list of lists, one per file, and only doing the
IO when the list reaches a certain threshold.
After playing around with the list threshold I ended up with faster
execution times than
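A rough sketch of that buffering scheme (the flush threshold, helper names,
and the dict keyed by output filename are illustrative, not the poster's
actual code):

from collections import defaultdict

FLUSH_AT = 1000                 # lines buffered per file before writing
buffers = defaultdict(list)     # output filename -> pending lines

def add_line(filename, line):
    buf = buffers[filename]
    buf.append(line)
    if len(buf) >= FLUSH_AT:
        flush(filename)

def flush(filename):
    # One open/append/close per FLUSH_AT lines instead of per line.
    with open(filename, "a") as f:
        f.writelines(buffers[filename])
    del buffers[filename]

def flush_all():
    # Called once at the end to write out whatever is still buffered.
    for filename in list(buffers):
        flush(filename)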
On Mon, 04 Feb 2008 12:50:15 -0200, Christian Heimes <[EMAIL PROTECTED]> wrote:
> Jeff wrote:
>> Why don't you start around 50 threads at a time to do the file
>> writes? Threads are effective for IO. You open the source file,
>> start a queue, and start sending data sets to be written to the queue.
AMD wrote:
> Hello,
>
> I need to split a very big file (10 gigabytes) into several thousand
> smaller files according to a hash algorithm, I do this one line at a
> time. The problem I have is that opening a file using append, writing
> the line and closing the file is very time consuming. I'd
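The pattern being described (hash a line to pick its target file, then open
in append mode, write the one line, and close again) looks roughly like this;
the source filename, bucket count, and use of the built-in hash() are
placeholders:

NUM_BUCKETS = 3000              # "several thousand" output files

with open("bigfile.txt") as src:            # placeholder source name
    for line in src:
        bucket = hash(line) % NUM_BUCKETS   # stand-in for the real hash
        # One open/write/close per input line: correct, but the
        # constant open/close overhead is what makes it so slow.
        out = open("bucket_%04d.txt" % bucket, "a")
        out.write(line)
        out.close()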
Steven D'Aprano <[EMAIL PROTECTED]> wrote:
> On Mon, 04 Feb 2008 13:57:39 +0100, AMD wrote:
>
>> The problem I have under windows is that as soon as I get to 500 files I
>> get the Too many open files message. I tried the same thing in Delphi
>> and I can get to 3000 files. How can I increase the number of open files
>> in Python?
Jeff wrote:
> Why don't you start around 50 threads at a time to do the file
> writes? Threads are effective for IO. You open the source file,
> start a queue, and start sending data sets to be written to the
> queue. Your source file processing can go on while the writes are
> done in other threads.
On Mon, 04 Feb 2008 13:57:39 +0100, AMD wrote:
> The problem I have under windows is that as soon as I get to 500 files I
> get the Too many open files message. I tried the same thing in Delphi
> and I can get to 3000 files. How can I increase the number of open files
> in Python?
Windows XP has
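For reference, a small sketch that simply opens files until the OS or C
runtime refuses, which shows where a given platform's per-process limit
actually bites (limits differ between the Windows C runtime and Unix
ulimits, so the roughly-500 figure above is not universal):

import os
import shutil
import tempfile

# Open files until the OS or C runtime refuses, to see the effective
# per-process limit for this interpreter and platform.
tmpdir = tempfile.mkdtemp()
handles = []
try:
    while True:
        handles.append(open(os.path.join(tmpdir, "f%d" % len(handles)), "w"))
except OSError as exc:
    print("stopped after %d open files: %s" % (len(handles), exc))
finally:
    for h in handles:
        h.close()
    shutil.rmtree(tmpdir)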
Why don't you start around 50 threads at a time to do the file
writes? Threads are effective for IO. You open the source file,
start a queue, and start sending data sets to be written to the
queue. Your source file processing can go on while the writes are
done in other threads.
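A sketch of that arrangement, combining the queue-of-write-jobs idea with the
per-file batching discussed earlier; the 50-writer count comes from the
suggestion above, while the batch size, bucket naming, and source filename
are illustrative:

import threading
import queue
from collections import defaultdict

NUM_WRITERS = 50      # the count suggested above
NUM_BUCKETS = 3000    # illustrative
BATCH = 500           # lines queued per write job
STOP = object()       # sentinel telling a writer to exit

jobs = queue.Queue()  # (filename, list_of_lines) work items

def writer():
    # Each worker appends whole batches, so the open/close cost is
    # paid once per batch instead of once per line.
    while True:
        job = jobs.get()
        if job is STOP:
            break
        filename, lines = job
        with open(filename, "a") as f:
            f.writelines(lines)

workers = [threading.Thread(target=writer) for _ in range(NUM_WRITERS)]
for w in workers:
    w.start()

pending = defaultdict(list)
with open("bigfile.txt") as src:            # placeholder source name
    for line in src:
        name = "bucket_%04d.txt" % (hash(line) % NUM_BUCKETS)
        pending[name].append(line)
        if len(pending[name]) >= BATCH:
            jobs.put((name, pending.pop(name)))

for name, lines in pending.items():        # flush the stragglers
    jobs.put((name, lines))
for _ in workers:
    jobs.put(STOP)
for w in workers:
    w.join()

One caveat with this naive version: two workers can end up appending to the
same bucket file at the same time, so a real implementation would pin each
bucket to one worker or serialise appends with a lock.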