On Wed, 18 Mar 2009 11:50:32 -0200, venutaurus...@gmail.com <venutaurus...@gmail.com> wrote:

On Mar 18, 6:35 pm, Peter Otten <__pete...@web.de> wrote:
venutaurus...@gmail.com wrote:
> Hello all,
>           I have an application where I need to create 50K files spread
> uniformly across 50 folders in Python. The content can be the name of the
> file itself repeated 10 times. I wrote the code using plain for loops,
> but it has been taking hours to finish. Can someone please share code for
> doing this with multithreading? As I am new to Python, I know
> little about threading concepts.

I've just tried it; creating the 50,000 text files took 17 seconds on my
not-so-fast machine. Python is certainly not the bottleneck here.

Really...!!!! I just can't believe it. I am running my script on an
Intel Core2 Duo machine with 2GB of RAM. I started it two hours
ago, and it is still running. This is what my code looks like:

You're creating 500000 processes and 500000 shells, just to call 'echo' (!):
50 folders x 1000 files x 10 lines = 500,000 os.system calls, each one
spawning a new shell process.

def createFiles(path):
    m.write(strftime("%Y-%m-%d %H:%M:%S") + " Creating files in the folder " + path + "\n")
    global c
    global d
    os.chdir (path)
    k = 1
    for k in range (1,1001):
        p = "%.2d"%(k)
        FName = "TextFile"+c+"_"+d+"_"+p+".txt"
        l =1
        for l in range(1 , 11):
            os.system ("\"echo "+FName+" >> "+FName+"\"")
            l = l +1
        k = k+1

The inner loop, shell calls and all, can be replaced with a single plain-Python
write (keeping the trailing newline that echo adds):

with open(FName, "w") as f:
    for i in range(10):
        f.write(FName + "\n")
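For reference, a minimal sketch of the whole job done with plain file writes
only. The folder and file naming below is made up for illustration; adapt it to
whatever your c/d scheme is. A loop like this finishes in seconds, because no
shell or child process is ever spawned:

import os
from time import strftime

def create_files(root, folders=50, files_per_folder=1000, repeats=10):
    # Create `folders` directories under `root`, each holding
    # `files_per_folder` text files whose content is the file name
    # repeated `repeats` times -- no shells, no external processes.
    for d in range(1, folders + 1):
        folder = os.path.join(root, "Folder%02d" % d)      # naming is made up
        if not os.path.isdir(folder):
            os.makedirs(folder)
        for n in range(1, files_per_folder + 1):
            fname = "TextFile%02d_%04d.txt" % (d, n)       # naming is made up
            with open(os.path.join(folder, fname), "w") as f:
                f.write((fname + "\n") * repeats)

if __name__ == "__main__":
    print(strftime("%Y-%m-%d %H:%M:%S") + " starting")
    create_files("testdata")
    print(strftime("%Y-%m-%d %H:%M:%S") + " done")

Each file is opened once and written with a single call; that alone removes all
the per-file process-creation overhead your version was paying for.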

--
Gabriel Genellina
