Re: Best approach to create humongous amount of files

2015-05-21 Thread Cem Karan
On May 20, 2015, at 7:44 AM, Parul Mogra wrote:
> Hello everyone,
> My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name, possibly by incorporating time stamp information. The file

Re: Best approach to create humongous amount of files

2015-05-21 Thread Peter Otten
Mario R. Osorio wrote:
> On Wednesday, May 20, 2015 at 2:09:59 PM UTC-4, Denis McMahon wrote:
>> On Wed, 20 May 2015 17:14:15 +0530, Parul Mogra wrote:
>>
>> > Hello everyone,
>> > My objective is to create a large number of data files (say a million *.json files), using a pre-existing templat

Re: Best approach to create humongous amount of files

2015-05-21 Thread Mario R. Osorio
On Wednesday, May 20, 2015 at 2:09:59 PM UTC-4, Denis McMahon wrote:
> On Wed, 20 May 2015 17:14:15 +0530, Parul Mogra wrote:
>
> > Hello everyone,
> > My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file

Re: Best approach to create humongous amount of files

2015-05-20 Thread Denis McMahon
On Wed, 20 May 2015 17:14:15 +0530, Parul Mogra wrote:
> Hello everyone,
> My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name, possibly by incorporating time stamp information. The

Re: Best approach to create humongous amount of files

2015-05-20 Thread Tim Chase
On 2015-05-20 17:59, Peter Otten wrote:
> Tim Chase wrote:
> > wordlist[:] = [  # just lowercase all-alpha words
> >     word
> >     for word in wordlist
> >     if word.isalpha() and word.islower()
> > ]
>
> Just a quick reminder: if the data is user-provided you have to
> sanitize it:

Thu
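
The comprehension quoted above keeps only lowercase, purely alphabetic words, replacing the list's contents in place via slice assignment. A minimal runnable sketch of the same idea (the sample word list here is invented for illustration; it is not from the thread):

wordlist = ["apple", "Banana", "cherry2", "date", "fig"]

# Slice assignment mutates the existing list object instead of rebinding
# the name, so any other reference to the same list sees the filtered result.
wordlist[:] = [
    word
    for word in wordlist
    if word.isalpha() and word.islower()
]

print(wordlist)  # ['apple', 'date', 'fig']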

Re: Best approach to create humongous amount of files

2015-05-20 Thread Peter Otten
Tim Chase wrote:
> On 2015-05-20 22:58, Chris Angelico wrote:
>> On Wed, May 20, 2015 at 9:44 PM, Parul Mogra wrote:
>> > My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name

Re: Best approach to create humongous amount of files

2015-05-20 Thread paul . anton . letnes
There's a module called "template" that I've used before, for the find/replace part. I never investigated its performance, but my script used less than 1 s for 100 files IIRC :-)

Paul
--
https://mail.python.org/mailman/listinfo/python-list
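
The message does not say which "template" module is meant; the standard library's string.Template does this sort of $-placeholder find/replace and may be what was intended. A minimal sketch under that assumption (the placeholder names and template text are made up for illustration):

import string

# A tiny JSON-shaped template with $-style placeholders (illustrative only).
tmpl = string.Template('{"id": "$file_id", "created": "$timestamp"}')

# substitute() raises an error for missing placeholders;
# safe_substitute() would leave them untouched instead.
print(tmpl.substitute(file_id="0000001", timestamp="2015-05-20T12:00:00"))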

Re: Best approach to create humongous amount of files

2015-05-20 Thread Tim Chase
On 2015-05-20 22:58, Chris Angelico wrote:
> On Wed, May 20, 2015 at 9:44 PM, Parul Mogra wrote:
> > My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name, possibly by incorpo

Re: Best approach to create humongous amount of files

2015-05-20 Thread Chris Angelico
On Wed, May 20, 2015 at 9:44 PM, Parul Mogra wrote:
> My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name, possibly by incorporating time stamp information. The files have to be gen

Best approach to create humongous amount of files

2015-05-20 Thread Parul Mogra
Hello everyone,
My objective is to create a large number of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name, possibly by incorporating time stamp information. The files have to be generated in a specified folder. What is the be
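
The question is cut off in this archive view, but the gist, filling a template repeatedly and writing each result to a uniquely named file in a given folder, can be sketched as below. The folder name, file-name scheme, and placeholders are assumptions for illustration, not details from the original post:

import os
import string
import time

# Illustrative template and target folder; not the poster's actual data.
tmpl = string.Template('{"index": $index, "created": "$stamp"}')
out_dir = "generated"
os.makedirs(out_dir, exist_ok=True)

for i in range(1000):  # scale the count up once the naming scheme works
    # A sequence number plus a timestamp keeps names unique even when
    # many files are written within the same second.
    name = "data_%07d_%d.json" % (i, int(time.time()))
    with open(os.path.join(out_dir, name), "w") as f:
        f.write(tmpl.substitute(index=i, stamp=time.strftime("%Y-%m-%dT%H:%M:%S")))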