On 3/19/07, Adonis Vargas <[EMAIL PROTECTED]> wrote:
> Actually, I re-ran this in a terminal and it worked perfectly. I was
> using IDLE to write this code, kinda peculiar. Maybe something to do
> with IDLE and CSV (or writing to files) with lines > ~1000. A socket
> timing out maybe?
It's because the output file is never closed. The csv writer hands rows to the file object's buffer, and the tail of that buffer only reaches the disk when the file is flushed or closed. Run from a terminal, the interpreter exits when the script finishes and the file is flushed on shutdown; under IDLE the process (and the open file object) stays alive, so the last chunk of rows just sits in the buffer. A short sketch after the quoted code below shows one way to make sure the file gets closed.
[EMAIL PROTECTED] wrote:
>>
>> -- code --
>>
>> def _scan(self):
>>     outFile = file("mp3.dat", "wb")
>>     outCSV = csv.writer(outFile)
>>     output = list()
>>     for root, dirs, files in os.walk(self.directory):
>>         files = [x for x in files if x.endswith(".mp3")]
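The quoted method opens mp3.dat but, at least in the part shown, never closes it, so whatever csv.writer has buffered when _scan() returns only hits the disk when the interpreter exits. A minimal sketch of the same walk with an explicit close follows; it is a plain function rather than a method, the tag-reading step is left as a comment, and the placeholder row is an assumption rather than the original code:

    import csv
    import os

    def scan(directory, out_path="mp3.dat"):
        # Same walk as the quoted _scan(), but the file is closed
        # explicitly so every buffered row reaches the disk.
        out_file = open(out_path, "wb")        # binary mode for csv on Python 2
        try:
            writer = csv.writer(out_file)
            for root, dirs, files in os.walk(directory):
                for name in files:
                    if not name.endswith(".mp3"):
                        continue
                    path = os.path.join(root, name)
                    # ... read the ID3 tags here and build the real row ...
                    writer.writerow([path])    # placeholder row
        finally:
            out_file.close()                   # the flush happens here, IDLE or not

On Python 2.6+ (or 2.5 with "from __future__ import with_statement") the try/finally can be replaced with "with open(out_path, 'wb') as out_file:", which closes the file even when an exception is raised.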
On Mar 19, 2:20 pm, Adonis Vargas <[EMAIL PROTECTED]>
wrote:
> I am writing a program that walks a directory full of mp3s and reads
> their ID3 data (using Mutagen); this part works perfectly. The problem
> is that I write these tags to a CSV file through the csv module, but
> when I read the file back, the last rows are missing.
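For the tag-reading side the post mentions Mutagen; a minimal sketch of pulling artist and title from one file through Mutagen's "easy" interface looks roughly like the following (the "<unknown>" fallbacks and the choice of keys are assumptions, not taken from the original program):

    import mutagen

    def read_tags(path):
        # mutagen.File() auto-detects the container; easy=True exposes the
        # ID3 frames under plain keys such as 'artist', 'album' and 'title'.
        audio = mutagen.File(path, easy=True)
        if audio is None:                       # not a file Mutagen recognises
            return None
        # Each key maps to a list of values; take the first, or a placeholder.
        artist = audio.get("artist", ["<unknown>"])[0]
        title = audio.get("title", ["<unknown>"])[0]
        return artist, title

Each tuple returned this way can be passed straight to writer.writerow() in the loop above.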
This CGI script writes a different file for each request: request 1 will
write 1.html, request 2 will write 2.html, and so on.
It is not updating a single existing file.
For example, create.py processes a form and writes a user.html file, and
every user is unique.
So if 10 users fill in the form at the same time and click the submit
button, can create.py handle all of them without the writes interfering
with each other?
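One way to keep simultaneous submissions from ever colliding is to fold something request-specific into the filename. A minimal sketch under that assumption follows; the "username" field name and the page content are made up for illustration, since the real create.py isn't shown:

    #!/usr/bin/env python
    import cgi
    import os
    import time

    form = cgi.FieldStorage()
    user = form.getfirst("username", "anonymous")     # assumed field name

    # Each CGI request runs in its own process, so the pid plus a
    # timestamp makes the name unique even for simultaneous submits.
    fname = "%s-%d-%d.html" % (user, os.getpid(), int(time.time()))

    fp = open(fname, "w")
    try:
        fp.write("<html><body>Hello %s</body></html>\n" % user)
    finally:
        fp.close()

    print "Content-Type: text/plain"
    print
    print "wrote %s" % fname

If the usernames really are guaranteed unique, the pid/timestamp suffix is just insurance; the important point is that no two concurrent requests ever open the same path for writing.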
On Sun, Apr 09, 2006 at 12:35:21AM -0700, [EMAIL PROTECTED] wrote:
> I have a CGI script on the server which processes a form and writes its
> content to a file, like:
>
>     fp = open(fname, 'w')
>     fp.write('Cool list%s%s ...
>
> It's working fine, but will it work if the script receives thousands of
> requests simultaneously?
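Whether this holds up under load mostly comes down to two requests never writing the same fname at the same moment. When that can't be ruled out, one common pattern is to write to a temporary file and rename it over the target, so nobody ever reads a half-written page. A minimal sketch of that pattern, with the form handling left out and the content string as a stand-in:

    import os
    import tempfile

    def write_atomically(fname, content):
        # Write into a temp file in the same directory, then rename it over
        # the target: on POSIX the rename is atomic, so a concurrent reader
        # (or another request) never sees a partially written file.
        dirname = os.path.dirname(os.path.abspath(fname))
        fd, tmp_path = tempfile.mkstemp(dir=dirname)
        fp = os.fdopen(fd, 'w')
        try:
            fp.write(content)
        finally:
            fp.close()
        os.rename(tmp_path, fname)

    # e.g. write_atomically('user.html', 'Cool list ...')

If two requests do target the same fname, the last rename simply wins; giving every request its own filename, as discussed above, is still the simpler guarantee.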