On Thu, Dec 02, 2004 at 08:39:31PM -0800, Kevin wrote:
> Hello All,
>
> I wanted to thank Roger Binn for his email. He had
> the answer to my issue with writing speed. It actually
> made an incredible difference in performance. I didn't
> have to go all the way to implementing the synchronous
> mode (for my app). Previously, I was inserting one
> record at a time. The key was to write them all at
> once. I moved up to a 13 meg file and wrote it to the
> db in seconds. Now the issue is the 120 meg of RAM
> consumed by PyParse to read in a 13 meg file. If anyone
> has thoughts on that, it would be great. Otherwise, I
> will repost under a more specific subject line.
>
> Thanks,
> Kevin
>
>
>
> db.execute("begin")
>
> while i < TriNum
> db.execute("""insert into TABLE(V1_x)
> values(%f),""" (data[i]))
> i = i + 1If you're using pysqlite 2.0alpha, then .executemany() will boost performance *a lot*. For pysqlite 1.x, unfortunately, it won't make any difference. But generally, .executemany() is a good idea. Also note that the preferred way of using transactions is to let the DB-API adapter BEGIN the connection for you, then invoke .commit() on the connection object. Sending BEGIN/ROLLBACK/COMMIT via .execute() is bad. -- Gerhard
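
For illustration, a minimal sketch of that pattern, assuming pysqlite
2.x and its qmark parameter style. The tri_data table and the data
list are stand-ins for Kevin's actual setup (his table name TABLE is
an SQL keyword, so it is renamed here); V1_x is his column:

    from pysqlite2 import dbapi2 as sqlite

    data = [0.1, 0.2, 0.3]   # stand-in for the values parsed from the file

    con = sqlite.connect(":memory:")
    cur = con.cursor()
    cur.execute("create table tri_data (V1_x real)")

    # One .executemany() call replaces the per-row .execute() loop;
    # the DB-API adapter opens the transaction implicitly, so there
    # is no explicit "begin".
    cur.executemany("insert into tri_data (V1_x) values (?)",
                    [(v,) for v in data])

    # Commit on the connection object instead of sending COMMIT
    # through .execute().
    con.commit()

With .executemany() the adapter can prepare the statement once and
rebind parameters for each row, and the ?-style binding avoids
building SQL strings with %f by hand.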

-- Gerhard
