PySQLite Speed
Kent Johnson
kent3737 at yahoo.com
Fri Dec 3 06:06:11 EST 2004
Kevin wrote:
> Hello All,
>
> I wanted to thank Roger Binns for his email. He had
> the answer to my issue with writing speed. It
> actually made an incredible change in the performance. I
> didn't have to go all the way to implementing
> synchronous mode (for my app). Previously, I was
> inserting one record at a time. The key was to write
> them all at once. I moved up to a 13 MB file and
> wrote it to the db in seconds. Now the issue is the 120
> MB of RAM consumed by PyParse to read in a 13 MB
> file. If anyone has thoughts on that, it would be
> great. Otherwise, I will repost under a more specific
> subject.
If your data is (or can be) created by an iterator, you can use this recipe to group the data into
batches of whatever size you choose and write the individual batches to the db.
http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/303279
Kent
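A minimal sketch of that approach, assuming SQLite through the standard `sqlite3` module. The `batches` helper groups any iterator into fixed-size chunks in the spirit of the recipe above; the table name `tri` and column `V1_x` are hypothetical stand-ins for Kevin's schema, and the generator stands in for the parser output:

```python
import sqlite3
from itertools import islice

def batches(iterable, size):
    """Yield successive lists of up to `size` items from any iterable."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            break
        yield batch

# `tri` and `V1_x` are hypothetical stand-ins for the real table and column.
conn = sqlite3.connect(":memory:")
conn.execute("create table tri (V1_x real)")

# A generator as the data source: only one batch is ever held in memory,
# which also addresses the RAM problem of building the whole list up front.
data = (float(i) for i in range(10000))
for batch in batches(data, 1000):
    conn.executemany("insert into tri (V1_x) values (?)", ((v,) for v in batch))
conn.commit()
```

Because `sqlite3` opens an implicit transaction on the first insert and holds it until `commit()`, all batches land in a single transaction, which is where the speedup comes from.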
>
> Thanks,
> Kevin
>
>
>
> db.execute("begin")
>
> while i < TriNum:
>     db.execute("""insert into TABLE(V1_x)
>                   values(%f)""" % (data[i],))
>     i = i + 1
>
> db.execute("commit")
>
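A tidier equivalent of the loop above, assuming the `sqlite3` module: use `?` placeholders instead of `%`-formatting values into the SQL, and let the connection's context manager supply the begin/commit pair. The names `tri`, `V1_x`, and the sample values are hypothetical:

```python
import sqlite3

db = sqlite3.connect(":memory:")  # hypothetical connection; use a file path in practice
db.execute("create table tri (V1_x real)")  # `tri`/`V1_x` stand in for the real schema

data = [0.5, 1.5, 2.5]  # stand-in for the parsed values
with db:  # one transaction for the whole insert, like the explicit begin/commit
    db.executemany("insert into tri (V1_x) values (?)", [(v,) for v in data])
```

`executemany` with placeholders avoids both the per-statement overhead of the loop and the quoting problems of building SQL with string formatting.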