PySQLite Speed
Gerhard Haering
gh at ghaering.de
Fri Dec 3 03:40:22 EST 2004
On Thu, Dec 02, 2004 at 08:39:31PM -0800, Kevin wrote:
> Hello All,
>
> I wanted to thank Roger Binn for his email. He had
> the answer to my issue with writing speed. It
> actually made an incredible change in the performance. I
> didn't have to go all the way to implementing
> synchronous mode (for my app). Previously, I was
> inserting one record at a time. The key was to write
> them all at one time. I moved up to a 13 meg file and
> wrote it to the db in seconds. Now the issue is the 120
> meg of RAM consumed by PyParse to read in a 13 meg
> file. If anyone has thoughts on that, it would be
> great. Otherwise, I will repost under a more specific
> email.
>
> Thanks,
> Kevin
>
>
>
> db.execute("begin")
>
> while i < TriNum:
>     db.execute("""insert into TABLE(V1_x)
>                   values(%f)""" % (data[i],))
>     i = i + 1
If you're using pysqlite 2.0alpha, then .executemany() will boost performance
*a lot*. For pysqlite 1.x, unfortunately, it won't make any difference. But
generally, .executemany() is a good idea.
Also note that the preferred way of using transactions is to let the DB-API
adapter BEGIN the connection for you, then invoke .commit() on the connection
object.
Sending BEGIN/ROLLBACK/COMMIT via .execute() is bad.
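Putting both points together, a minimal sketch using the standard library's sqlite3 module (the descendant of pysqlite 2.x) might look like this; the table and column names are taken from Kevin's snippet and are otherwise illustrative:

```python
import sqlite3

# Illustrative table modeled on the quoted snippet.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vertices (V1_x REAL)")

data = [0.1, 0.2, 0.3]  # placeholder for the real vertex data

# .executemany() sends all rows in one batch instead of looping over
# .execute(); the DB-API adapter issues BEGIN for you, so there is no
# need to execute "begin" by hand.
conn.executemany("INSERT INTO vertices (V1_x) VALUES (?)",
                 [(x,) for x in data])

# Commit on the connection object rather than executing "COMMIT".
conn.commit()
```

Note the `?` parameter placeholder instead of `%` string formatting: it lets the adapter handle quoting and lets SQLite reuse the prepared statement across rows.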
-- Gerhard
More information about the Python-list mailing list