Slow loading of large in-memory tables

Skip Montanaro skip at pobox.com
Tue Sep 7 09:36:45 EDT 2004


    Ph> I am trying to load a relatively large table (about 1 million rows)
    Ph> into an sqlite table, which is kept in memory. The load process is
    Ph> very slow - on the order of 15 minutes or so.

    Ph> I am accessing sqlite from Python, using the pysqlite driver.  I am
    Ph> loading all records first using cx.execute("insert ...").

I've never used SQLite, but I have used a number of other relational
databases.  All of the ones I've encountered have had some sort of "bulk
load" statement that is faster than a long series of individual inserts.
The input file in these cases is generally a TAB- or comma-delimited file.
These statements are always extensions to SQL (and therefore predictably
differ for each RDBMS).  SQLite may have such a statement.
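
For example, if memory serves, the sqlite command-line shell has an
".import" dot-command (a shell extension, not SQL proper) that slurps a
delimited file straight into a table.  Something along these lines, where
the file and table names are made up, and the separator is overridden
because the shell's default is "|":

    $ sqlite3 test.db
    sqlite> .separator ","
    sqlite> .import data.csv mytable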

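Failing that, you can often recover most of the speed from within Python
by doing the whole load in a single transaction and calling executemany()
instead of a million separate execute() calls.  A rough, untested sketch -
the module name and paramstyle vary between pysqlite versions, and
"data.csv"/"mytable" are placeholders:

    import csv
    import sqlite    # pysqlite 1.x; later versions spell this differently

    cx = sqlite.connect(":memory:")
    cu = cx.cursor()
    cu.execute("create table mytable (a, b, c)")

    # Read the delimited file up front, then hand the whole batch to
    # executemany().  A single commit() at the end means the entire load
    # runs in one transaction instead of paying per-statement transaction
    # overhead a million times.
    rows = list(csv.reader(open("data.csv")))
    cu.executemany("insert into mytable (a, b, c) values (%s, %s, %s)",
                   rows)
    cx.commit()

(%s is pysqlite 1.x's paramstyle; pysqlite2 uses ? instead.)
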
-- 
Skip Montanaro
Got spam? http://www.spambayes.org/
skip at pobox.com


