Newbie Question: python mysqldb performance question

cjl cjlesh at gmail.com
Sun May 20 19:55:07 EDT 2007


Group:

I'm new to python and new to mysql.

I have a csv file with about 200,000 rows that I want to add to a
MySQL database.  Yes, I know that I can do this directly from the
mysql command line, but I am doing it through a Python script so that
I can munge the data before adding it.

I have the following example code:

import sys
import csv
import MySQLdb

conn = MySQLdb.connect(db="database", host="localhost", user="root",
                       passwd="password")
c = conn.cursor()

reader = csv.reader(open(sys.argv[1]))
for row in reader:
    # each csv row has exactly four fields
    data1, data2, data3, data4 = row
    data = (data1, data2, data3, data4)
    c.execute("""insert into datatable values (%s, %s, %s, %s)""", data)
    # commit after every single insert
    conn.commit()

This takes a really long time to execute, on the order of minutes.
Directly importing the csv file into mysql using 'load infile' takes
seconds.

What am I doing wrong? What can I do to speed up the operation?
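
One idea I had was that the per-row commit might be the problem, so
maybe something like the following sketch (untested, same table and
connection settings assumed) is the right direction: collect the rows
and insert them with executemany, committing once at the end. Is this
a reasonable approach, or is there a better way?

import sys
import csv
import MySQLdb

conn = MySQLdb.connect(db="database", host="localhost", user="root",
                       passwd="password")
c = conn.cursor()

reader = csv.reader(open(sys.argv[1]))
# collect all rows first (this is where I would munge the data)
rows = [tuple(row) for row in reader]
# one executemany call instead of 200,000 separate execute calls
c.executemany("""insert into datatable values (%s, %s, %s, %s)""", rows)
# commit once at the end instead of once per row
conn.commit()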

Thanks in advance,
cjl



