Performance Problems when selecting HUGE amounts of data from MySQL dbs

Roman Yakovenko romany at actimize.com
Wed Jan 9 10:33:10 EST 2002


Why don't you implement your own string type? It's a simple and quick solution.
For example, represent the string as a list of strings, or something along those lines.
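The suggestion above — accumulate the pieces in a list and build the final string only once at the end — can be sketched as follows (a minimal illustration; `build_giant_string` is a hypothetical helper name, not from the original post):

```python
def build_giant_string(records):
    # Appending to a list is cheap; the quadratic cost of repeated
    # string += is avoided because the concatenation happens only
    # once, in the final join.
    parts = []
    for record in records:
        parts.append(record)
    return "".join(parts)
```

When the records are already a plain sequence of strings, this collapses to a single `"".join(records)` call.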

-----Original Message-----
From: Gabriel Ambuehl [mailto:gabriel_ambuehl at buz.ch]
Sent: Wednesday, January 09, 2002 5:26 PM
To: python-list at python.org
Subject: Performance Problems when selecting HUGE amounts of data from
MySQL dbs



Hello,
I need to export HUGE amounts (up to 1 million, but normally around 10000) of,
say, 150-byte fields from a MySQL db, and in the end I need them as
ONE giant concatenated string. Working with mysql-python, the only way I see is

db.execute("SELECT myfield FROM table")
data = db.fetchall()
giantstring = ""
for record in data:
    giantstring += record[0]  # fetchall() returns each row as a tuple

but this is way too slow. Now I'm wondering whether there's some other
approach I haven't been able to find, or some other trick to speed
things up (maybe there's some way to have MySQL concatenate the fields
itself and return only giantstring?).
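The repeated `+=` is quadratic: each step copies everything accumulated so far. A one-pass `join` over the fetched rows avoids that entirely. A minimal sketch (the table and field names are the ones from the question; `concat_field` is a hypothetical helper, and the connection setup is omitted):

```python
def concat_field(rows):
    # rows is what cursor.fetchall() returns: a sequence of row
    # tuples, so the field value is row[0]. One join, one pass.
    return "".join([row[0] for row in rows])

# With a real mysql-python cursor:
#   db.execute("SELECT myfield FROM table")
#   giantstring = concat_field(db.fetchall())
```

As for having MySQL do it server-side: later MySQL versions offer a `GROUP_CONCAT()` aggregate, but its result is truncated at the `group_concat_max_len` session variable (default 1024 bytes), so it needs raising before it is usable for megabyte-sized output.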

An approach to optimizing might be to select only X rows at a time
(using LIMIT) and do multiple selects, but I suspect the real problem is
the concatenation, not the select itself...
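The LIMIT idea can also be approximated client-side with the DB-API `fetchmany()`, pulling rows in batches instead of one giant `fetchall()`. A hedged sketch, demonstrated here against an in-memory sqlite3 table standing in for the MySQL one (MySQLdb cursors implement the same `fetchmany()` interface, and its `SSCursor` additionally streams rows from the server instead of buffering them all):

```python
import sqlite3

def concat_in_chunks(cursor, chunk_size=10000):
    # Pull rows in batches via the DB-API fetchmany(); join each
    # batch, then glue the batches together with one final join.
    parts = []
    while True:
        rows = cursor.fetchmany(chunk_size)
        if not rows:
            break
        parts.append("".join([row[0] for row in rows]))
    return "".join(parts)

# Demo with sqlite3 standing in for MySQL:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (myfield TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("x" * 150,)] * 25)
cur = conn.execute("SELECT myfield FROM t")
giantstring = concat_in_chunks(cur, chunk_size=10)
```

Note that with this join-based approach the chunking mostly helps memory usage; the concatenation itself is already linear, which supports the poster's suspicion that the `+=` loop, not the SELECT, is the bottleneck.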


Any comments, thoughts, tricks would be greatly appreciated.


Best regards,
 Gabriel



-- 
http://mail.python.org/mailman/listinfo/python-list
