Extracting data from dump file

TYR a.harrowell at gmail.com
Fri Nov 23 13:46:12 EST 2007


I have a large dump file that originated in a MySQL db; I need to get
it into an SQLite file.

Various options are suggested around the web; none of them seem to
work (most fail to import the file in the first place). So I removed
the assorted taggery from each end, leaving just a big text file in
the following format:

('value', 'value', 'value', 'value'),
('value','value','value','value')...

I planned to find some way of splitting the thing at the commas
outside the bracketed groups, giving me a list of tuples; then I could
of course CREATE TABLE foo and iterate through the list, INSERTing
each tuple's values ('1', '2', '3', '4') INTO it.
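
Roughly what I had in mind, as a simplified sketch (the file and table
names are placeholders, it assumes four columns, and it assumes no
value ever contains the "),(" or "','" sequences; escaped quotes would
break it):

import re
import sqlite3

# Read the cleaned-up dump, drop any trailing semicolon and the
# outermost parentheses.  "dump.txt" and "out.db" are made-up names.
with open("dump.txt") as f:
    data = f.read().strip().rstrip(";").strip("()")

rows = []
# Split between the bracketed groups, i.e. at "),(" with optional spaces.
for group in re.split(r"\)\s*,\s*\(", data):
    # Split at the quoted commas and strip the leftover outer quotes.
    values = [v.strip().strip("'") for v in re.split(r"'\s*,\s*'", group)]
    rows.append(values)

conn = sqlite3.connect("out.db")
conn.execute("CREATE TABLE foo (c1, c2, c3, c4)")
conn.executemany("INSERT INTO foo VALUES (?, ?, ?, ?)", rows)
conn.commit()
conn.close()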

Then my problems began. I tried the Python csv module, replacing the
string ),( with \t and then using \t as the delimiter. First problem:
there's a size limit coded into the module. No problem, use
csv.field_size_limit() to raise it. Next problem: it doesn't actually
parse at all, it just hands the whole thing back as one string, and the
SQL INSERT fails with a "not enough args" error.
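
The attempt looked roughly like this (reconstructed and simplified; the
filename is a placeholder):

import csv

csv.field_size_limit(10 ** 7)          # lift the module's field size limit

with open("dump.txt") as f:            # placeholder filename
    text = f.read().replace("),(", "\t")

for row in csv.reader([text], delimiter="\t"):
    # I expected one value per field here; instead everything came back
    # in one piece, and the later INSERT complained about its arguments.
    print(len(row))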

I then tried string.split() and re.split(data, r'\t'); the first gave
the same error, and the second failed with a "too many named groups"
error. I also tried using ; as the delimiter and going back to csv;
this fails to match the ; for some reason. Any ideas?
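
For reference, those calls looked roughly like this (reconstructed,
with placeholder variable names):

import re

fields = data.split("\t")             # led to the same "not enough args" failure
fields = re.split(data, r"\t")        # raised "too many named groups"

# re.split() takes the pattern first and the string second, so written
# this way round it tries to compile the whole dump as a regular
# expression, which may be where the "named groups" error comes from.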


