writing to file very slow

Moritz Lennert mlennert at club.worldonline.be
Wed Mar 26 09:54:20 EST 2003


Hello,

I have written a CGI script that uses PyGreSQL to query a PostgreSQL
database based on the input from an HTML form. The results of the
query are written to a file for the user to download.

Everything seems to work fine, except that writing the file is very slow
(for example, about 4 minutes for 4,000 lines). I have even had cases where
the program stopped writing the file before reaching the end of the
results, then launched the same query again and started writing a new
file, which "crashed" again, triggering yet another query, and so on.

I use the following modules:

import cgi
import os
import _pg
import string
import tempfile

And here is the relevant function for writing the file:

def fichier_resultats(results):
  tfilename = tempfile.mktemp('rec.txt')
  f = open(tfilename, 'w')

  # header line: field names separated by '|'
  varnames = ""
  for z in range(len(results.listfields()) - 1):
    varnames += str(results.fieldname(z)) + '|'
  varnames += str(results.fieldname(len(results.listfields()) - 1))
  f.write(varnames)
  f.write("\n")

  # one line per tuple, field values separated by '|'
  for x in range(results.ntuples()):
    var = ""
    for y in range(len(results.listfields()) - 1):
      var += str(results.getresult()[x][y]) + '|'
    var += str(results.getresult()[x][len(results.listfields()) - 1])
    f.write(var)
    f.write('\n')

  f.close()
  return tfilename

So what is wrong? Is there a limit to the size of a pgqueryobject in
PyGreSQL? Is my writing routine not very efficient? Should I use another
module to connect to PostgreSQL?
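
In case it clarifies what I mean by "efficient": here is a rough, untested
sketch of how I imagine the routine could be rewritten so that getresult()
is called only once instead of once per field (I don't know yet whether
that is really where the time goes):

def fichier_resultats2(results):
  # untested sketch: fetch all tuples a single time and join fields with '|'
  tfilename = tempfile.mktemp('rec.txt')
  f = open(tfilename, 'w')

  # header line with the field names
  f.write(string.join(map(str, results.listfields()), '|'))
  f.write('\n')

  rows = results.getresult()    # all tuples, fetched once
  for row in rows:
    f.write(string.join(map(str, row), '|'))
    f.write('\n')

  f.close()
  return tfilename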

Thanks in advance for your help!

Moritz
