HTTP or alternative upload for large files

holger krekel pyth at devel.trillke.net
Tue May 7 11:03:04 EDT 2002


brueckd at tbye.com wrote:
> On Tue, 7 May 2002 robin at execulink.com wrote:
> 
> > This may not be specifically a Python problem, so apologies in
> > advance.
> > 
> > I am attempting to upload large files (think 500 MB) through a web
> > form using a Python CGI process.
> > 
> > It is easy enough for my program to get the file handle of the
> > user-submitted file, and then write this out in chunks to the server.
> > But there are two major problems with this.
> > 
> > 1. Apache first uploads the entire file itself, consuming all
> > available memory, before handing over control to the CGI process. Why
> > it does this is beyond me, since I am not hip to server internals.
> > Apache could at least use a restricted amount of memory to do the
> > upload, but this, unfortunately, is not the case. Bad things happen
> > when memory gets low.
> > 
> > 2. The HTTP connection times out, so the entire file upload is lost.
> > 
> > I imagine the answer to this problem is "don't use HTTP", but I do
> > need a seamless solution from a web application.
> 
> I'm no Apache expert, so hopefully somebody else will tell you how to fix
> this from inside Apache. Barring an Apache solution, have your upload
> form's action point to a different port on the same machine. On the server
> you can write a small app that listens on that different port, accepts the
> POST the way you want, and generates the response page.
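
Such a listener can stay small. A rough, untested sketch using the
standard BaseHTTPServer module (the port, target path and chunk size
are placeholders, and a real handler would still have to parse the
multipart/form-data boundaries, e.g. with cgi.FieldStorage):

    from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler

    class UploadHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            # read the body in fixed-size chunks so memory use stays flat
            remaining = int(self.headers.get('content-length', 0))
            out = open('/tmp/upload.dat', 'wb')  # placeholder target file
            while remaining > 0:
                chunk = self.rfile.read(min(8192, remaining))
                if not chunk:
                    break
                out.write(chunk)
                remaining -= len(chunk)
            out.close()
            # note: the raw body of a multipart form post still contains
            # the MIME boundaries; run it through a multipart parser
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            self.wfile.write('upload received\n')

    HTTPServer(('', 8080), UploadHandler).serve_forever()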

In Apache you can define ProxyPass (use google) to pass
certain parts of your URL namespace to a different
server/port (even a remote one). There sits your Python server,
which dispatches to your CGI script. This way you can
leave the CGI binaries in the same place, make only minimal
modifications to the Apache configuration, and keep full (Python)
control over your uploads.
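
For example (the path and port here are only illustrative, and
mod_proxy has to be enabled):

    ProxyPass        /upload/ http://localhost:8080/upload/
    ProxyPassReverse /upload/ http://localhost:8080/upload/

Everything under /upload/ is then forwarded to the Python server on
port 8080, while the rest of the site is served by Apache as before.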

regards,

    holger




