improving performance of python webserver running python scripts in cgi-bin
Mike Meyer
mwm at mired.org
Fri Jan 11 13:02:20 EST 2008
On Thu, 10 Jan 2008 23:17:28 -0800 (PST) Dale <dalebryan1 at gmail.com> wrote:
> I am using a simple python webserver (see code below) to serve up
> python scripts located in my cgi-bin directory.
>
> import BaseHTTPServer
> import CGIHTTPServer
> class Handler(CGIHTTPServer.CGIHTTPRequestHandler):
>     cgi_directories = ['/cgi-bin']
> httpd = BaseHTTPServer.HTTPServer(('',8000), Handler)
> httpd.serve_forever()
>
>
> This works fine, but now I would like to combine the python scripts
> into the server program to eliminate starting the python interpreter
> on each script call. I am new to python, and was wondering if there
> is a better technique that will be faster.
You can use BaseHTTPRequestHandler and override do_GET to handle the
actual request.
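A minimal sketch of that idea, assuming a single `/hello` path just for illustration (the thread's Python 2 modules BaseHTTPServer/CGIHTTPServer were merged into `http.server` in later Pythons, which is the name used here):

```python
# Sketch: handle requests in-process instead of spawning a new Python
# interpreter for each CGI script. The '/hello' path and response body
# are made-up examples; dispatch on self.path however you like.
# (In Python 2, import these classes from BaseHTTPServer instead.)
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/hello':
            body = b"Hello from the server process\n"
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.send_header('Content-Length', str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# To run it the same way as the original script:
#     HTTPServer(('', 8000), Handler).serve_forever()
```

Any work the CGI scripts used to do now runs inside `do_GET` in the long-lived server process, so there is no interpreter startup cost per request.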
> Also, can someone recommend an alternative approach to
> httpd.serve_forever(). I would like to perform other python functions
> (read a serial port, write to an Ethernet port, write to a file, etc.)
> inside the web server program above. Is there an example of how to
> modify the code for an event loop style of operation where the program
> mostly performs my python I/O functions until an HTTP request comes
> in, and then it breaks out of the I/O operations to handle the HTTP
> request?
Use poll/select/whatever to check if httpd.socket has data available,
and then invoke httpd.handle_request() when it does. Nothing else will
happen while the request is being handled.
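One possible shape for that loop, with `do_other_io()` standing in as a placeholder for the serial/Ethernet/file work mentioned above (again using the newer `http.server` name for the Python 2 BaseHTTPServer classes):

```python
# Sketch: poll the listening socket with select() so the main loop can
# interleave other I/O with HTTP handling. do_other_io() is a placeholder
# for your own work; the timeout bounds how long a request can wait.
import select
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Length', '3')
        self.end_headers()
        self.wfile.write(b"ok\n")

def do_other_io():
    pass  # placeholder: read serial port, write to file, etc.

def run(httpd, iterations=None):
    n = 0
    while iterations is None or n < iterations:
        # Wait up to 0.1s for a connection to arrive.
        ready, _, _ = select.select([httpd.socket], [], [], 0.1)
        if ready:
            httpd.handle_request()  # blocks only while serving this request
        else:
            do_other_io()
        n += 1
```

The timeout is a trade-off: a longer one wastes less CPU on polling, a shorter one keeps your other I/O running more often while a request is pending.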
The suggestion to check the various frameworks for doing async work is
a good one as well.
<mike
--
Mike Meyer <mwm at mired.org> http://www.mired.org/consulting.html
Independent Network/Unix/Perforce consultant, email for more information.