simple httplib and urllib timeout question

musingattheruins at my-deja.com
Thu Apr 13 12:31:49 EDT 2000


> > Just a quick question: I'm writing a small webbot as a Python
> > learning experience, and it comes across some sites that make it
> > hang for a while if the site is down or extremely slow.  Is there
> > a way to set a faster timeout for httplib, or is there a better
> > way to handle it?

I had the same problem and would get back exceptions saying a timeout
had occurred.  I used a retry loop as a work-around:

import httplib

class Page:
    "Page: class to get a page from the web."

    def get(self, host, page):
        """
        Fetch a page, retrying on failure.

            p = Page()
            text = p.get(host, page)
        """
        for i in range(3):
            try:
                http = httplib.HTTP(host)
                http.putrequest("GET", page)
                http.putheader("Accept", "text/html")
                http.putheader("Accept", "text/plain")
                http.endheaders()
                errcode, errmsg, headers = http.getreply()
                fp = http.getfile()
                sData = fp.read() or ""
                fp.close()
                del http
                if len(sData) > 0:
                    return sData
            except:
                # Slow or dead site -- swallow the error and try again.
                pass
        return ""

This attempts the fetch up to three times, trying again whenever the
returned data is empty or an exception is raised; if all three attempts
fail it simply returns an empty string.
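
If what you really want is a hard timeout rather than a retry, the other
route is to cut things off at the socket layer.  This is only a rough
sketch, and it assumes a Python that has socket.setdefaulttimeout()
(newer than the interpreter most of this thread is running); the
10-second limit and the example URL are just placeholders:

# Sketch only: a real timeout at the socket level instead of a retry loop.
# Assumes socket.setdefaulttimeout() is available; the URL and the
# 10-second value are placeholders.
import socket
import urllib

socket.setdefaulttimeout(10)    # every new socket gives up after 10 seconds

try:
    data = urllib.urlopen("http://www.example.com/").read()
except (IOError, socket.error):
    data = ""                   # timed out, refused, unreachable, etc.

The catch is that setdefaulttimeout() is process-wide, so it affects
every socket the program opens, not just this one request.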

L8R :-)




