urllib.open('http://www.redherring.com/')
Andrew Cooke
andrew at intertrader.com
Fri Feb 9 14:14:46 EST 2001
Just a thought (I'm sure you've checked), but if the server needs the socket
closed before it replies, what's happening at your end with persistent
connections and/or content length? If it happened on every site, it would be
pretty convincing evidence that the content length in the request is too
large (IMHO). Maybe their server is broken wrt content length in some way?
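(Illustration, not part of the original exchange: the guess above is that a server which believes the request body is longer than what actually arrived will sit waiting for the missing bytes instead of replying. A minimal sketch of that failure mode, using socket.socketpair as a stand-in for a real connection and a short timeout as a stand-in for "waits forever":)

```python
import socket

# One end plays the client, the other a strict server that trusts
# Content-Length.  socketpair avoids needing any real network.
client, server = socket.socketpair()
server.settimeout(0.5)

# Claim a 10-byte body but send only 5 bytes of it.
client.sendall(b"POST / HTTP/1.0\r\nContent-Length: 10\r\n\r\nhello")

data = server.recv(4096)        # headers + partial body arrive fine
try:
    server.recv(4096)           # server now waits for the missing 5 bytes...
    waited_forever = False
except socket.timeout:          # ...and would block indefinitely
    waited_forever = True

print(waited_forever)
```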
Andrew
Fredrik Lundh wrote:
>
> Andrew Kuchling wrote:
> > I don't believe urllib or httplib were massively rewritten for 2.0
> > (nor will they be in 2.1), so if it's a bug, it's probably still there.
>
> on my box, it sometimes works under 1.5.2, and never
> works under an unmodified 2.0 installation.
>
> however, the following hack works for both versions
> (note the explicit socket shutdown).
>
> import httplib
>
> h = httplib.HTTP("www.redherring.com")
> h.set_debuglevel(1)
> h.putrequest("GET","/")
> h.endheaders()
>
> try:
>     h.sock.shutdown(1)        # 1.5.2
> except AttributeError:
>     h._conn.sock.shutdown(1)  # 2.0
>
> errcode, errmsg, headers = h.getreply()
> f = h.getfile()
> s = f.read()
> f.close()
>
> print "content length: %d" % len(s)
>
> this prints:
>
> connect: (www.redherring.com, 80)
> send: 'GET / HTTP/1.0\015\012'
> send: '\015\012'
> reply: 'HTTP/1.1 200 OK\015\012'
> header: Server: Netscape-Enterprise/4.0
> header: Date: Fri, 09 Feb 2001 16:15:36 GMT
> header: Set-cookie: AnalysisUserId=28177981735337; path=/;
> expires=Friday, 31 Dec-2010 23:59:59 GMT
> header: AnalysisUserId: 28177981735337
> header: Content-type: text/html
> header: Connection: close
>
> Cheers /F
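(Side note, not from the thread itself: the `shutdown(1)` in the hack above is a TCP half-close. It closes only the write side of the socket, so the server sees end-of-stream on the request while the connection stays open for the reply. A self-contained sketch of that behaviour, using socket.socketpair in place of a real HTTP server:)

```python
import socket

client, server = socket.socketpair()

client.sendall(b"GET / HTTP/1.0\r\n\r\n")
client.shutdown(socket.SHUT_WR)   # same as shutdown(1): "no more request data"

# The server reads until EOF -- which it only sees because of the half-close.
request = b""
while True:
    chunk = server.recv(1024)
    if not chunk:                 # EOF: the client closed its write side
        break
    request += chunk

# The reverse direction is still open, so the reply gets through.
server.sendall(b"HTTP/1.0 200 OK\r\n\r\n")
reply = client.recv(1024)
print(request)
print(reply)
```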