[issue1580738] httplib hangs reading too much data
Marcos Dione
report at bugs.python.org
Sat Oct 13 03:36:18 CEST 2007
Marcos Dione added the comment:
Facundo and I were tracking this bug down. mhammond is right that
read() should allow 0-size responses, but the actual bug is that the
response is not closed. HTTPResponse.read() says:
    if amt is None:
        # unbounded read
        if self.length is None:
            s = self.fp.read()
        else:
            s = self._safe_read(self.length)
            self.length = 0
        self.close()        # we read everything
        return s
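For context (this is not part of the patch), _safe_read() in the bounded
branch reads exactly self.length bytes or fails. A minimal sketch of that
behaviour, with the loop and the error paraphrased rather than copied from
httplib (which raises IncompleteRead):

```python
import io

def safe_read(fp, n):
    # keep reading until exactly n bytes have arrived; a short read
    # followed by EOF means the body was truncated
    parts = []
    while n > 0:
        chunk = fp.read(n)
        if not chunk:
            raise ValueError("incomplete read")  # httplib uses IncompleteRead
        parts.append(chunk)
        n -= len(chunk)
    return b"".join(parts)

print(safe_read(io.BytesIO(b"abcdef"), 4))  # -> b'abcd'
```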
Note that it calls self.close(), which really closes the fp created with
makefile() in the constructor. This does not close the underlying
socket, as you can see by running this short example:
import httplib

c = httplib.HTTPConnection('www.python.org', 80)
c.request('GET', '/index.html')
a1 = c.getresponse()
data1 = a1.read()
c.request('GET', '/404.html')
a2 = c.getresponse()
data2 = a2.read()
and run it under strace -e network,file.
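The same two-requests-on-one-connection pattern can be reproduced
self-contained, assuming Python 3's http.client (the successor of
httplib) and a local http.server standing in for www.python.org; the
Handler class, paths, and bodies here are invented for illustration:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # keep-alive, so the socket is reused

    def do_GET(self):
        # serve a tiny body with an explicit Content-Length, so the
        # client's bounded-read branch is exercised
        body = b"hello" if self.path == "/index.html" else b"missing"
        self.send_response(200 if self.path == "/index.html" else 404)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

c = http.client.HTTPConnection("127.0.0.1", server.server_port)
c.request("GET", "/index.html")
a1 = c.getresponse()
data1 = a1.read()               # the first response must be fully read
c.request("GET", "/404.html")   # before the connection can be reused
a2 = c.getresponse()
data2 = a2.read()
c.close()
server.shutdown()
print(data1, data2)
```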
If the last part of read() is changed to the following, read(n) works
just like read() does:
    # we do not use _safe_read() here because this may be a .will_close
    # connection, and the user is reading more bytes than will be
    # provided (for example, reading in 1k chunks)
    s = self.fp.read(amt)
    if self.length is not None:
        self.length -= len(s)
    if len(s) == 0:
        self.close()
    return s
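A minimal sketch of how the patched read(amt) behaves in the 1k-chunk
case the comment mentions, with a plain io.BytesIO standing in for the
socket file object; the FakeResponse name is invented for illustration:

```python
import io

class FakeResponse:
    def __init__(self, payload, length=None):
        self.fp = io.BytesIO(payload)
        self.length = length    # Content-Length, or None if unknown
        self.closed = False

    def close(self):
        self.closed = True

    def read(self, amt):
        # mirrors the proposed patch: no _safe_read(), and close()
        # once an empty read signals the end of the body
        s = self.fp.read(amt)
        if self.length is not None:
            self.length -= len(s)
        if len(s) == 0:
            self.close()
        return s

r = FakeResponse(b"x" * 2500, length=2500)
chunks = []
while True:
    chunk = r.read(1024)        # reading in 1k chunks
    if not chunk:
        break
    chunks.append(chunk)
print(len(b"".join(chunks)), r.closed)
```

The loop terminates on the empty read, at which point the response has
closed itself instead of hanging waiting for more data.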
----------
nosy: +StyXman, facundobatista
_____________________________________
Tracker <report at bugs.python.org>
<http://bugs.python.org/issue1580738>
_____________________________________