A problem while using urllib
Alex Martelli
aleax at mail.comcast.net
Tue Oct 11 07:41:17 EDT 2005
Johnny Lee <johnnyandfiona at hotmail.com> wrote:
...
> try:
>     webPage = urllib2.urlopen(url)
> except urllib2.URLError:
...
>     webPage.close()
>     return True
> ----------------------------------------------------
>
> But every time the program reaches the 70th to 75th URL (that is, after
> 70-75 URLs have been tested this way), it crashes, and every URL left
> raises urllib2.URLError until the program exits. I tried many ways to
> work around it: using urllib instead, putting a sleep(1) in the filter
> (I thought the sheer number of URLs was crashing the program). But none
> of them works. BTW, if I set the URL at which the program crashed as the
> base URL, the program still crashes at the 70th-75th URL. How can I
> solve this problem? Thanks for your help.
Sure looks like a resource leak somewhere (probably leaving a file open
until your program hits some wall of maximum simultaneously open files),
but I can't reproduce it here (MacOSX, tried both Python 2.3.5 and
2.4.1). What version of Python are you using, and on what platform?
Perhaps a simple Python upgrade will fix your problem...
Alex
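[Editorial note: the leak Alex describes typically comes from responses that
are never closed when an exception escapes before `webPage.close()` runs.
A minimal sketch of a leak-safe checker, written for modern Python (where
`urllib.request` replaced `urllib2`); the `opener` parameter is a
hypothetical injection point added here purely so the pattern can be
demonstrated without network access:]

```python
import contextlib
import urllib.request
from urllib.error import URLError

def url_is_alive(url, opener=urllib.request.urlopen):
    """Return True if `url` can be fetched, False on URLError.

    contextlib.closing guarantees the response handle is closed on
    every path -- success, URLError, or a failure during read() --
    so repeated calls cannot exhaust the open-file limit.
    """
    try:
        with contextlib.closing(opener(url)) as page:
            page.read()
    except URLError:
        return False
    return True
```

The same effect can be had in old Python 2 code with a `try`/`finally`
that calls `webPage.close()` unconditionally; the key point is that the
close must happen even when `urlopen` succeeds but a later operation raises.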