urllib hangs
Benjamin Niemann
pink at odahoda.de
Mon Aug 23 13:48:06 EDT 2004
Jay Donnell wrote:
> This is a basic version of my code:
>
>     for url in urls:
>         fp = urllib.urlopen(url)
>         lines = fp.readlines()
>
>         #print lines
>         for line in lines:
>             #print line
>             if reUrl.search(line):
>                 print 'found'
>                 return 1
>         else:
>             print 'not found'
>             return 0
> This hangs occasionally for certain URLs. If I do a Ctrl-C
> (Linux) it will move on to the next URL. How can I get this to time out
> and move on to the next URL?
Since Python 2.3 there's socket.setdefaulttimeout(); this should do the job:
import socket
socket.setdefaulttimeout(10)
# raise a socket.timeout exception after 10s;
# the default is to wait indefinitely (or at least a very, very long time...)
For older Python versions, ask Google for timeoutsocket.py.
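A self-contained sketch of the pattern: set the default timeout, catch socket.timeout, and move on to the next address. The silent local server below is an assumption used to simulate a hanging web server, and the one-second timeout is just to keep the demo short; in the real loop you would call urlopen on each URL instead.

```python
import socket

socket.setdefaulttimeout(1)  # demo value; the post suggests 10s

# A local "server" that accepts TCP connections but never replies,
# simulating the hanging web servers described above.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
addrs = [server.getsockname()]  # stand-in for the list of URLs

results = []
for host, port in addrs:
    try:
        # create_connection inherits the default timeout set above
        conn = socket.create_connection((host, port))
        conn.recv(1024)  # blocks until the timeout fires
        results.append("read ok")
    except socket.timeout:
        results.append("timed out")  # give up and move on to the next one
        continue

print(results)  # ['timed out']
```

The same try/except around urllib.urlopen/readlines lets the original loop skip a stuck URL instead of hanging until Ctrl-C.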