urllib timeout issues
Gabriel Genellina
gagsl-py2 at yahoo.com.ar
Tue Mar 27 16:13:30 EDT 2007
On Tue, 27 Mar 2007 16:21:55 -0300, supercooper <supercooper at gmail.com>
wrote:
> I am downloading images using the script below. Sometimes it will go
> for 10 mins, sometimes 2 hours before timing out with the following
> error:
>
> urllib.urlretrieve(fullurl, localfile)
> IOError: [Errno socket error] (10060, 'Operation timed out')
>
> I have searched this forum extensively and tried to avoid timing out,
> but to no avail. Anyone have any ideas as to why I keep getting a
> timeout? I thought setting the socket timeout did it, but it didn't.
You should do the opposite: time out *early* (instead of waiting 2 hours)
and handle the error, perhaps using a queue to hold pending requests.
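A minimal sketch of that idea: set a short default socket timeout so failures
surface quickly, and keep a queue of pending URLs so a timed-out download is
retried later instead of killing the whole run. The `fetch` callable and the
attempt limit are illustrative, not part of the original script:

```python
import socket
from collections import deque

# Fail fast: 30 seconds instead of hanging for hours (assumed value).
socket.setdefaulttimeout(30)

def download_all(urls, fetch, max_attempts=3):
    """Try each URL; re-queue failures until max_attempts is reached.

    Returns the list of URLs that still failed after all retries.
    """
    pending = deque((url, 0) for url in urls)
    failed = []
    while pending:
        url, attempts = pending.popleft()
        try:
            fetch(url)  # e.g. urllib.urlretrieve(url, localfile)
        except IOError:
            if attempts + 1 < max_attempts:
                pending.append((url, attempts + 1))  # retry later
            else:
                failed.append(url)
    return failed
```

With this shape a transient timeout only delays one image; anything that
keeps failing ends up in the returned list for inspection.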
--
Gabriel Genellina