urllib timeout issues

supercooper supercooper at gmail.com
Tue Mar 27 16:41:44 EDT 2007


On Mar 27, 3:13 pm, "Gabriel Genellina" <gagsl-... at yahoo.com.ar>
wrote:
> On Tue, 27 Mar 2007 16:21:55 -0300, supercooper <supercoo... at gmail.com>
> wrote:
>
> > I am downloading images using the script below. Sometimes it will go
> > for 10 mins, sometimes 2 hours before timing out with the following
> > error:
>
> >     urllib.urlretrieve(fullurl, localfile)
> > IOError: [Errno socket error] (10060, 'Operation timed out')
>
> > I have searched this forum extensively and tried to avoid timing out,
> > but to no avail. Anyone have any ideas as to why I keep getting a
> > timeout? I thought setting the socket timeout did it, but it didn't.
>
> You should do the opposite: time out *early* (not waiting 2 hours) and
> handle the error (maybe using a queue to hold pending requests)
>
> --
> Gabriel Genellina

Gabriel, thanks for the input. So are you saying there is no way to
realistically *prevent* the timeout from occurring in the first
place?  And by timing out early, do you mean setting the timeout to x
seconds and, when the timeout does occur, handling the error and
somehow restarting the process on the pending requests?  Thanks.
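
Just to check my understanding, is something like this what you have
in mind? (A rough sketch only; the 30-second timeout, MAX_RETRIES, and
the 'pending' list of (url, localfile) pairs are placeholders, not
from my actual script.)

import socket
import urllib

# Fail fast instead of letting a stuck connection hang for a long time.
socket.setdefaulttimeout(30)

MAX_RETRIES = 3

def fetch_all(pending):
    # pending is a list of (fullurl, localfile) pairs still to be fetched
    retries = {}
    while pending:
        fullurl, localfile = pending.pop(0)
        try:
            urllib.urlretrieve(fullurl, localfile)
        except IOError, e:   # the socket timeout shows up as IOError here
            count = retries.get(fullurl, 0) + 1
            retries[fullurl] = count
            if count <= MAX_RETRIES:
                # re-queue the failed download and keep going
                pending.append((fullurl, localfile))
            else:
                print 'giving up on', fullurl, e

Is that roughly the idea?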

chad
