urllib timeout

kBob krdean at gmail.com
Wed Jul 28 11:11:41 EDT 2010


On Jul 27, 4:56 pm, MRAB <pyt... at mrabarnett.plus.com> wrote:
> kBob wrote:
> > On Jul 27, 4:23 pm, MRAB <pyt... at mrabarnett.plus.com> wrote:
> >> kBob wrote:
>
> >>>  I created a script to access weather satellite imagery from NOAA's
> >>> ADDS.
> >>>  It worked fine until recently with Python 2.6.
> >>>  The company changed the Internet LAN connections to "Accept Automatic
> >>> settings" and "Use automatic configuration script"
> >>>  How do you get urllib.urlopen to use the "automatic script"
> >>> configuration?
> >>>  This code worked recently, until the company implemented these LAN
> >>> connections...
> >>>     SAT_URL = "http://adds.aviationweather.gov/data/satellite/
> >>> latest_BWI_vis.jpg"
> >>>     satpic = urllib.urlopen(SAT_URL, proxies=0 )
> >>>     satimg = satpic.read()
> >> For the record, I got:
>
> >>  >>> import urllib
> >>  >>> SAT_URL =
> >> "http://adds.aviationweather.gov/data/satellite/latest_BWI_vis.jpg"
> >>  >>> satpic = urllib.urlopen(SAT_URL, proxies=0 )
> >> Traceback (most recent call last):
> >>    File "<stdin>", line 1, in <module>
> >>    File "C:\Python26\lib\urllib.py", line 79, in urlopen
> >>      opener = FancyURLopener(proxies=proxies)
> >>    File "C:\Python26\lib\urllib.py", line 617, in __init__
> >>      URLopener.__init__(self, *args, **kwargs)
> >>    File "C:\Python26\lib\urllib.py", line 129, in __init__
> >>      assert hasattr(proxies, 'has_key'), "proxies must be a mapping"
> >> AssertionError: proxies must be a mapping
>
> >> However, urllib.urlretrieve(...) works.
>
> > I saw that, but I still get the same timeout error ...
>
> >>>> import urllib
> >>>> SAT_URL = "http://adds.aviationweather.gov/data/satellite/"
> >>>> SAT_FILE = "latest_BWI_vis.jpg"
> >>>> satimg = urllib.urlretrieve( SAT_URL, SAT_FILE )
> > Traceback (most recent call last):
> >   File "<stdin>", line 1, in <module>
> >   File "c:\python26\lib\urllib.py", line 93, in urlretrieve
> >     return _urlopener.retrieve(url, filename, reporthook, data)
> >   File "c:\python26\lib\urllib.py", line 237, in retrieve
> >     fp = self.open(url, data)
> >   File "c:\python26\lib\urllib.py", line 205, in open
> >     return getattr(self, name)(url)
> >   File "c:\python26\lib\urllib.py", line 344, in open_http
> >     h.endheaders()
> >   File "c:\python26\lib\httplib.py", line 904, in endheaders
> >     self._send_output()
> >   File "c:\python26\lib\httplib.py", line 776, in _send_output
> >     self.send(msg)
> >   File "c:\python26\lib\httplib.py", line 735, in send
> >     self.connect()
> >   File "c:\python26\lib\httplib.py", line 716, in connect
> >     self.timeout)
> >   File "c:\python26\lib\socket.py", line 514, in create_connection
> >     raise error, msg
> > IOError: [Errno socket error] [Errno 10060] A connection attempt failed
> > because the connected party did not properly respond after a period of
> > time, or established connection failed because connected host has failed
> > to respond
>
> It should be like this:
>
> SAT_URL =
> "http://adds.aviationweather.gov/data/satellite/latest_BWI_vis.jpg"
> SAT_FILE = r"C:\latest_BWI_vis.jpg"
> urllib.urlretrieve(SAT_URL, SAT_FILE)

It doesn't matter; the same error 10060 appears.
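
As a side note, while testing I shorten the global socket timeout so
the 10060 shows up after a few seconds instead of waiting out the
default connect timeout -- quicker feedback, but not a fix:

    import socket
    socket.setdefaulttimeout(10)   # seconds; urllib's new connections pick this up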

The connection problem has to do with the proxy settings.

In order for me to use Internet Explorer, the LAN settings must have
"Use automatic configuration script" turned on, pointing at a script on
the company's proxy server. I was wondering how to get urllib.urlopen
to use that same script.
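
From what I can tell, urllib won't read that automatic configuration
(PAC) script itself, so the workaround I'm experimenting with is to
look up the proxy host and port the script hands out for plain http and
pass them to urlopen as a mapping (which is what the "proxies must be a
mapping" assertion earlier in the thread is asking for). The proxy
address here is only a placeholder for whatever the script actually
returns:

    import urllib

    # Placeholder -- substitute the host:port named in the company's
    # automatic configuration script.
    proxies = {'http': 'http://proxy.example.com:8080'}

    SAT_URL = "http://adds.aviationweather.gov/data/satellite/latest_BWI_vis.jpg"
    satpic = urllib.urlopen(SAT_URL, proxies=proxies)
    satimg = satpic.read()
    satpic.close()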

Thanks for your help.

Kelly
