Question RE urllib

Jeff James jeff at jeffljames.com
Tue Dec 17 10:26:25 EST 2013


So I'm using the following script to check our sites and make sure they are
all up, and some of them are reporting as "down" when they are in fact up.
These sites do not require a logon in order for the home page to come up.
Could this be due to some port being blocked internally?  Only one of the
sites reporting as down is "https", but all are internal sites.  Is there
some other component I should be including in the script?  There are about
30 or 40 sites listed in all; the ones in the following script are just
examples.  Thanks

import urllib

sites = ["http://www.amazon.com/", "https://internalsite.com/intranet.html",
etc.]

for site in sites:
    try:
        urllib.urlopen(site)
        print site + " "
    except Exception, e:
        print site + " is down"


I've never used urllib, although I've done a fair amount of network
programming at lower levels.

Are you sure the report of "down" isn't simply a timeout due to the server
being busier than you expect when you hit it?

-Bill
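
If a timeout is the suspect: Python 2's urllib.urlopen takes no timeout
argument, but a process-wide socket timeout makes a slow or unresponsive
server fail quickly instead of hanging.  A rough sketch, assuming Python 2
and a 10-second limit:

import socket
import urllib

# applies to every new socket, including the ones urllib opens;
# urllib.urlopen itself has no timeout parameter in Python 2
socket.setdefaulttimeout(10)

try:
    urllib.urlopen("http://www.amazon.com/")
    print "responded within 10 seconds"
except IOError, e:
    # urllib reports socket-level failures (including timeouts) as IOError
    print "failed or timed out:", e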

After adding the line suggested by Larry, I was able to determine that the
URLs reporting as "down" were actually sites requiring authentication in
order to provide site content, so adding that line to the handler was at
least enlightening in that respect.  Thanks Larry.
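
(Larry's suggested line isn't quoted here, but it evidently exposed the
authentication failures.)  For the password-protected sites, the check can
supply credentials instead of treating the challenge as an outage.  A sketch,
assuming the sites use HTTP Basic auth and Python 2's urllib2; the URL,
username, and password below are placeholders:

import urllib2

# placeholder URL and credentials -- substitute the real internal values
url = "https://internalsite.com/intranet.html"
passwords = urllib2.HTTPPasswordMgrWithDefaultRealm()
passwords.add_password(None, url, "monitor_user", "monitor_password")
opener = urllib2.build_opener(urllib2.HTTPBasicAuthHandler(passwords))

try:
    opener.open(url, timeout=10)
    print url + " is up"
except urllib2.HTTPError, e:
    # 401/403 here means the credentials were rejected, not that the site is down
    print url + " returned HTTP " + str(e.code)
except Exception, e:
    print url + " is down: " + str(e)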

