Using a proxy with urllib2

Jack nospam at invalid.com
Fri Jan 11 12:57:18 EST 2008


Rob,

I tried your code snippet and it worked great. I'm just wondering: is
the getopener() call lightweight enough that I can simply call it on
every call to fetchurl()? Or should I share one opener object among
fetchurl() calls?
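
For concreteness, the "shared opener" option I have in mind looks
roughly like this (just a sketch, reusing the getopener() and
fetchurl() helpers from your snippet below; the second URL is only an
example):

<code>
# build the opener once up front (proxy address from your example)
opener = getopener('127.0.0.1:8081')

urls = ['http://www.python.org', 'http://docs.python.org']
for url in urls:
    # every request goes through the same opener, so the proxy
    # handler is constructed only once instead of per call
    print fetchurl(url, opener)
</code>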

Thanks,
Jack


"Rob Wolfe" <rw at smsnet.pl> wrote in message 
news:87ir21o8sj.fsf at merkury.smsnet.pl...
> Try this:
>
> <code>
> import urllib2
>
> def getopener(proxy=None):
>     # build an opener; add a ProxyHandler only when a proxy is given
>     opener = urllib2.build_opener(urllib2.HTTPHandler)
>     if proxy:
>         proxy_support = urllib2.ProxyHandler({"http": "http://" + proxy})
>         opener.add_handler(proxy_support)
>     return opener
>
> def fetchurl(url, opener):
>     # fetch the URL through the given opener and return the body
>     f = opener.open(url)
>     data = f.read()
>     f.close()
>     return data
>
> print fetchurl('http://www.python.org', getopener('127.0.0.1:8081'))
> </code>
>
> HTH,
> Rob 




