urllib.urlopen blocking all threads for seconds when no connection or exception?

Robert k.robert at gmx.de
Thu Dec 12 09:17:32 EST 2002


Running the script below on Windows (XP), when the computer has no
connection to the site, urlopen blocks ALL other threads for several
seconds, which is quite unpleasant, as shown in the output below. When
the connection is there, everything seems to run smoothly and
seamlessly, no matter how time-consuming the urlopen or read is.

Is there a solution to this problem? (A small timing sketch follows the
output below.)

Robert

------ source

import time, threading, urllib

def thread1():
    # Fetch the page every few seconds and report whether it worked.
    try:
        while run:
            time.sleep(4)
            try:
                f = urllib.urlopen("http://www.google.com")
                f.read()
                print "good"
            except:
                print "open error"
    finally:
        print "term thread1"

def thread2():
    # Print a timestamp five times a second; gaps in this output show
    # when the thread was blocked.
    while run:
        time.sleep(0.2)
        print ". %.1f" % time.time()

run = 1
threading.Thread(target=thread1).start()
threading.Thread(target=thread2).start()
while 1: time.sleep(1)

------ output
...
. 1039701645.0
. 1039701645.2
. 1039701645.5
. 1039701645.7
. 1039701645.9
. 1039701646.1
. 1039701646.3
. 1039701648.7
open error
. 1039701648.9
. 1039701649.1
. 1039701649.3
. 1039701649.5
...
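
------ timing sketch

The stall is visible above where the dots jump from ...646.3 to
...648.7, i.e. a gap of a couple of seconds. Below is a rough, untested
variant of the two thread functions (the names timed_fetch and watchdog
are just placeholders of mine) that prints how long each urlopen+read
takes and reports whenever the printing thread's 0.2 s sleep actually
took much longer:

import time, threading, urllib

def timed_fetch():
    # Like thread1, but also reports how long the urlopen call itself takes.
    while run:
        time.sleep(4)
        t0 = time.time()
        try:
            urllib.urlopen("http://www.google.com").read()
            status = "good"
        except:
            status = "open error"
        print "%s (urlopen+read took %.1f s)" % (status, time.time() - t0)

def watchdog():
    # Like thread2, but only complains when a 0.2 s sleep actually took
    # much longer, i.e. when this thread was starved.
    last = time.time()
    while run:
        time.sleep(0.2)
        now = time.time()
        if now - last > 0.7:
            print "thread starved for %.1f s" % (now - last)
        last = now

run = 1
threading.Thread(target=timed_fetch).start()
threading.Thread(target=watchdog).start()
while 1: time.sleep(1)

If every multi-second "urlopen+read took" line is matched by a
comparable "thread starved" report, the whole delay is being spent
inside the single urlopen call while the other threads are locked out,
rather than being spread across the interpreter.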



