Almost Have Threads Working for HTTP Scan

Eddie Corns eddie at holyrood.ed.ac.uk
Thu Jul 1 13:20:52 EDT 2004


Some observations:

Since every worker thread creates another thread for each host, you will
end up with 65,535 threads all active, which seems like overkill to me.

On closer inspection, it's going to be massively skewed towards thread 1,
since it could simply empty the entire url_queue before the others get started.

I presume the network section isn't finished since it's only actually scanning
255 addresses.

Wouldn't it be enough to just try to connect to ports 80, 8080, etc., using
a plain socket?
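
Something like this (untested) would do as the probe; the function name,
the port argument, and the two-second timeout are just my own choices:

import socket

def probe(ip, port=80):
    # A plain TCP connect is enough to see whether anything is
    # listening; no HTTP request needs to be sent.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2.0)         # arbitrary; tune to taste
    try:
        try:
            s.connect((ip, port))
            return 'ok'
        except socket.timeout:
            return 'timeout'
        except socket.error:
            return 'fail'
    finally:
        s.close()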

Why not use separate queues for failures and successes? (Though I'm not
sure what the failures queue gives you anyway.)

As for it hanging, I'm guessing most of the threads are sitting on
   url = url_queue.get()
since a blocking get() will never raise the Queue.Empty exception; it just
waits forever.
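
For instance (an illustrative snippet, not the OP's code):

import Queue

q = Queue.Queue()
try:
    item = q.get_nowait()   # non-blocking: raises Queue.Empty at once
except Queue.Empty:
    print 'queue was empty'
# q.get() would simply block here forever on an empty queue, so the
# worker never reaches its except clause and never exits.
# q.get(True, 5) blocks for at most 5 seconds, then raises Queue.Empty.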

How about a simpler approach: create 256 threads (one per subnet), each of
which scans its own subnet sequentially.

import Queue
import threading

def test_http(subnet):
    for i in range(256):
        ip = '192.168.%d.%d' % (subnet, i)
        x = probe(ip)           # returns one of 'timeout', 'ok', 'fail'
        Qs[x].put(ip)

Qs = {'timeout': Queue.Queue(), 'ok': Queue.Queue(), 'fail': Queue.Queue()}
workers = []
for subnet in range(256):
    w = threading.Thread(target=test_http, args=(subnet,))
    workers.append(w)
    w.start()
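
Then just join the workers at the end; whatever has piled up in Qs['ok']
is your list of live web servers.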


