Is crawling the stack "bad"? Why?
Russell Warren
russandheather at gmail.com
Tue Feb 26 00:47:31 EST 2008
Thanks Ian... I didn't know about threading.local before but have been
experimenting and it will likely come in quite handy in the future.
For this particular case it does basically seem like a replacement for
the threadID-indexed dictionary, though. That is, I'll still need to set
up the RpcContainer, custom request handler, and custom server in
order to get the info handed around properly. I will likely go with
this approach since it lets me customize other aspects at the same
time, but for client IP determination alone I still half think that
the stack crawler is cleaner.
Still no convincing argument for why crawling the stack is considered
bad? I had hoped to come out of this with an argument that would
stick with me...
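For context, the kind of stack crawl I mean is roughly this: walk up the calling frames with sys._getframe until you hit a frame whose local "self" is the request handler, then pull the client info off it. A minimal standalone sketch of the technique (the Handler class and find_caller_of_type helper here are hypothetical stand-ins, not the real XML-RPC classes):

```python
import sys

def find_caller_of_type(cls):
    """Walk up the call stack looking for a frame whose local
    'self' is an instance of cls; return that instance or None."""
    frame = sys._getframe(1)  # start at our caller's frame
    while frame is not None:
        candidate = frame.f_locals.get("self")
        if isinstance(candidate, cls):
            return candidate
        frame = frame.f_back
    return None

class Handler(object):
    """Stand-in for a request handler somewhere up the call stack."""
    client_ip = "10.0.0.5"
    def dispatch(self):
        return rpc_method()  # the RPC method gets no handler reference...

def rpc_method():
    # ...so it crawls the stack to find the handler that called it
    handler = find_caller_of_type(Handler)
    return handler.client_ip

print(Handler().dispatch())  # prints 10.0.0.5
```

The appeal is that the RPC method needs no plumbing at all; the cost is a hidden dependency on who happens to be in the call chain.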
On Feb 25, 12:30 pm, Ian Clark <icl... at mail.ewu.edu> wrote:
> On 2008-02-25, Russell Warren <russandheat... at gmail.com> wrote:
>
> >> the threading.local class seems defined for that purpose, not that I've ever
> >> used it ;)
>
> > I hadn't heard of that... it seems very useful, but in this case I
> > think it just saves me the trouble of making a stash dictionary...
> > unless successive calls to threading.local return the same instance?
> > I'll have to try that, too.
>
> No, successive calls to threading.local() will return different objects.
> So, you call it once to get your 'data store' and then use that one
> object from all your threads. It takes care of making sure each thread
> gets its own data.
>
> Here is your example, but using threading.local instead of your own
> version of it. :)
>
> Ian
>
> import xmlrpclib, threading, sys, thread
> from SimpleXMLRPCServer import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler
>
> thread_data = threading.local()
>
> class RpcContainer(object):
>     def __init__(self):
>         self._Handlers = {} #keys = thread IDs, values = requestHandlers
>     def _GetRpcClientIP(self):
>         #connection = self._Handlers[thread.get_ident()].connection
>         connection = thread_data.request.connection
>         ip = connection.getpeername()[0]
>         return ip
>     def WhatIsMyIP(self):
>         return "Your IP is: %s" % self._GetRpcClientIP()
>
> class ThreadCapableRequestHandler(SimpleXMLRPCRequestHandler):
>     def do_POST(self, *args, **kwargs):
>         #make the handler available to the RPCs via thread-local storage...
>         thread_data.request = self
>         SimpleXMLRPCRequestHandler.do_POST(self, *args, **kwargs)
>
> class MyXMLRPCServer(SimpleXMLRPCServer):
>     def __init__(self, RpcContainer, *args, **kwargs):
>         self.RpcContainer = RpcContainer
>         SimpleXMLRPCServer.__init__(self, *args, **kwargs)
>
> class DaemonicServerLaunchThread(threading.Thread):
>     def __init__(self, RpcServer, **kwargs):
>         threading.Thread.__init__(self, **kwargs)
>         self.setDaemon(1)
>         self.server = RpcServer
>     def run(self):
>         self.server.serve_forever()
>
> container = RpcContainer()
> rpcServer = MyXMLRPCServer(
>     RpcContainer = container,
>     addr = ("", 12390),
>     requestHandler = ThreadCapableRequestHandler,
>     logRequests = False)
> rpcServer.register_function(container.WhatIsMyIP)
> slt = DaemonicServerLaunchThread(rpcServer)
> slt.start()
>
> sp = xmlrpclib.ServerProxy("http://localhost:12390")
> print sp.WhatIsMyIP()
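As an aside, a quick standalone check of the threading.local behaviour Ian describes: successive calls return distinct objects, and attributes set on one instance in one thread are invisible to other threads. This demo is independent of the XML-RPC example above:

```python
import threading

local_a = threading.local()
local_b = threading.local()

# Successive calls return distinct objects...
assert local_a is not local_b

# ...and attributes set in one thread are invisible to others.
local_a.value = "main"

results = {}

def worker():
    # This thread sees a fresh local_a with no 'value' attribute set.
    results["worker_sees_value"] = hasattr(local_a, "value")
    local_a.value = "worker"
    results["worker_value"] = local_a.value

t = threading.Thread(target=worker)
t.start()
t.join()

print(results["worker_sees_value"])  # False: 'value' was set in the main thread only
print(local_a.value)                 # still "main": the worker's write didn't leak
```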