Logging from a multiprocess application

Mark Betz betz.mark at gmail.com
Thu Feb 6 14:45:29 EST 2014


On Thursday, February 6, 2014 1:24:17 PM UTC-5, Joseph L. Casale wrote:
> I have a module that has one operation that benefits greatly from being multiprocessed.
> It's a console-based module, and as such I have a stream handler and filter associated with
> the console. Obviously the mp-based instances need special handling, so I have been
> experimenting with a socket server in a thread so that the rest of the application can
> carry on.
> 
> How have others tackled this problem? The portion of the code made to use multiprocessing
> cannot be switched to threading, as it performs worse than simply serializing each task.
> 
> Thanks,
> jlc
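
For what it's worth, the stdlib's QueueHandler/QueueListener pair (Python 3.2+)
gives you the listener-thread-in-the-main-process pattern you're describing
without writing the socket server yourself. A rough, untested sketch with
made-up names:

import logging
import logging.handlers
import multiprocessing


def worker(queue, n):
    # Workers forward their records to the main process via the queue;
    # no console handler or filter is configured in the children.
    logger = logging.getLogger('myapp.worker')
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.info('task %d done', n)


if __name__ == '__main__':
    queue = multiprocessing.Queue()

    # The main process keeps its usual console handler/filter and drains
    # the queue in a background thread via QueueListener.
    console = logging.StreamHandler()
    console.setFormatter(
        logging.Formatter('%(processName)s %(levelname)s %(message)s'))
    listener = logging.handlers.QueueListener(queue, console)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue, n))
             for n in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    listener.stop()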

Maybe check out Logstash (http://logstash.net/); it's part of Elasticsearch now. Using the python-logstash module (https://pypi.python.org/pypi/python-logstash) you can either log directly to the Logstash daemon with the provided handler and event formatter, or do as I did and set up a Redis server to host a list. Logstash ships with a redis input plugin that will BLPOP a list to get events.

I use the formatter from the python-logstash module to format the events and write them into Redis as JSON. Logstash picks them up from Redis and indexes them into its embedded Elasticsearch instance, and you can then connect to the embedded Kibana web server to view a dashboard of event information. It's pretty cool and doesn't take long to set up.
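
In case it helps, a rough sketch of that Redis approach (untested as pasted here;
the handler class and the 'logstash' key are placeholders, and the formatter
class name is from the python-logstash version I have, so check yours):

import logging

import redis     # pip install redis
import logstash  # pip install python-logstash


class RedisListHandler(logging.Handler):
    """Push formatted records onto a Redis list for Logstash's redis
    input (data_type => "list") to BLPOP."""

    def __init__(self, host='localhost', port=6379, key='logstash'):
        logging.Handler.__init__(self)
        self.key = key
        self.client = redis.StrictRedis(host=host, port=port)

    def emit(self, record):
        try:
            # The python-logstash formatter renders the record as a
            # Logstash-compatible JSON document.
            self.client.rpush(self.key, self.format(record))
        except Exception:
            self.handleError(record)


logger = logging.getLogger('myapp')
logger.setLevel(logging.INFO)

handler = RedisListHandler(key='logstash')
handler.setFormatter(logstash.LogstashFormatterVersion1())
logger.addHandler(handler)

logger.info('event from a worker process')

On the Logstash side the matching input is the redis plugin with
data_type => "list" and the same key. If you'd rather skip Redis, the package
also ships handlers that send straight to a Logstash tcp/udp input.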


