[SciPy-Dev] Entropy Calculations

Michael Nowotny nowotnym at gmail.com
Fri Jul 24 22:00:38 EDT 2020


Dear SciPy developers,

I have noticed that the statistical functions for computing entropy and KL divergence currently only support discrete distributions whose probability mass function is known. I recently needed to compute various information-theoretic measures from samples of distributions and created the package `Divergence`, available on GitHub at https://github.com/michaelnowotny/divergence. It offers functions for entropy, cross entropy, relative entropy (KL divergence), Jensen-Shannon divergence, joint entropy, conditional entropy, and mutual information, and it supports samples from both discrete and continuous distributions. Continuous distributions are implemented via numerical integration of kernel density estimates built from the sample.

I would be happy to contribute some or all of this functionality to SciPy. Please let me know if you are interested.
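
For concreteness, here is a minimal sketch of the KDE-plus-quadrature approach for differential entropy, using scipy.stats.gaussian_kde and scipy.integrate.quad. The function name, integration bounds, and bandwidth choice below are illustrative only and do not reflect the package's actual API:

import numpy as np
from scipy.integrate import quad
from scipy.stats import gaussian_kde

def sample_entropy(sample):
    """Estimate the differential entropy H(X) = -integral p(x) log p(x) dx
    of a 1-D sample by numerically integrating a Gaussian KDE of the data."""
    kde = gaussian_kde(sample)
    # Integrate over the sample range padded by three standard deviations,
    # which captures essentially all of the estimated density mass.
    lo = sample.min() - 3 * sample.std()
    hi = sample.max() + 3 * sample.std()

    def integrand(x):
        p = kde(x)[0]
        return -p * np.log(p) if p > 0.0 else 0.0

    h, _ = quad(integrand, lo, hi)
    return h

# Sanity check: the entropy of a standard normal is
# 0.5 * log(2 * pi * e) ~= 1.4189.
rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(10_000)))

The same estimate-then-integrate pattern extends naturally to the two-sample measures (cross entropy, KL divergence, and so on) by fitting one KDE per sample and integrating the appropriate functional of the two densities.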

Thank you,

Michael