[SciPy-Dev] Entropy Calculations

Matt Haberland mhaberla at calpoly.edu
Wed Jul 29 16:23:06 EDT 2020


Thanks for letting us know. Can you send a reference for the algorithms you
implemented? I didn't see any in a quick look through the notebook and code.

Also, I see that this uses Numba, but we don't have that as a dependency
yet. How important is that speedup? How essential are the other
dependencies - cocos, cubature, quadpy?

---------- Forwarded message ---------
From: Michael Nowotny <nowotnym at gmail.com>
Date: Fri, Jul 24, 2020 at 7:00 PM
Subject: [SciPy-Dev] Entropy Calculations
To: <scipy-dev at python.org>


Dear SciPy developers,

I have noticed that the statistical functions for the calculation of
entropy and KL divergence currently only support discrete distributions for
which the probability mass function is known. I recently needed to compute
various information theoretic measures from samples of distributions and
created the package `Divergence`. It offers functionality for entropy,
cross entropy, relative entropy, Jensen-Shannon divergence, joint entropy,
conditional entropy, and mutual information and is available on GitHub at
https://github.com/michaelnowotny/divergence. It supports samples from both
discrete and continuous distributions. Continuous distributions are
handled via numerical integration of kernel density estimates generated
from the sample. I would be happy to contribute some or all of its
functionality to SciPy. Please let me know if you are interested.
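
To illustrate the idea (this is only a minimal sketch of the approach described
above, not code from the Divergence package), differential entropy can be
estimated from a one-dimensional sample by fitting a Gaussian KDE and
integrating numerically with tools already in SciPy; the function name and
integration bounds below are placeholders for illustration:

    import numpy as np
    from scipy import integrate, stats

    def sample_entropy(sample, lower, upper):
        """Estimate H(X) = -integral p(x) log p(x) dx from a 1-D sample."""
        kde = stats.gaussian_kde(sample)

        def integrand(x):
            p = kde(x)[0]
            return -p * np.log(p) if p > 0 else 0.0

        value, _ = integrate.quad(integrand, lower, upper)
        return value

    # Example: a standard normal sample; the exact differential entropy
    # is 0.5 * log(2 * pi * e) ~= 1.4189.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000)
    print(sample_entropy(x, -10.0, 10.0))

The other divergences (cross entropy, KL divergence, and so on) follow the same
pattern, with the integrand built from two kernel density estimates instead of one.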

Thank you,

Michael
_______________________________________________
SciPy-Dev mailing list
SciPy-Dev at python.org
https://mail.python.org/mailman/listinfo/scipy-dev


-- 
Matt Haberland
Assistant Professor
BioResource and Agricultural Engineering
08A-3K, Cal Poly

