[SciPy-User] SciPy.stats.kde.gaussian_kde estimation of information-theoretic measures
Skipper Seabold
jsseabold at gmail.com
Sun Sep 27 17:07:33 EDT 2009
On Thu, Sep 24, 2009 at 7:38 AM, Dan Stowell
<dan.stowell at elec.qmul.ac.uk> wrote:
> Hi -
>
> I'd like to use SciPy.stats.kde.gaussian_kde to estimate
> Kullback-Leibler divergence. In other words, given KDE estimates of two
> different distributions p(x) and q(x) I'd like to evaluate things like
>
> integral of { p(x) log( p(x)/q(x) ) }
>
> Is this possible using gaussian_kde? The method
> kde.integrate_kde(other_kde) gets halfway there. Or if not, are there
> other modules that can do this kind of thing?
>
You should be able to use stats.entropy for this (relative entropy is
another name for the Kullback-Leibler divergence). I would be
interested to hear about your experience, as I've just started to
study this recently.
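One way to approximate the integral in the question directly from two
fitted gaussian_kde objects is a Monte Carlo average of log(p/q) over
draws from p, using the resample and evaluate methods. A minimal
sketch, where the two samples and their distributions are made-up
illustrations:

```python
import numpy as np
from scipy import stats

np.random.seed(0)

# Hypothetical data: samples from two different distributions
xp = np.random.normal(0.0, 1.0, 1000)
xq = np.random.normal(0.5, 1.5, 1000)

p = stats.gaussian_kde(xp)
q = stats.gaussian_kde(xq)

# Monte Carlo estimate of KL(p || q) = E_p[log(p(x)/q(x))]:
# draw from p, then average log(p/q) at those points.
draws = p.resample(5000)
kl = np.mean(np.log(p.evaluate(draws) / q.evaluate(draws)))
```

The estimate should converge on the true divergence as the number of
draws grows, though the KDE smoothing itself biases the result.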
In [1]: from scipy import stats
In [2]: stats.entropy?
Type: function
Base Class: <type 'function'>
String Form: <function entropy at 0x20bbf50>
Namespace: Interactive
File:
/usr/local/lib/python2.6/dist-packages/scipy/stats/distributions.py
Definition: stats.entropy(pk, qk=None)
Docstring:
S = entropy(pk,qk=None)
calculate the entropy of a distribution given the p_k values
S = -sum(pk * log(pk), axis=0)
If qk is not None, then compute a relative entropy
S = sum(pk * log(pk / qk), axis=0)
Routine will normalize pk and qk if they don't sum to 1
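Since stats.entropy normalizes pk and qk, you can feed it the two KDEs
evaluated on a common uniform grid; the grid spacing cancels out in the
normalization. A sketch along those lines, with illustrative data (the
samples, grid range, and grid size are assumptions, not anything from
the original post):

```python
import numpy as np
from scipy import stats

np.random.seed(0)

# Hypothetical samples from two different distributions
xp = np.random.normal(0.0, 1.0, 1000)
xq = np.random.normal(0.5, 1.5, 1000)

p = stats.gaussian_kde(xp)
q = stats.gaussian_kde(xq)

# Evaluate both densities on the same uniform grid; stats.entropy
# normalizes pk and qk, so this approximates the continuous
# integral of p(x) * log(p(x)/q(x)).
grid = np.linspace(-5, 5, 1000)
pk = p.evaluate(grid)
qk = q.evaluate(grid)

kl = stats.entropy(pk, qk)
```

In one dimension this grid-based route is usually cheaper and less
noisy than Monte Carlo sampling, but it scales poorly to higher
dimensions, where sampling becomes the more practical option.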
Skipper