Compression of random binary data

Steve D'Aprano steve+python at pearwood.info
Sun Oct 29 08:27:01 EDT 2017


On Sun, 29 Oct 2017 01:56 pm, Stefan Ram wrote:

>   If the entropy of an individual message is not defined,
>   then it is still available to be defined. I define it
>   to be log2(1/p), where p is the probability of this
>   message. I also choose a unit for it, which I call "bit".

That is exactly the definition of self-information:

https://en.wikipedia.org/wiki/Self-information
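
For concreteness, here's a quick sketch of that definition in Python (the
function name is mine, not from any library):

import math

def self_information(p):
    """Self-information, in bits, of a message with probability p."""
    return math.log2(1 / p)

# A fair coin flip carries exactly one bit:
print(self_information(0.5))   # 1.0

# Rarer messages carry more bits:
print(self_information(0.01))  # ~6.64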

See also:

https://en.wikipedia.org/wiki/Entropy_(information_theory)

which lists several related measures of information:

- the self-information of an individual message or symbol taken from a 
  given probability distribution;

- the entropy of a given probability distribution of messages or symbols
  (sketched just after this list);

- the entropy rate of a stochastic process.
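
The entropy of a distribution is just the expected self-information of its
messages. A minimal sketch (again my own function, not a library API):

import math

def entropy(dist):
    """Shannon entropy, in bits, of a probability distribution,
    i.e. the expected self-information over its messages."""
    return sum(p * math.log2(1 / p) for p in dist if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits -- more predictable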


It also suggests a connection between information entropy and thermodynamic
entropy: the information entropy of a system is the amount of additional
information needed to determine its microstate (the states of all its
particles), given the macrostate (described by bulk thermodynamic parameters
such as temperature, volume and energy).
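
As a toy illustration of that correspondence (numbers and names are mine): a
macrostate compatible with Omega equally likely microstates takes log2(Omega)
bits of additional information to pin down, and the Boltzmann entropy
S = k_B ln(Omega) is that same count rescaled by k_B ln 2 joules per kelvin
per bit.

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def bits_to_pin_down(num_microstates):
    """Bits of additional information needed to single out one of
    `num_microstates` equally likely microstates."""
    return math.log2(num_microstates)

def boltzmann_entropy(num_microstates):
    """Thermodynamic entropy S = k_B * ln(Omega), in J/K."""
    return K_B * math.log(num_microstates)

omega = 2 ** 100  # toy system: 100 independent two-state particles
print(bits_to_pin_down(omega))   # 100.0 bits
print(boltzmann_entropy(omega))  # ~9.57e-22 J/K  (= 100 * K_B * ln 2)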

More here: 

https://physics.stackexchange.com/questions/263197/is-information-entropy-the-same-as-thermodynamic-entropy

-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.



