[OT] Compression of random binary data

Michael Torrie torriem at gmail.com
Wed Jul 13 15:03:34 EDT 2016


On 07/13/2016 03:46 AM, jonas.thornvall at gmail.com wrote:
> It is not that the data is not compressible i just need more chunks
> or random data, it is the footprint of the algorithm that has a
> certain it is a structure afterall albeit richer in interpretation
> than the numerical field.

Err, no, that does not apply to "random."  If the data is truly random
then it does not matter whether you have 5 bytes or 5 GB.  There is no
pattern to discern, and having more chunks of random data won't make it
possible to compress.
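You can see this empirically with the standard library. The sketch below (using os.urandom as a stand-in for "truly random" data, and zlib as the compressor) shows that a general-purpose compressor gains nothing on random bytes no matter how much you feed it, while patterned data of the same size collapses:

```python
import os
import zlib

# "Truly random" data: zlib finds no pattern, so the output is
# essentially the same size as the input (plus a small header).
random_data = os.urandom(100_000)
compressed_random = zlib.compress(random_data, level=9)

# Highly patterned data of the same size, for contrast: it
# compresses to a tiny fraction of its original size.
patterned_data = b"abcd" * 25_000
compressed_patterned = zlib.compress(patterned_data, level=9)

print(len(random_data), len(compressed_random))
print(len(patterned_data), len(compressed_patterned))
```

Running this with 5 bytes or 5 GB of urandom output gives the same result: the "compressed" random data is never meaningfully smaller.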

> You exchange size for computational complexity, but that may be true
> for any compression algorithm.

People are taking issue with your claim that you can compress random
data.  Clearly you cannot: by your own admission you require more
chunks of data (see above), and the irrefutable counting argument
Steven D'Aprano laid out rules it out in principle.
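That counting argument (the pigeonhole principle) can be sketched in a few lines. There are 2**n distinct n-bit inputs, but only 2**n - 1 bit strings of length strictly less than n, so no lossless scheme can map every n-bit input to a shorter output without a collision:

```python
# Pigeonhole sketch: count n-bit inputs vs. all strictly shorter
# bit strings (lengths 0 through n-1).
n = 16
inputs = 2 ** n
shorter_outputs = sum(2 ** k for k in range(n))  # equals 2**n - 1

# One fewer possible output than inputs, so at least two inputs
# must share an output, i.e. the scheme cannot be lossless.
print(inputs, shorter_outputs)
```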

Probably time to mute this thread. It's not related to Python.




More information about the Python-list mailing list