| bits? Does entropy have a hard objective definition here, or is it subjective?
Here is another showerthought. Different compression idea.
Every possible sequence of every possible
It is definitely objective in my example. Think of it this way: instead of 100 coin flips, do 2, so in binary: 00, 01, 10, 11. Now try to map that information into 1 bit: 0 or 1.
You can map 0 to 00 and 1 to 01, but then what do you map to 10 and 11? If you knew that 10 and 11 never occur, sure, you would have compressed the data, but you do not know that.
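That pigeonhole argument is small enough to check by brute force. Here is a quick sketch (mine, not from the thread) that enumerates every possible mapping from the four 2-bit messages to the two 1-bit codewords and looks for a lossless one:

```python
from itertools import product

# All 2-bit messages and all 1-bit codewords.
messages = ["".join(bits) for bits in product("01", repeat=2)]  # 00, 01, 10, 11
codewords = ["0", "1"]

# Try every possible assignment of a codeword to each message and check
# whether any assignment is lossless (i.e., injective / uniquely decodable).
lossless_found = False
for mapping in product(codewords, repeat=len(messages)):
    if len(set(mapping)) == len(messages):  # all four codewords distinct
        lossless_found = True

print(lossless_found)  # False: 4 messages cannot fit into 2 codewords
```

With only 2 codewords available, no assignment can keep all 4 messages distinguishable, so every such "compressor" loses information.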
This is proof that you cannot reduce entropy, only change the data format (and often add some entropy). Even a compression algorithm will INCREASE the size of a totally random dataset.
About compression: for a given special dataset it may indeed decrease the size, but if you use the same algorithm over and over on different random datasets, you will spend more space in total than if you had stored them uncompressed.
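This is easy to check empirically. A sketch using Python's standard zlib (my illustration, with an arbitrary choice of 100 trials of 1000 random bytes each):

```python
import os
import zlib

# Empirical check: compress many random datasets and compare total sizes.
n_trials, size = 100, 1000
total_raw = 0
total_compressed = 0
for _ in range(n_trials):
    data = os.urandom(size)  # maximum-entropy input: essentially incompressible
    total_raw += len(data)
    total_compressed += len(zlib.compress(data, 9))

print(total_compressed > total_raw)  # True: random data grows under compression
```

The deflate format has to spend a few bytes of framing per stream, so incompressible input always comes out slightly larger, which is exactly the "spend more space than uncompressed" effect.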
That is, it is a "proof" (not very rigorous or mathematically precise) that you cannot reduce entropy by a whole (information) bit. It could easily be generalized to be more fine-grained.
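One way the fine-grained generalization could go (my sketch, by the same counting argument): of the 2**n strings of length n, fewer than a 2**(1-k) fraction can be losslessly mapped to strings at least k bits shorter, because there simply aren't enough short strings to go around.

```python
# Count how many binary strings are short enough to "save" k or more bits
# when the input has length n, and compare against the 2**(1-k) bound.
n = 20
for k in range(1, 6):
    # Number of strings of length at most n - k (sums to 2**(n-k+1) - 1).
    shorter = sum(2**m for m in range(n - k + 1))
    fraction = shorter / 2**n
    print(k, fraction, fraction < 2 ** (1 - k))
```

So e.g. fewer than half of all inputs can be compressed by even 1 bit, fewer than a quarter by 2 bits, and so on.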