So I'm self-studying information theory, and I have a few doubts about entropy and encoding as a whole. I'm trying to compress a simple sequence of 16-bit signed int values as well as I can.
I learned that entropy is a lower bound on the expected length (in bits) of the coded sequence. However, is that true in an absolute sense? For example, whenever I delta encode (keep the first value, then store the difference between each value and the previous one), I get smaller numbers (which, by the pigeonhole principle, means more values repeat) and a lower entropy. That is, I managed to exploit a property of the data and losslessly encode it below its original entropy.
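Here's a small Python sketch of what I mean (the toy sine signal and the helper names `empirical_entropy` / `delta_encode` are just made up for illustration; the entropy here is the zeroth-order entropy of the observed value frequencies):

```python
import math
from collections import Counter

def empirical_entropy(values):
    """Zeroth-order entropy (bits per symbol) of the observed value frequencies."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def delta_encode(values):
    """Keep the first value, then store each value minus its predecessor."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

# Toy data: a smooth signal, so consecutive samples are close together.
data = [int(1000 * math.sin(i / 50)) for i in range(1000)]
deltas = delta_encode(data)

print("entropy of raw values:  ", empirical_entropy(data))
print("entropy of delta values:", empirical_entropy(deltas))
```

On data like this, the deltas are concentrated in a much smaller range than the raw values, so the per-symbol entropy of the delta sequence comes out noticeably lower.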
My question is: is there a sure way to determine the absolute minimum-entropy encoding for a system? If not, is there maybe an algorithm that searches for it? Is this a solved problem?
Thank you!