The source coding theorem states that information can be losslessly compressed down to, but not below, its Shannon entropy: if the compressed code rate is smaller than the Shannon entropy, information loss is unavoidable. In mathematical terms, the minimum number of bits needed to represent the compressed data is $ NH(X) $, where $ N $ is the number of symbols (assumed to be sufficiently large), $ H(X) $ is the entropy, and $ X $ is the information source.
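As a minimal sketch of the bound, the snippet below computes $ H(X) $ for a hypothetical four-symbol source (the probabilities are illustrative, not from the text) and evaluates the resulting lossless-compression limit $ NH(X) $:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical source: four symbols with these probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)   # 1.75 bits per symbol
N = 10_000           # number of symbols; the theorem assumes N is large

print(f"H(X) = {H} bits/symbol")
print(f"Minimum lossless size ~ N*H(X) = {N * H:.0f} bits")
```

Any code whose average rate dips below 1.75 bits per symbol for this source must lose information; conversely, rates arbitrarily close to $ H(X) $ are achievable as $ N $ grows.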