Rate-distortion theory is the part of information theory that answers: how many bits are minimally needed to represent a source if we allow some distortion? It is the lossy-compression counterpart of entropy/source-coding theory.

For lossless compression, the fundamental limit is the entropy H(X); for lossy compression, it is the rate-distortion function R(D), where R is the number of bits per source symbol, D is the allowed average distortion, and R(D) is the minimum achievable rate at distortion level D.
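One of the few cases where R(D) has a closed form is a Bernoulli(p) source under Hamming distortion, where R(D) = H_b(p) - H_b(D) for 0 ≤ D < min(p, 1-p) and 0 otherwise (H_b is the binary entropy function). The sketch below, with function names chosen for illustration, evaluates this formula; note that D = 0 recovers the lossless limit H_b(p):

```python
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p, D):
    """R(D) for a Bernoulli(p) source with Hamming distortion.

    R(D) = H_b(p) - H_b(D) for 0 <= D < min(p, 1-p), else 0.
    """
    if D >= min(p, 1 - p):
        # The distortion budget is large enough that zero bits suffice:
        # always reproducing the more likely symbol meets the constraint.
        return 0.0
    return binary_entropy(p) - binary_entropy(D)

# D = 0 gives the lossless limit; allowing distortion lowers the rate.
print(rate_distortion_bernoulli(0.5, 0.0))  # 1.0 bit/symbol
print(rate_distortion_bernoulli(0.5, 0.1))  # ~0.531 bits/symbol
```

As the example shows, tolerating just 10% bit errors on a fair-coin source cuts the required rate roughly in half, which is the essential trade-off R(D) quantifies.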