We consider the asymptotic properties of source code sequences that approach the optimal rate-distortion bound. In this paper we show that, for any source code sequence that approaches the rate-distortion function R(D) of a discrete memoryless source, the empirical conditional distribution of the length-n source sequence given the length-n reconstruction sequence is close to the n-fold product of the unique minimum-mutual-information test-channel conditional distribution. This closeness is measured by the convergence of the normalized conditional divergence. One implication of this result is that arbitrary discrete memoryless channels can be approximated as test channels in source coding. Although our results are presented for stationary discrete memoryless sources, they can be generalized to sources with memory.
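The statement above can be sketched in standard rate-distortion notation; the symbols below (the minimizing test channel $P^{*}_{Y|X}$, its induced reverse channel $P^{*}_{X|Y}$, and the conditional divergence) are our own notational assumptions for illustration, not necessarily those used in the body of the paper. The rate-distortion function of a discrete memoryless source $X \sim P_X$ with distortion measure $d$ is

```latex
R(D) \;=\; \min_{P_{Y|X}\,:\; \mathbb{E}[d(X,Y)] \le D} I(X;Y),
```

and, writing $P^{*}_{X|Y}$ for the source-given-reconstruction conditional distribution induced by the unique minimizer, the claimed closeness of the empirical conditional distribution to its n-fold product takes the form

```latex
\frac{1}{n}\, D\!\left( P_{X^n \mid Y^n} \,\middle\|\, \bigl(P^{*}_{X\mid Y}\bigr)^{\times n} \,\middle|\, P_{Y^n} \right) \;\longrightarrow\; 0
```

as the code rates approach $R(D)$ while the distortion constraint $D$ is met.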