In this paper, we study the limit properties of nonhomogeneous Markov information sources and obtain a limit theorem for the averages of functions of two variables of these sources. We introduce the notion of average random conditional entropy and prove that the relative entropy densities of nonhomogeneous Markov information sources are asymptotically equal to their average random conditional entropies. Finally, we prove the asymptotic equipartition property (AEP) for a class of nonhomogeneous Markov information sources, which extends the Shannon theorem; a source coding theorem follows immediately from the AEP.
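The asymptotic equality between the relative entropy density and the entropy rate can be illustrated numerically. The sketch below is only a hedged toy example, not the paper's construction: it uses a homogeneous two-state Markov chain (a special case of the nonhomogeneous sources studied here), with an illustrative transition matrix `P` and stationary distribution `pi` chosen for the example; it compares the sample relative entropy density $-\frac{1}{n}\log p(x_1,\dots,x_n)$ with the average conditional entropy.

```python
import math
import random

# Illustrative two-state chain (assumed parameters, not from the paper).
P = [[0.9, 0.1],   # P[i][j] = Pr(next state = j | current state = i)
     [0.4, 0.6]]
pi = [0.8, 0.2]    # stationary distribution: pi @ P == pi

def entropy_rate(P, pi):
    """Average conditional entropy H(X_{n+1} | X_n) in nats."""
    return -sum(pi[i] * P[i][j] * math.log(P[i][j])
                for i in range(2) for j in range(2))

def relative_entropy_density(n, seed=0):
    """-(1/n) log p(X_1, ..., X_n) along one simulated sample path."""
    rng = random.Random(seed)
    x = 0 if rng.random() < pi[0] else 1
    logp = math.log(pi[x])
    for _ in range(n - 1):
        y = 0 if rng.random() < P[x][0] else 1
        logp += math.log(P[x][y])
        x = y
    return -logp / n

h = entropy_rate(P, pi)
f = relative_entropy_density(200_000)
print(h, f)  # the two values are close for large n, as the AEP predicts
```

For long sample paths the relative entropy density concentrates near the entropy rate, which is the homogeneous analogue of the asymptotic equality proved in the paper.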