Asymptotic Coupling and Its Applications in Information Theory

A coupling of two distributions $P_X$ and $P_Y$ is a joint distribution $P_{XY}$ with marginal distributions equal to $P_X$ and $P_Y$. Given marginals $P_X$ and $P_Y$ and a real-valued function $f$ of the joint distribution $P_{XY}$, what is its minimum over all couplings $P_{XY}$ of $P_X$ and $P_Y$? We study the asymptotics of such coupling problems with different $f$'s and with $X$ and $Y$ replaced by $X^n=(X_1,\ldots,X_n)$ and $Y^n=(Y_1,\ldots,Y_n)$, where $X_i$ and $Y_i$ are i.i.d.\ copies of random variables $X$ and $Y$ with distributions $P_X$ and $P_Y$, respectively. These include the maximal coupling, minimum distance coupling, maximal guessing coupling, and minimum entropy coupling problems. We characterize the limiting values of these coupling problems as $n$ tends to infinity. We show that they typically converge at least exponentially fast to their limits. Moreover, for the problems of maximal coupling and minimum excess-distance probability coupling, we also characterize (or bound) the optimal convergence rates (exponents). Furthermore, we show that the maximal guessing coupling problem is equivalent to the distribution approximation problem; hence, some existing results for the latter problem can be used to derive the asymptotics of the former. We also study the asymptotics of the maximal guessing coupling problem for two \emph{general} sources, as well as a generalization of this problem, named the \emph{maximal guessing coupling through a channel problem}. We apply the preceding results to several new information-theoretic problems, including exact intrinsic randomness, exact resolvability, channel capacity with an input distribution constraint, and perfect stealth and secrecy communication.
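To make the first of these problems concrete, the following is a minimal sketch (not taken from the paper) of the classical maximal-coupling construction for finite alphabets, under which $\max_{P_{XY}} \Pr(X=Y) = 1 - d_{\mathrm{TV}}(P_X, P_Y)$. The function name `maximal_coupling` and the use of NumPy are illustrative choices, not part of the paper.

```python
# Sketch: classical maximal coupling of two pmfs on a common finite alphabet.
# Shared mass min(p[x], q[x]) is placed on the diagonal, so P(X = Y) attains
# its maximum value 1 - d_TV(p, q).
import numpy as np

def maximal_coupling(p, q):
    """Return (joint pmf over (x, y) maximizing P(X = Y), d_TV(p, q))."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    n = len(p)
    joint = np.zeros((n, n))
    overlap = np.minimum(p, q)            # shared mass, placed on the diagonal
    np.fill_diagonal(joint, overlap)
    tv = 1.0 - overlap.sum()              # total variation distance d_TV(p, q)
    if tv > 0:
        # Distribute the residual mass off the diagonal, independently.
        # For each x, at most one of the residuals is nonzero, so the outer
        # product contributes nothing to the diagonal.
        res_p = (p - overlap) / tv
        res_q = (q - overlap) / tv
        joint += tv * np.outer(res_p, res_q)
    return joint, tv

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
joint, tv = maximal_coupling(p, q)
print(joint.sum(axis=1), joint.sum(axis=0))  # marginals recover p and q
print(np.trace(joint), 1.0 - tv)             # P(X = Y) = 1 - d_TV = 0.9
```

The paper studies the $n$-letter versions of such problems (with $P_{X^n}$ and $P_{Y^n}$ in place of $P_X$ and $P_Y$) and their convergence rates; the sketch above only illustrates the single-letter maximal coupling.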