In the literature, two different notions of common information were defined by G\'acs and K\"orner and by Wyner, respectively. In this paper, we define a generalized common information, the information-correlation function, by exploiting conditional maximal correlation as a measure of commonness or privacy; it encompasses the G\'acs-K\"orner and Wyner common informations as special cases. Furthermore, we study the problems of common information extraction and private source synthesis, and show that the information-correlation function is the optimal rate under a given conditional maximal correlation constraint for the centralized versions of these problems. As a byproduct, we also derive some properties of conditional maximal correlation.
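For background, the (unconditional) Hirschfeld-Gebelein-R\'enyi maximal correlation underlying the measure above can be computed for finite-alphabet sources as the second-largest singular value of the normalized joint distribution matrix. The sketch below is a minimal illustration of that standard fact, not the paper's construction; it assumes a discrete joint distribution with full-support marginals, and the function name is our own.

```python
import numpy as np

def maximal_correlation(p_xy):
    """HGR maximal correlation of a discrete pair (X, Y).

    p_xy: 2-D array, p_xy[i, j] = P(X = i, Y = j), entries sum to 1,
    with strictly positive marginals (full support assumed).
    """
    p_x = p_xy.sum(axis=1)  # marginal of X
    p_y = p_xy.sum(axis=0)  # marginal of Y
    # Normalized matrix B[i, j] = p(i, j) / sqrt(p(i) p(j));
    # its largest singular value is always 1, and the second one
    # equals the maximal correlation.
    B = p_xy / np.sqrt(np.outer(p_x, p_y))
    s = np.linalg.svd(B, compute_uv=False)
    return s[1]
```

For fully correlated binary sources (probability mass 1/2 on each of (0,0) and (1,1)) the value is 1, while for independent sources it is 0, matching the intuition that maximal correlation quantifies residual dependence.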