Jensen's inequality is ubiquitous in measure and probability theory, statistics, machine learning, information theory and many other areas of mathematics and data science. It states that, for any convex function $f$ on a convex domain $K$ and any random variable $X$ taking values in $K$, $f(\mathbb{E}[X]) \le \mathbb{E}[f(X)]$. In this paper, sharp upper and lower bounds on $\mathbb{E}[f(X)]$, termed "graph convex hull bounds", are derived for arbitrary functions $f$ on arbitrary domains $K$, thereby strongly generalizing Jensen's inequality. Establishing these bounds requires the investigation of the convex hull of the graph of $f$, which can be difficult for complicated $f$. On the other hand, once these inequalities are established, they hold, just like Jensen's inequality, for any random variable $X$. Hence, these bounds are of particular interest in cases where $f$ is fairly simple and $X$ is complicated or unknown. Both finite- and infinite-dimensional domains and codomains of $f$ are covered, as well as analogous bounds for conditional expectations and Markov operators.
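As a quick numerical sanity check of the classical inequality generalized here, the sketch below (illustrative only, not code from the paper) draws samples of a random variable $X$ and verifies $f(\mathbb{E}[X]) \le \mathbb{E}[f(X)]$ for the convex function $f(x) = x^2$:

```python
import random

# Convex test function: f(x) = x^2.
f = lambda x: x * x

# Monte Carlo samples of a random variable X (Gaussian chosen arbitrarily).
random.seed(0)
samples = [random.gauss(1.0, 2.0) for _ in range(100_000)]

# Empirical E[X] and E[f(X)].
mean_x = sum(samples) / len(samples)
mean_fx = sum(f(x) for x in samples) / len(samples)

# Jensen's inequality: f(E[X]) <= E[f(X)].
print(f(mean_x) <= mean_fx)  # → True
```

For $f(x) = x^2$ the gap $\mathbb{E}[f(X)] - f(\mathbb{E}[X])$ is exactly the variance of $X$, so the check holds for any sample; the bounds derived in the paper control this gap for arbitrary, possibly non-convex $f$.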