Arimoto-Rényi Conditional Entropy and Bayesian $M$-ary Hypothesis Testing

8 January 2017
I. Sason
S. Verdú
arXiv:1701.01974
Abstract

This paper gives upper and lower bounds on the minimum error probability of Bayesian $M$-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order $\alpha$. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy ($\alpha = 1$) is demonstrated. In particular, in the case where $M$ is finite, we show how to generalize Fano's inequality under both the conventional and list-decision settings. As a counterpart to the generalized Fano's inequality, allowing $M$ to be infinite, a lower bound on the Arimoto-Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are obtained as a function of the Arimoto-Rényi conditional entropy for both positive and negative $\alpha$. Furthermore, we give upper bounds on the minimum error probability as functions of the Rényi divergence and the Chernoff information. In the setup of discrete memoryless channels, we analyze the exponentially vanishing decay of the Arimoto-Rényi conditional entropy of the transmitted codeword given the channel output when averaged over a random coding ensemble.
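For context, the central quantity in the abstract is Arimoto's version of the Rényi conditional entropy. The display below recalls its standard definition from the literature (not quoted from the paper), together with the classical Fano inequality at $\alpha = 1$ that the paper generalizes:

```latex
% Arimoto-Rényi conditional entropy of order α (standard definition, in nats);
% the limit α → 1 recovers the Shannon conditional entropy H(X|Y).
\[
  H_\alpha(X \mid Y) \;=\; \frac{\alpha}{1-\alpha}\,
  \log \mathbb{E}_Y\!\left[\left(\sum_{x} P_{X\mid Y}^{\,\alpha}(x \mid Y)\right)^{\!1/\alpha}\right],
  \qquad \alpha \in (0,1) \cup (1,\infty).
\]
% Classical Fano inequality (the α = 1 baseline that the paper tightens),
% with ε the minimum error probability and h(·) the binary entropy function:
\[
  H(X \mid Y) \;\le\; h(\varepsilon) + \varepsilon \log (M - 1).
\]
```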

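As a small illustration (a minimal numerical sketch, not code from the paper; the joint pmf below is hypothetical), one can compute the Arimoto-Rényi conditional entropy and the Bayesian minimum error probability for a toy $M = 3$ problem and check the classical Fano inequality at $\alpha = 1$:

```python
# Toy numerical sketch (not from the paper): compute the Arimoto-Rényi
# conditional entropy H_α(X|Y) and the Bayesian minimum (MAP) error
# probability ε for a small joint pmf, then compare H(X|Y) against the
# classical Fano bound h(ε) + ε log(M-1) at α = 1.
import numpy as np

def arimoto_renyi_cond_entropy(p_xy: np.ndarray, alpha: float) -> float:
    """H_α(X|Y) in nats for a joint pmf p_xy[x, y], using Arimoto's definition:
    H_α(X|Y) = α/(1-α) * log Σ_y P_Y(y) * ||P_{X|Y}(·|y)||_α for α ≠ 1,
    where ||p||_α = (Σ_x p(x)^α)^{1/α}; α = 1 recovers Shannon H(X|Y)."""
    p_y = p_xy.sum(axis=0)
    p_x_given_y = p_xy / p_y  # columns are conditional pmfs P_{X|Y}(·|y)
    if np.isclose(alpha, 1.0):  # Shannon conditional entropy (limit α → 1)
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(p_xy > 0, p_xy * np.log(p_x_given_y), 0.0)
        return -terms.sum()
    norms = (p_x_given_y ** alpha).sum(axis=0) ** (1.0 / alpha)
    return alpha / (1.0 - alpha) * np.log(np.dot(p_y, norms))

def min_error_probability(p_xy: np.ndarray) -> float:
    """Bayesian minimum error probability: ε = 1 - Σ_y max_x p_xy[x, y]."""
    return 1.0 - p_xy.max(axis=0).sum()

# Hypothetical 3-hypothesis example (M = 3) with a binary observation Y.
p_xy = np.array([[0.30, 0.05],
                 [0.10, 0.25],
                 [0.05, 0.25]])
assert np.isclose(p_xy.sum(), 1.0)

eps = min_error_probability(p_xy)
M = p_xy.shape[0]
h_b = -eps * np.log(eps) - (1 - eps) * np.log(1 - eps)  # binary entropy, 0 < ε < 1
fano_rhs = h_b + eps * np.log(M - 1)

print(f"ε (MAP error)                = {eps:.4f}")
print(f"H(X|Y) at α = 1              = {arimoto_renyi_cond_entropy(p_xy, 1.0):.4f} nats")
print(f"Fano bound h(ε) + ε log(M-1) = {fano_rhs:.4f} nats")
for a in (0.5, 2.0, 4.0):
    print(f"H_{a}(X|Y)                   = {arimoto_renyi_cond_entropy(p_xy, a):.4f} nats")
```

For this particular pmf the script reports ε = 0.45 and H(X|Y) ≈ 0.896 nats, below the Fano bound of ≈ 1.000 nats. Since H_α(X|Y) is non-increasing in α, the order α is the extra degree of freedom the paper exploits to obtain bounds tighter than the Shannon-entropy ones.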