A Numerical Approach to Optimal Sequential Multi-Hypothesis Testing

In this paper we deal with the problem of sequential testing of multiple hypotheses. The main goal is minimising the expected sample size (ESS) under restrictions on the error probabilities. We use a variant of the method of Lagrange multipliers based on the minimisation of an auxiliary objective function (called the Lagrangian), defined as a weighted sum of all the test characteristics of interest: the error probabilities and the ESSs evaluated at some points of interest. The definition of the Lagrangian we adopt involves the ESS evaluated at any finite number of fixed parameter points (not necessarily those representing the hypotheses). We then develop a computer-oriented method for minimising the Lagrangian which, depending on the specific choice of the parameter points, provides optimal tests in different concrete settings, such as the Bayesian, the Kiefer-Weiss and other settings. To exemplify the proposed methods, for the particular case of sampling from a Bernoulli population we develop a set of computer algorithms for designing sequential tests that minimise the Lagrangian and for the numerical evaluation of test characteristics such as the error probabilities, the ESS and other related quantities. For the Bernoulli model, we carry out a series of computer evaluations related to the optimality of sequential multi-hypothesis tests in the particular case of three hypotheses, as well as a numerical comparison with the matrix sequential probability ratio test.
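For orientation, a minimal sketch of such a Lagrangian (with notation introduced here for illustration, not taken from the paper) could look as follows: for hypotheses H_1, ..., H_K, parameter points theta_1, ..., theta_m at which the ESS is evaluated, and a sequential test (psi, phi) with stopping time tau,

\[
L(\psi,\phi)\;=\;\sum_{j=1}^{m} c_j\,\mathrm{E}_{\theta_j}\tau\;+\;\sum_{i=1}^{K}\sum_{k\neq i}\lambda_{ik}\,\alpha_{ik}(\psi,\phi),
\]

where \(\alpha_{ik}(\psi,\phi)\) is the probability that the test accepts H_k when H_i is true, and \(c_j,\lambda_{ik}\ge 0\) are the weights (Lagrange multipliers). Under this reading, taking the \(\theta_j\) equal to the hypothesised parameter values gives a Bayesian-type criterion, while placing a \(\theta_j\) between the hypotheses leads to Kiefer-Weiss-type formulations; the exact weighting used in the paper may differ.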