A Discrepancy Bound for a Deterministic Acceptance-Rejection Sampler
We consider an acceptance-rejection sampler based on a deterministic driver sequence. The deterministic sequence is chosen such that the discrepancy between the empirical distribution of the sample and the target distribution is small; we use quasi-Monte Carlo (QMC) point sets for this purpose. Empirical evidence shows convergence rates beyond the crude Monte Carlo rate of N^{-1/2}. We prove an upper bound on the discrepancy of samples generated by the QMC acceptance-rejection sampler. A complementary lower bound shows that for any given driver sequence, there always exists a target density for which the star discrepancy cannot converge faster than a certain rate. For a general density whose domain is the real state space, the inverse Rosenblatt transformation can be used to convert samples from the multidimensional unit cube to that state space. We show that this transformation is measure preserving. In this way, under certain conditions, we obtain the same convergence rate for a general target density defined on the real state space. Moreover, we consider a deterministic reduced acceptance-rejection algorithm recently introduced by Barekat and Caflisch [F. Barekat and R. Caflisch, Simulation with Fluctuation and Singular Rates, arXiv:1310.4555 [math.NA], 2013].
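To make the idea of a deterministically driven acceptance-rejection sampler concrete, here is a minimal sketch. It is not the construction from the paper: it uses the two-dimensional Hammersley point set (regular grid paired with the base-2 radical inverse) as the driver sequence and a toy target density f(x) = 2x on [0, 1], both chosen here purely for illustration.

```python
def van_der_corput(n, base=2):
    # Radical inverse of the integer n in the given base.
    q, bk = 0.0, 1.0 / base
    while n > 0:
        q += (n % base) * bk
        n //= base
        bk /= base
    return q

def qmc_accept_reject(f, f_max, num_points):
    """Deterministic acceptance-rejection driven by the 2-D Hammersley set.

    Accepts the first coordinate x = n/N whenever the second (radical-inverse)
    coordinate falls below the normalized target density f(x)/f_max.
    """
    accepted = []
    for n in range(num_points):
        x = n / num_points        # first Hammersley coordinate
        y = van_der_corput(n)     # second Hammersley coordinate
        if y <= f(x) / f_max:
            accepted.append(x)
    return accepted

# Toy target density on [0, 1]: f(x) = 2x, with CDF F(x) = x^2.
sample = sorted(qmc_accept_reject(lambda x: 2.0 * x, 2.0, 4096))

# Crude proxy for the star discrepancy: sup |empirical CDF - F| on a grid.
disc = max(
    abs(sum(s <= t for s in sample) / len(sample) - t * t)
    for t in [i / 100 for i in range(1, 101)]
)
```

Because the driver points are evenly spread rather than random, the empirical distribution of the accepted points tracks the target CDF closely even for moderate N; the observed `disc` decays as N grows, which is the behavior the paper's discrepancy bounds quantify.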
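The inverse Rosenblatt transformation mentioned above maps the unit cube to a general state space by applying inverse conditional CDFs coordinate by coordinate. A minimal two-dimensional sketch, using a hypothetical target (not from the paper) with X1 ~ Exp(1) and X2 | X1 ~ Exp(rate = X1), and a simple low-discrepancy point set as input:

```python
import math

def van_der_corput(n, base=2):
    # Radical inverse of the integer n in the given base.
    q, bk = 0.0, 1.0 / base
    while n > 0:
        q += (n % base) * bk
        n //= base
        bk /= base
    return q

def inverse_rosenblatt(u1, u2):
    """Map a point of the unit square to R^2 via sequential inverse CDFs
    for the illustrative target X1 ~ Exp(1), X2 | X1 ~ Exp(rate=X1)."""
    x1 = -math.log1p(-u1)        # inverse CDF of Exp(1)
    x2 = -math.log1p(-u2) / x1   # inverse conditional CDF of Exp(x1) given x1
    return x1, x2

# Transform a shifted-grid / radical-inverse point set from [0,1)^2 to R^2.
N = 4096
points = [inverse_rosenblatt((n + 0.5) / N, van_der_corput(n)) for n in range(N)]
```

Since each coordinate map is the inverse of a conditional CDF, the transformation is measure preserving in the sense the paper exploits: uniform mass on the cube is carried to the target distribution, so discrepancy bounds on the cube transfer to the general state space.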