Neural Bootstrapper

Neural Information Processing Systems (NeurIPS), 2020
Abstract

Bootstrapping has been a primary tool for uncertainty quantification, and its theoretical and computational properties have been investigated in statistics and machine learning. However, due to its repetitive nature, the computational burden of bootstrapping neural networks is prohibitively heavy, which seriously hinders the practical use of these procedures for uncertainty estimation in modern deep learning. To overcome this computational bottleneck, we propose a procedure called Neural Bootstrapper (NeuBoots), which constructs a generator of bootstrapped networks. Unlike the standard bootstrap, NeuBoots can be computed with a single loss function from a single training run. It thus avoids the repetitive training inherent in the standard bootstrap, which significantly improves the efficiency of bootstrap computation. We theoretically show that NeuBoots asymptotically approximates the standard bootstrap distribution, and our empirical examples support this assertion. We then apply NeuBoots to uncertainty quantification tasks in machine learning, including prediction calibration, semantic segmentation, detection of out-of-distribution samples, and active learning. Our empirical results show that NeuBoots outperforms state-of-the-art procedures for uncertainty quantification at a much lower computational cost.
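The core idea described above — a single network conditioned on bootstrap weights that stands in for many separately trained bootstrapped networks — can be sketched on a toy regression problem. This is a minimal illustrative sketch, not the paper's implementation: the linear model, block assignment, learning rate, and sampling scheme are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative assumption, not from the paper).
n, n_blocks = 200, 10
x = rng.uniform(-1, 1, size=(n, 1))
y = 2.0 * x[:, 0] + 0.1 * rng.normal(size=n)
block = rng.integers(0, n_blocks, size=n)   # assign each sample to a block

def sample_block_weights():
    # Multinomial bootstrap weights over blocks; each entry has mean 1.
    return rng.multinomial(n_blocks, np.ones(n_blocks) / n_blocks).astype(float)

# One "generator" model, linear in the concatenation [x, alpha]
# (a hypothetical stand-in for a weight-conditioned neural network).
theta = np.zeros(1 + n_blocks)

lr = 0.02
for step in range(5000):
    alpha = sample_block_weights()           # fresh bootstrap weights each step
    feats = np.hstack([x, np.tile(alpha, (n, 1))])
    pred = feats @ theta
    w = alpha[block]                         # per-sample bootstrap weight
    grad = feats.T @ (w * (pred - y)) / n    # gradient of the weighted squared loss
    theta -= lr * grad

# A single trained model now yields a whole bootstrap distribution of
# predictions: just resample the weights at inference time.
x0 = np.array([0.5])
preds = np.array([
    np.concatenate([x0, sample_block_weights()]) @ theta
    for _ in range(500)
])
print(preds.mean())   # close to the true value 2.0 * 0.5 = 1.0
print(preds.std())    # spread of the bootstrap predictions
```

The key point the sketch mirrors is that only one model is trained on one (weighted) loss; the repetition of the standard bootstrap is moved from training time to cheap inference-time weight sampling.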
