Multi-Task Variational Information Bottleneck
Abstract
In this paper we propose a variational information bottleneck (VIB)-based framework for multi-task learning (MTL), in which a more accurate latent representation is obtained from the input data while different tasks are learned in parallel. Moreover, task-dependent uncertainties are taken into account to learn the relative weights of the task loss functions. The proposed method is evaluated on three publicly available data sets under different adversarial attacks. The overall classification performance of our model is promising: it achieves classification accuracies comparable to the benchmarked models and shows better robustness against adversarial attacks than other MTL models.
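The two ingredients mentioned above can be summarized as a loss of the form (sum over tasks of uncertainty-weighted task losses) plus a KL bottleneck term. The following is a minimal numerical sketch, assuming a diagonal Gaussian encoder and Kendall-style homoscedastic uncertainty weighting; the function names and the `beta` coefficient are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def vib_kl(mu, logvar):
    # KL(q(z|x) || N(0, I)) for a diagonal Gaussian encoder: the standard
    # VIB regularizer that upper-bounds the mutual information I(X; Z).
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

def multi_task_loss(task_losses, log_sigma2, kl, beta=1e-3):
    # Uncertainty-based weighting: each task loss L_t is scaled by
    # 1 / (2 * sigma_t^2), with a 0.5 * log(sigma_t^2) penalty so the
    # learned sigma_t cannot grow without cost. beta trades off the
    # bottleneck term against the task objectives (assumed value).
    total = 0.0
    for loss_t, ls in zip(task_losses, log_sigma2):
        total += 0.5 * np.exp(-ls) * loss_t + 0.5 * ls
    return total + beta * kl

# A standard-normal posterior (mu = 0, logvar = 0) incurs zero KL cost:
print(vib_kl(np.zeros(4), np.zeros(4)))          # -> 0.0
# With unit sigma for both tasks, the weighted loss is half the plain sum:
print(multi_task_loss([1.0, 2.0], [0.0, 0.0], kl=0.0))
```

In practice `log_sigma2` would be trainable parameters optimized jointly with the encoder, so the relative task weights adapt during training rather than being hand-tuned.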
