Neural Simpletrons - Minimalistic Directed Generative Networks for Learning with Few Labels
- BDL
Deep learning is intensively studied using supervised and unsupervised learning, and by applying probabilistic, deterministic, and bio-inspired approaches. Comparisons of different approaches, such as generative and discriminative neural networks, are made difficult, however, by differences in the semantics of their graphical descriptions, in learning methods, in benchmarking objectives, and in scalability. To allow for a direct functional comparison, we here study a generative multi-layer neural network in a form and setting as similar to standard discriminative networks as possible. Based on normalized Poisson mixtures, we derive a minimalistic deep neural network with local activation and learning rules. The network learns in a semi-supervised setting and can be scaled using standard deep learning tools for parallelized implementations. Empirical evaluations on standard benchmarks show that, for weakly labeled data, the derived minimalistic network improves on all standard deep learning approaches and is competitive with their recent variants. Compared with recent bio-inspired approaches, it suggests further improvements through top-down connections. Furthermore, we find that the studied network is the best-performing monolithic ('non-hybrid') system for few labels, and that it can be applied in the limit of very few labels, where no other system has been reported to operate so far.
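To make the "normalized Poisson mixture with local activation and learning rules" concrete, below is a minimal sketch of a single such layer in Python/NumPy. It assumes a softmax-normalized Poisson log-likelihood activation and a simple Hebbian-style weight update; the exact activation, normalization, and learning rules in the paper may differ, and all names and constants here are illustrative.

```python
import numpy as np

def activation(W, y):
    """Posterior-like activation of a normalized Poisson mixture layer.

    W : (C, D) non-negative weights, one row per hidden unit
    y : (D,)   non-negative input vector (e.g. pixel intensities)

    Assumes a softmax over Poisson log-likelihood-like inputs;
    this is a sketch, not the paper's exact rule.
    """
    W_norm = W / W.sum(axis=1, keepdims=True)   # normalize each unit's weight vector
    I = y @ np.log(W_norm + 1e-12).T            # Poisson log-likelihood up to constants
    I -= I.max()                                # numerical stability
    s = np.exp(I)
    return s / s.sum()

def local_update(W, y, s, lr=0.01):
    """Local (Hebbian-style) learning: each unit moves its weights toward
    the inputs it responds to, in proportion to its own activation."""
    return W + lr * s[:, None] * (y[None, :] - W)

# Toy usage: online learning on random non-negative data (hypothetical setup).
rng = np.random.default_rng(0)
C, D = 10, 784
W = rng.random((C, D)) + 0.1
for _ in range(100):
    y = rng.random(D)
    s = activation(W, y)
    W = local_update(W, y, s)
```

Because both the activation and the update depend only on the layer's own inputs, weights, and outputs, such rules are local in the sense stated in the abstract and can be stacked and parallelized with standard deep learning tooling.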