Auto-Encoders for Natural Exponential-Family Dictionary Learning

We consider the problem of learning recurring convolutional patterns from data that are not necessarily real-valued, such as binary or count-valued data. We cast the problem as one of learning a convolutional dictionary, subject to sparsity constraints, given observations drawn from a distribution that belongs to the natural exponential family. We propose two general auto-encoder architectures, the weights of which are in one-to-one correspondence with the parameters of the convolutional dictionary, and which differ in the manner in which the encoder enforces sparsity. Our key insight is that, unless the observations are real-valued, the input fed into the encoder ought to be modified non-linearly, and in a specific manner, using the parameters of the dictionary. We apply the architectures to the denoising of binomial images and the learning of the latent stimulus that modulates neural spiking data acquired from the barrel cortex of mice.
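The key insight above — that for non-real-valued observations the encoder input should be transformed non-linearly using the dictionary parameters — can be illustrated with a minimal sketch. The code below is a hypothetical single-image (non-convolutional) encoder that unrolls proximal gradient steps on the Poisson negative log-likelihood; the `exp` (the inverse link of the Poisson family) applied to the reconstruction is the dictionary-dependent non-linearity, and soft thresholding enforces sparsity. All names, step sizes, and the choice of Poisson likelihood are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm: shrinks toward zero, enforcing sparsity.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def encode(y, A, lam=0.1, n_iter=50, step=0.1):
    """Hypothetical sparse encoder for count-valued data y under a
    dictionary A (columns = atoms). Each iteration takes a gradient
    step on the Poisson negative log-likelihood
        sum_i exp((A z)_i) - y_i (A z)_i,
    so the data enter the encoder only after a non-linear (exp)
    transformation that depends on the dictionary parameters."""
    z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (np.exp(A @ z) - y)   # gradient of the Poisson NLL in z
        z = soft_threshold(z - step * grad, step * lam)
    return z
```

In a real-valued (Gaussian) setting the `exp` would disappear and the gradient step would reduce to the familiar linear `A.T @ (A @ z - y)` of ISTA-style encoders; the exponential-family generalization replaces that linear map with the family's inverse link.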