KernelNet: A Data-Dependent Kernel Parameterization for Deep Generative Modeling

Learning with kernels is an important concept in machine learning. Standard kernel methods typically rely on predefined kernels, which require careful hyperparameter selection. To ease this burden, we propose a framework for constructing and learning a data-dependent kernel based on random features, with an implicit spectral distribution parameterized by a deep neural network. The resulting network (called KernelNet) can be applied to deep generative modeling in various settings, including variants of the MMD-GAN and an implicit Variational Autoencoder (VAE), two popular paradigms for deep generative models. Theoretically, we show that the proposed kernel indeed exists and that the induced Maximum Mean Discrepancy (MMD) is continuous in the weak topology. Extensive experiments show that KernelNet consistently outperforms related methods.
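To make the construction concrete, below is a minimal sketch (not the authors' code) of a data-dependent kernel built from random Fourier features whose frequencies are drawn from an implicit spectral distribution: a small generator network maps Gaussian noise to frequencies, and the induced MMD is estimated in the resulting feature space. All names (SpectralNet, rff_features, mmd2) and architectural choices here are illustrative assumptions, not details from the paper.

```python
# Sketch of a learned random-Fourier-feature kernel and its MMD estimate.
# Hypothetical names and sizes; the paper's actual architecture may differ.
import torch
import torch.nn as nn

class SpectralNet(nn.Module):
    """Implicit spectral distribution: noise z ~ N(0, I) -> frequency omega."""
    def __init__(self, noise_dim, data_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, data_dim),
        )

    def forward(self, z):
        return self.net(z)

def rff_features(x, omegas):
    """Random Fourier features: phi(x) = [cos(x W^T), sin(x W^T)] / sqrt(m),
    so that phi(x) . phi(y) = (1/m) sum_i cos(omega_i^T (x - y))."""
    proj = x @ omegas.t()                      # (n, m)
    m = omegas.shape[0]
    return torch.cat([proj.cos(), proj.sin()], dim=1) / m ** 0.5

def mmd2(x, y, spectral_net, n_freqs=128, noise_dim=16):
    """Biased MMD^2 estimate under the learned kernel."""
    z = torch.randn(n_freqs, noise_dim)
    omegas = spectral_net(z)                   # learned frequencies
    fx, fy = rff_features(x, omegas), rff_features(y, omegas)
    diff = fx.mean(0) - fy.mean(0)             # difference of mean embeddings
    return diff.pow(2).sum()

# Usage: in MMD-GAN-style training, this quantity would be maximized w.r.t.
# the spectral network (the kernel) and minimized w.r.t. a data generator.
x = torch.randn(32, 2)
y = torch.randn(32, 2) + 1.0
net = SpectralNet(noise_dim=16, data_dim=2)
print(mmd2(x, y, net).item())
```

Because the spectral distribution is parameterized by a trainable network, the kernel adapts to the data rather than being fixed in advance, which is the burden the abstract aims to remove.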