Generalized Min-Max Kernel and Generalized Consistent Weighted Sampling
We propose the "generalized min-max" (GMM) kernel as a measure of data similarity, where data vectors can have both positive and negative entries. GMM is positive definite, as there is an associated hashing method named "generalized consistent weighted sampling" (GCWS) which linearizes this (nonlinear) kernel. A natural competitor of GMM is the radial basis function (RBF) kernel, whose corresponding hashing method is known as "random Fourier features" (RFF). An extensive experimental study on classification tasks over 50 publicly available datasets demonstrates that both the GMM and RBF kernels can often substantially improve over linear classifiers. Furthermore, the GCWS hashing method typically requires substantially fewer samples than RFF to achieve similar classification accuracies. To understand the properties of RFF, we derive its theoretical variance, which reveals a term that does not vanish at any similarity level. In comparison, the variance of GCWS approaches zero at certain similarities. Overall, the relative (to the expectation) variance of RFF is substantially larger than the relative variance of GCWS. This helps explain the superb empirical results of GCWS compared to RFF. We expect that GMM and GCWS will be adopted in practice for large-scale statistical machine learning applications and efficient near neighbor search (as GCWS generates discrete hash values).
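To make the comparison concrete, here is a minimal sketch of the two similarity measures the abstract contrasts. It assumes the standard construction for signed data: each vector is split into its positive and negative parts so that the classic min-max ratio applies, and the RBF kernel is approximated with random Fourier features in the usual cosine form. Function names and the `gamma` parameterization are illustrative, not taken from the paper's code.

```python
import numpy as np

def gmm_kernel(x, y):
    """Generalized min-max similarity for vectors with signed entries.

    Each vector is rewritten as a non-negative vector by splitting every
    coordinate into its positive and negative parts (dimension doubles),
    after which the usual min-max (sum-min over sum-max) ratio applies.
    """
    u = np.concatenate([np.maximum(x, 0.0), np.maximum(-x, 0.0)])
    v = np.concatenate([np.maximum(y, 0.0), np.maximum(-y, 0.0)])
    return np.minimum(u, v).sum() / np.maximum(u, v).sum()

def rff_features(X, D, gamma, rng):
    """Random Fourier features approximating exp(-gamma * ||x - y||^2).

    Rows of the returned matrix are D-dimensional feature maps whose
    inner products approximate the RBF kernel (Rahimi & Recht style).
    """
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```

For example, `gmm_kernel` of a vector with itself is exactly 1, while `rff_features` only approaches the RBF kernel value as the number of samples `D` grows, which is the sample-efficiency trade-off the experiments quantify.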