Joker: Joint Optimization Framework for Lightweight Kernel Machines

Abstract

Kernel methods are powerful tools for nonlinear learning with well-established theory. Scalability, however, has been their long-standing challenge. Despite existing successes, large-scale kernel methods suffer from two limitations: (i) the memory overhead is too high for users to afford; (ii) existing efforts focus mainly on kernel ridge regression (KRR), while other models remain understudied. In this paper, we propose Joker, a joint optimization framework for diverse kernel models, including KRR, logistic regression, and support vector machines. We design a dual block coordinate descent method with trust region (DBCD-TR) and adopt kernel approximation with randomized features, leading to low memory costs and high efficiency in large-scale learning. Experiments show that Joker saves up to 90\% memory while achieving training time and performance comparable to (or even better than) state-of-the-art methods.
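The abstract attributes Joker's low memory footprint in part to kernel approximation with randomized features. A minimal sketch of this idea, using random Fourier features for the RBF kernel (a standard instance of randomized-feature approximation; the kernel choice, dimensions, and function names here are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

def random_fourier_features(X, D=2000, gamma=0.5, seed=0):
    """Map X of shape (n, d) to Z of shape (n, D) such that
    Z @ Z.T approximates the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies sampled from the Fourier transform of the RBF kernel
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.default_rng(1).normal(size=(100, 5))
Z = random_fourier_features(X)
K_approx = Z @ Z.T  # (100, 100) approximate kernel matrix
K_exact = np.exp(-0.5 * ((X[:, None] - X[None]) ** 2).sum(-1))
err = np.abs(K_approx - K_exact).max()
```

The memory saving comes from storing the explicit feature matrix Z (n × D) instead of the dense n × n kernel matrix, which makes linear-model solvers (such as block coordinate descent in the dual) applicable at scale.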

@article{zhang2025_2505.17765,
  title={Joker: Joint Optimization Framework for Lightweight Kernel Machines},
  author={Junhong Zhang and Zhihui Lai},
  journal={arXiv preprint arXiv:2505.17765},
  year={2025}
}