
Almost Tight Approximation Algorithms for Explainable Clustering

Abstract

Recently, due to an increasing interest in transparency in artificial intelligence, several methods of explainable machine learning have been developed with the simultaneous goals of accuracy and interpretability by humans. In this paper, we study a recent framework of explainable clustering first suggested by Dasgupta et al.~\cite{dasgupta2020explainable}. Specifically, we focus on the $k$-means and $k$-medians problems and provide nearly tight upper and lower bounds. First, we provide an $O(\log k \log \log k)$-approximation algorithm for explainable $k$-medians, improving on the best known algorithm of $O(k)$~\cite{dasgupta2020explainable} and nearly matching the known $\Omega(\log k)$ lower bound~\cite{dasgupta2020explainable}. In addition, in low-dimensional spaces $d \ll \log k$, we show that our algorithm also provides an $O(d \log^2 d)$-approximate solution for explainable $k$-medians. This improves over the best known bound of $O(d \log k)$ for low dimensions~\cite{laber2021explainable}, and is a constant for constant-dimensional spaces. To complement this, we show a nearly matching $\Omega(d)$ lower bound. Next, we study the $k$-means problem in this context and provide an $O(k \log k)$-approximation algorithm for explainable $k$-means, improving over the $O(k^2)$ bound of Dasgupta et al. and the $O(d k \log k)$ bound of \cite{laber2021explainable}. To complement this, we provide an almost tight $\Omega(k)$ lower bound, improving over the $\Omega(\log k)$ lower bound of Dasgupta et al. Given an approximate solution to the classic $k$-means and $k$-medians problems, our algorithm for $k$-medians runs in time $O(kd \log^2 k)$ and our algorithm for $k$-means runs in time $O(k^2 d)$.
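For context, in the explainable-clustering framework of Dasgupta et al. a clustering is represented by a threshold tree: each internal node tests one coordinate against a threshold, and each of the $k$ leaves is one cluster. The minimal sketch below illustrates only this model; the cut rule used here (midpoint of the widest coordinate) is an arbitrary placeholder for illustration, not the approximation algorithm of this paper.

```python
# A minimal sketch of the threshold-tree model of explainable clustering.
# Each internal node is an axis-aligned test x[j] <= theta; each leaf is
# a single cluster. The greedy cut rule below is a hypothetical choice
# for illustration only, NOT the paper's O(log k log log k) algorithm.

def build_tree(centers, ids):
    """Recursively separate the given center ids with axis-aligned cuts."""
    if len(ids) == 1:
        return ("leaf", ids[0])
    d = len(centers[0])
    # Arbitrary illustrative rule: cut the coordinate with the widest
    # spread among the remaining centers, at the midpoint of its range.
    j = max(range(d),
            key=lambda c: max(centers[i][c] for i in ids)
                        - min(centers[i][c] for i in ids))
    vals = [centers[i][j] for i in ids]
    theta = (min(vals) + max(vals)) / 2
    left = [i for i in ids if centers[i][j] <= theta]
    right = [i for i in ids if centers[i][j] > theta]
    return ("node", j, theta, build_tree(centers, left), build_tree(centers, right))

def assign(tree, x):
    """Follow threshold tests from the root to a leaf; return the cluster id."""
    while tree[0] == "node":
        _, j, theta, left, right = tree
        tree = left if x[j] <= theta else right
    return tree[1]

# Usage: centers could come from any (approximate) k-means/k-medians solver.
centers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
tree = build_tree(centers, list(range(len(centers))))
print(assign(tree, (9.0, 1.0)))  # point near center 1
```

The explainability comes from the assignment rule: every point's cluster is determined by at most tree-depth many single-coordinate comparisons, at the cost of the approximation factors discussed above.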
