Shallow decision trees for explainable k-means clustering

29 December 2021
E. Laber
Lucas Murtinho
F. Oliveira
Abstract

A number of recent works have employed decision trees for the construction of explainable partitions that aim to minimize the k-means cost function. These works, however, largely ignore metrics related to the depths of the leaves in the resulting tree, which is perhaps surprising considering how the explainability of a decision tree depends on these depths. To fill this gap in the literature, we propose an efficient algorithm that takes these metrics into account. In experiments on 16 datasets, our algorithm yields better results than decision-tree clustering algorithms such as the ones presented in \cite{dasgupta2020explainable}, \cite{frost2020exkmc}, \cite{laber2021price} and \cite{DBLP:conf/icml/MakarychevS21}, typically achieving lower or equivalent costs with considerably shallower trees. We also show, through a simple adaptation of existing techniques, that the problem of building explainable partitions induced by binary trees for the k-means cost function does not admit a (1+ε)-approximation in polynomial time unless P = NP, which justifies the quest for approximation algorithms and/or heuristics.
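To make the setting concrete, here is a minimal, illustrative sketch of the general technique the abstract refers to — turning a k-means clustering into an explainable partition via an axis-aligned threshold tree, in the spirit of \cite{dasgupta2020explainable}. This is not the paper's algorithm: the cut selection (minimizing "mistakes" over midpoints between center coordinates) and all function names are simplifying assumptions for illustration.

```python
def sqdist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def nearest(p, centers, active):
    # index of the closest still-active center to point p
    return min(active, key=lambda c: sqdist(p, centers[c]))

def build_tree(points, centers, active, depth=0):
    # Leaf: only one candidate center remains (or nothing left to split).
    if len(active) == 1 or not points:
        return {"leaf": active[0], "depth": depth}
    labels = [nearest(p, centers, active) for p in points]
    best = None
    for d in range(len(centers[active[0]])):
        # candidate axis-aligned cuts: midpoints between adjacent
        # center coordinates along dimension d
        coords = sorted({centers[c][d] for c in active})
        for a, b in zip(coords, coords[1:]):
            t = (a + b) / 2
            left = [c for c in active if centers[c][d] <= t]
            right = [c for c in active if centers[c][d] > t]
            # a "mistake" is a point that the cut sends to the opposite
            # side from its own nearest center
            mistakes = sum(1 for p, l in zip(points, labels)
                           if (p[d] <= t) != (centers[l][d] <= t))
            if best is None or mistakes < best[0]:
                best = (mistakes, d, t, left, right)
    _, d, t, lc, rc = best
    lp = [p for p in points if p[d] <= t]
    rp = [p for p in points if p[d] > t]
    return {"dim": d, "thr": t,
            "left": build_tree(lp, centers, lc, depth + 1),
            "right": build_tree(rp, centers, rc, depth + 1)}

def classify(tree, p):
    # follow threshold cuts from the root to a leaf
    while "leaf" not in tree:
        tree = tree["left"] if p[tree["dim"]] <= tree["thr"] else tree["right"]
    return tree["leaf"]

# toy example: three well-separated reference centers
centers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
points = [(0.5, 0.2), (-0.2, 0.1), (9.5, 0.1),
          (10.2, -0.3), (0.3, 9.8), (0.1, 10.4)]
tree = build_tree(points, centers, active=[0, 1, 2])
```

Each internal node tests a single feature against a threshold, so a cluster assignment is explained by the short list of comparisons on the root-to-leaf path — which is why leaf depth, the paper's focus, directly governs explainability.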