In monotone classification, the input is a multi-set $P$ of points in $\mathbb{R}^d$, each associated with a hidden label from $\{-1, 1\}$. The goal is to identify a monotone function $h$, which acts as a classifier, mapping from $\mathbb{R}^d$ to $\{-1, 1\}$ with a small {\em error}, measured as the number of points $p \in P$ whose labels differ from the function values $h(p)$. The cost of an algorithm is defined as the number of points having their labels revealed. This article presents the first study on the lowest cost required to find a monotone classifier whose error is at most $(1+\epsilon) \cdot k^*$, where $\epsilon \ge 0$ and $k^*$ is the minimum error achieved by an optimal monotone classifier -- in other words, the error is allowed to exceed the optimal by at most a relative factor. Nearly matching upper and lower bounds are presented for the full range of $\epsilon$. All previous work on the problem can only achieve an error higher than the optimal by an absolute factor.
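For concreteness, the quantities above can be written out as follows (a minimal sketch; the symbols $P$, $h$, $\mathrm{label}(\cdot)$, $\preceq$, and $k^*$ are notation assumed here for illustration):
\[
  \mathrm{err}_P(h) \;=\; \bigl|\{\, p \in P : h(p) \ne \mathrm{label}(p) \,\}\bigr|,
  \qquad
  k^* \;=\; \min_{\text{monotone } h} \mathrm{err}_P(h),
\]
where $h:\mathbb{R}^d \to \{-1,1\}$ is monotone if $p \preceq q$ (coordinate-wise dominance) implies $h(p) \le h(q)$; the target is a monotone $h$ with $\mathrm{err}_P(h) \le (1+\epsilon)\cdot k^*$ while revealing as few labels as possible.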