arXiv: 2004.06298
Budget Learning via Bracketing

14 April 2020
Aditya Gangrade
D. A. E. Acar
Venkatesh Saligrama
Abstract

Conventional machine learning applications in the mobile/IoT setting transmit data to a cloud server for predictions. Due to cost considerations (power, latency, monetary), it is desirable to minimise device-to-server transmissions. The budget learning (BL) problem poses the learner's goal as minimising use of the cloud while suffering no discernible loss in accuracy, under the constraint that the methods employed be edge-implementable. We propose a new formulation for the BL problem via the concept of bracketings. Concretely, we propose to sandwich the cloud's prediction, $g$, via functions $h^-, h^+$ from a 'simple' class so that $h^- \le g \le h^+$ nearly always. On an instance $x$, if $h^+(x) = h^-(x)$, we leverage local processing and bypass the cloud. We explore theoretical aspects of this formulation, providing PAC-style learnability definitions; associating the notion of budget learnability to approximability via brackets; and giving VC-theoretic analyses of their properties. We empirically validate our theory on real-world datasets, demonstrating improved performance over prior gating-based methods.
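The gating rule described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it assumes a toy setting where the cloud model $g$ thresholds a 1-D score at zero, and the brackets $h^-, h^+$ are simple threshold rules chosen so that $h^- \le g \le h^+$; all helper names are hypothetical.

```python
# Hypothetical sketch of bracketing-based budget learning.
# Setting (an assumption for illustration): g thresholds a 1-D score at 0,
# and the brackets are conservative thresholds around it.

def make_brackets(threshold_lo, threshold_hi):
    # h^- fires only when the score is clearly positive;
    # h^+ fires unless the score is clearly negative.
    # With threshold_lo <= 0 <= threshold_hi, h^-(x) <= g(x) <= h^+(x).
    h_minus = lambda x: 1 if x >= threshold_hi else 0
    h_plus = lambda x: 1 if x >= threshold_lo else 0
    return h_minus, h_plus

def predict(x, g_cloud, h_minus, h_plus, stats):
    lo, hi = h_minus(x), h_plus(x)
    if lo == hi:                 # brackets agree: decide on-device
        stats["local"] += 1
        return lo
    stats["cloud"] += 1          # ambiguous region: pay to query the cloud
    return g_cloud(x)
```

Only inputs falling between the two thresholds, where the brackets disagree, trigger a cloud query; everything else is resolved locally at zero transmission cost.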
