Kernel-Level Energy-Efficient Neural Architecture Search for Tabular Dataset

11 April 2025
Hoang-Loc La
Phuong Hoai Ha
Abstract

Many studies estimate energy consumption using proxy metrics like memory usage, FLOPs, and inference latency, with the assumption that reducing these metrics will also lower energy consumption in neural networks. This paper, however, takes a different approach by introducing an energy-efficient Neural Architecture Search (NAS) method that directly focuses on identifying architectures that minimize energy consumption while maintaining acceptable accuracy. Unlike previous methods that primarily target vision and language tasks, the approach proposed here specifically addresses tabular datasets. Remarkably, the optimal architecture suggested by this method can reduce energy consumption by up to 92% compared to architectures recommended by conventional NAS.
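The abstract describes a search that directly minimizes energy consumption subject to an accuracy floor, rather than optimizing proxies such as FLOPs or latency. The core selection rule can be sketched as a toy constrained search; this is an illustrative assumption, not the paper's kernel-level method, and the candidate encoding and the accuracy/energy stand-in functions below are invented for demonstration only.

```python
# Toy sketch of accuracy-constrained, energy-minimizing architecture
# selection (illustrative only; not the paper's algorithm).

def search(candidates, measure_accuracy, measure_energy, acc_floor):
    """Among candidates meeting the accuracy floor, return the one
    with the lowest measured energy, or None if none qualify."""
    feasible = [(measure_energy(c), c) for c in candidates
                if measure_accuracy(c) >= acc_floor]
    if not feasible:
        return None
    return min(feasible)[1]

# Hypothetical candidate space: (depth, width) pairs for a small MLP.
candidates = [(d, w) for d in (1, 2, 3) for w in (32, 64, 128)]

# Stand-ins for real measurements: deeper/wider nets are assumed more
# accurate but more energy-hungry.
acc = lambda c: 0.80 + 0.02 * c[0] + 0.0002 * c[1]
energy = lambda c: c[0] * c[1]

best = search(candidates, acc, energy, acc_floor=0.85)
print(best)  # the cheapest architecture that clears the floor
```

In practice the accuracy and energy functions would be replaced by actual training/evaluation runs and hardware energy measurements; the point of the sketch is only that energy is the objective and accuracy a constraint, inverting the usual accuracy-first formulation.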

View on arXiv
@article{la2025_2504.08359,
  title={Kernel-Level Energy-Efficient Neural Architecture Search for Tabular Dataset},
  author={Hoang-Loc La and Phuong Hoai Ha},
  journal={arXiv preprint arXiv:2504.08359},
  year={2025}
}