While large language models (LLMs) are increasingly used to generate parallel scientific code, most current efforts emphasize functional correctness and often overlook performance and energy considerations. In this work, we propose LASSI-EE, an automated LLM-based refactoring framework that takes parallel code as input and generates an energy-efficient version for a target parallel system. Through a multi-stage, iterative pipeline, LASSI-EE achieved an average energy reduction of 47% across 85% of the 20 HeCBench benchmarks tested on NVIDIA A100 GPUs. Our findings demonstrate the broader potential of LLMs, not only for generating correct code but also for enabling energy-aware programming. We also discuss key insights and limitations of the framework, offering guidance for future improvements.
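The abstract describes a multi-stage, iterative refactoring pipeline that preserves correctness while reducing energy use. As a rough illustration only, the general shape of such a loop might look like the sketch below; the function names and the toy energy/correctness stubs are assumptions for illustration, not the paper's actual implementation (which profiles real GPU runs).

```python
# Hypothetical sketch of an iterative, energy-aware refactoring loop.
# All names (measure_energy, llm_refactor, is_correct) are illustrative
# stand-ins, not APIs from the LASSI-EE framework.

def measure_energy(code: str) -> float:
    # Stand-in for profiling an actual run (e.g., GPU power sampling).
    return float(len(code))  # toy proxy for demonstration only

def llm_refactor(code: str) -> str:
    # Stand-in for prompting an LLM to produce an energy-aware variant.
    return code.replace("  ", " ")  # toy transformation

def is_correct(code: str) -> bool:
    # Stand-in for validating output against a reference implementation.
    return bool(code)

def refactor_loop(code: str, max_iters: int = 5) -> tuple[str, float]:
    """Iteratively refactor, keeping only correct variants that lower energy."""
    best_code, best_energy = code, measure_energy(code)
    for _ in range(max_iters):
        candidate = llm_refactor(best_code)
        if not is_correct(candidate):
            continue  # reject functionally incorrect refactorings
        energy = measure_energy(candidate)
        if energy < best_energy:
            best_code, best_energy = candidate, energy
    return best_code, best_energy
```

The key invariant is that a candidate is only adopted when it both passes the correctness check and strictly reduces measured energy, so the loop can never return a worse version than its input.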
@article{dearing2025_2505.02184,
  title={Leveraging LLMs to Automate Energy-Aware Refactoring of Parallel Scientific Codes},
  author={Matthew T. Dearing and Yiheng Tao and Xingfu Wu and Zhiling Lan and Valerie Taylor},
  journal={arXiv preprint arXiv:2505.02184},
  year={2025}
}