Software optimization refines programs for resource efficiency while preserving functionality. Traditionally, it is performed by developers and compilers. This paper introduces a third option: automated optimization at the source-code level. We present Supersonic, a neural approach targeting minor source code modifications for optimization. Supersonic uses a seq2seq model trained on pairs of C/C++ programs in which the second program is an optimized version of the first, and outputs the optimization as a diff. Supersonic's performance is benchmarked against OpenAI's GPT-3.5-Turbo and GPT-4 on competitive programming tasks. The experiments show that Supersonic not only outperforms both models on the code optimization task but also minimizes the extent of the change, with a model more than 600x smaller than GPT-3.5-Turbo and 3700x smaller than GPT-4.
@article{chen2025_2309.14846,
  title   = {Supersonic: Learning to Generate Source Code Optimizations in C/C++},
  author  = {Zimin Chen and Sen Fang and Martin Monperrus},
  journal = {arXiv preprint arXiv:2309.14846},
  year    = {2025}
}