ResearchTrend.AI


Technical Challenges in Maintaining Tax Prep Software with Large Language Models

25 April 2025
Sina Gogani-Khiabani
Varsha Dewangan
Nina Olson
Ashutosh Trivedi
Saeid Tizpaz-Niari
Abstract

As US tax law evolves to adapt to ever-changing politico-economic realities, tax preparation software plays a significant role in helping taxpayers navigate these complexities. The dynamic nature of tax regulations poses a significant challenge to the accurate and timely maintenance of tax software artifacts. The state of the art in maintaining tax prep software is time-consuming and error-prone, as it involves manual code analysis combined with expert interpretation of tax law amendments. We posit that the rigor and formality of tax amendment language, as expressed in IRS publications, makes it amenable to automatic translation into executable specifications (code). Our research efforts focus on identifying, understanding, and tackling technical challenges in leveraging Large Language Models (LLMs), such as ChatGPT and Llama, to faithfully extract code differentials from IRS publications and automatically integrate them with the prior version of the code to automate tax prep software maintenance.
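The abstract sketches a two-step pipeline: use an LLM to extract a code differential from IRS amendment text, then integrate that differential with the prior version of the code. A minimal sketch of that shape is below; the `call_llm` stub, the prompt wording, and the `maintain` helper are hypothetical illustrations of the workflow, not the authors' actual system or prompts.

```python
# Hypothetical sketch of an LLM-driven tax-software maintenance loop.
# `call_llm` is a stand-in for querying a model such as ChatGPT or Llama.

def call_llm(prompt: str) -> str:
    """Placeholder: a real pipeline would send `prompt` to an LLM API
    or a locally hosted model and return its text response."""
    raise NotImplementedError("plug in an actual model backend")

def build_diff_prompt(amendment_text: str, prior_code: str) -> str:
    """Assemble a prompt asking the model for a unified diff that
    updates the prior implementation to match the IRS amendment."""
    return (
        "You are updating tax preparation software.\n\n"
        "IRS amendment:\n" + amendment_text + "\n\n"
        "Current implementation:\n" + prior_code + "\n\n"
        "Output a unified diff that makes the code conform to the amendment."
    )

def maintain(amendment_text: str, prior_code: str, llm=call_llm) -> str:
    """One maintenance step: the returned diff would then be
    reviewed by a human and applied to the codebase."""
    prompt = build_diff_prompt(amendment_text, prior_code)
    return llm(prompt)
```

The point of isolating `build_diff_prompt` is that the faithfulness challenges the paper targets (does the extracted differential really reflect the amendment?) live in this prompt/response boundary, so it is the natural unit to test and iterate on.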

@article{gogani-khiabani2025_2504.18693,
  title={Technical Challenges in Maintaining Tax Prep Software with Large Language Models},
  author={Sina Gogani-Khiabani and Varsha Dewangan and Nina Olson and Ashutosh Trivedi and Saeid Tizpaz-Niari},
  journal={arXiv preprint arXiv:2504.18693},
  year={2025}
}