ResearchTrend.AI
Efficient Transformer-based Decoder for Varshamov-Tenengolts Codes

28 February 2025
Yali Wei
Alan J.X. Guo
Zihui Yan
Yufan Dai
Abstract

In recent years, the rise of DNA data storage technology has brought significant attention to the challenge of correcting insertion, deletion, and substitution (IDS) errors. Among the various coding methods for IDS correction, Varshamov-Tenengolts (VT) codes, primarily designed for single-error correction, have emerged as a central research focus. While existing decoding methods achieve high accuracy when correcting a single error, they often fail to correct multiple IDS errors. In this work, we observe that VT codes retain some capability for correcting multiple errors, and we exploit it by introducing a transformer-based VT decoder (TVTD) along with symbol- and statistic-based codeword embeddings. Experimental results demonstrate that the proposed TVTD achieves perfect correction of a single error. Furthermore, when decoding multiple errors across various codeword lengths, it significantly improves the bit error rate and frame error rate compared to existing hard-decision and soft-in soft-out algorithms. Additionally, through model architecture optimization, the proposed method reduces time consumption by an order of magnitude compared to other soft decoders.
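For background (not from the paper itself): the binary VT code VT_a(n) consists of all length-n binary words whose weighted checksum, the sum of i·x_i over positions i = 1..n, is congruent to a modulo n+1, and it corrects any single deletion or insertion. A minimal sketch of this classical single-deletion correction, using brute-force reinsertion rather than the linear-time Levenshtein procedure, might look like:

```python
def vt_syndrome(x):
    """VT checksum of binary tuple x: sum of i * x_i (1-indexed), mod len(x)+1."""
    return sum((i + 1) * b for i, b in enumerate(x)) % (len(x) + 1)

def correct_single_deletion(y, n, a=0):
    """Recover the codeword of VT_a(n) from y, a length n-1 word obtained
    by a single deletion, by trying every reinsertion and keeping those
    whose checksum matches a. For a valid VT code, the surviving set
    contains exactly one codeword."""
    candidates = set()
    for pos in range(n):           # insertion positions 0..n-1 cover the whole word
        for bit in (0, 1):
            cand = y[:pos] + (bit,) + y[pos:]
            if vt_syndrome(cand) == a:
                candidates.add(cand)
    return candidates

# Example: (1,1,0,1,0,0) has checksum 1+2+4 = 7 ≡ 0 (mod 7), so it lies in VT_0(6).
# Deleting the second bit gives (1,0,1,0,0); decoding recovers the codeword uniquely.
recovered = correct_single_deletion((1, 0, 1, 0, 0), n=6, a=0)
```

This brute-force membership test only illustrates why a single deletion is always recoverable; the paper's contribution is a learned transformer decoder that extends decoding beyond this single-error guarantee.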

@article{wei2025_2502.21060,
  title={Efficient Transformer-based Decoder for Varshamov-Tenengolts Codes},
  author={Yali Wei and Alan J.X. Guo and Zihui Yan and Yufan Dai},
  journal={arXiv preprint arXiv:2502.21060},
  year={2025}
}