Transformer-Based Decoding in Concatenated Coding Schemes Under Synchronization Errors

International Symposium on Information Theory (ISIT), 2025
2 November 2025
Julian Streit
Franziska Weindel
Reinhard Heckel
arXiv:2511.00999 (abs) · PDF · HTML · GitHub (2★)
Main: 10 pages · 21 figures · Bibliography: 1 page · Appendix: 5 pages
Abstract

We consider the reconstruction of a codeword from multiple noisy copies that are independently corrupted by insertions, deletions, and substitutions. This problem arises, for example, in DNA data storage. A common code construction uses a concatenated coding scheme that combines an outer linear block code with an inner code, which can be either a nonlinear marker code or a convolutional code. Outer decoding is done with belief propagation, and inner decoding is done with the Bahl-Cocke-Jelinek-Raviv (BCJR) algorithm. However, the BCJR algorithm scales exponentially with the number of noisy copies, which makes it infeasible to reconstruct a codeword from more than about four copies. In this work, we introduce BCJRFormer, a transformer-based neural inner decoder. BCJRFormer achieves error rates comparable to those of the BCJR algorithm for binary and quaternary single-message transmissions of marker codes. Importantly, BCJRFormer scales quadratically with the number of noisy copies. This property makes BCJRFormer well-suited for DNA data storage, where multiple reads of the same DNA strand occur. To lower error rates, we replace the belief propagation outer decoder with a transformer-based decoder. Together, these modifications yield an efficient and performant end-to-end transformer-based pipeline for decoding multiple noisy copies affected by insertion, deletion, and substitution errors. Additionally, we propose a novel cross-attending transformer architecture called ConvBCJRFormer. This architecture extends BCJRFormer to decode transmissions of convolutional codewords, serving as an initial step toward joint inner and outer decoding for more general linear code classes.
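
The setting described above, a codeword observed through several reads that are independently corrupted by insertions, deletions, and substitutions, can be made concrete with a minimal sketch. The per-symbol error probabilities, the ids_channel helper, and the toy binary codeword below are illustrative assumptions, not taken from the paper or its code.

    import random

    def ids_channel(codeword, p_ins=0.02, p_del=0.02, p_sub=0.02, alphabet=(0, 1)):
        """Toy IDS channel: corrupt a codeword with insertions, deletions, and
        substitutions, each occurring independently per symbol with the given
        probabilities (hypothetical values, not from the paper)."""
        out = []
        for sym in codeword:
            r = random.random()
            if r < p_ins:                    # insert a random symbol, then keep the original
                out.append(random.choice(alphabet))
                out.append(sym)
            elif r < p_ins + p_del:          # delete the symbol
                continue
            elif r < p_ins + p_del + p_sub:  # substitute with a different symbol
                out.append(random.choice([a for a in alphabet if a != sym]))
            else:                            # transmit the symbol unchanged
                out.append(sym)
        return out

    # Multiple independently corrupted reads of the same codeword, as in DNA data
    # storage where the same strand is sequenced several times.
    codeword = [random.choice((0, 1)) for _ in range(20)]
    reads = [ids_channel(codeword) for _ in range(4)]
    for r in reads:
        print(len(r), r)

Because insertions and deletions change the read length, the reads are no longer aligned to the codeword; an inner decoder (BCJR in the classical scheme, BCJRFormer here) must resolve this synchronization ambiguity across all reads jointly.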
