ResearchTrend.AI
Robust Transmission of Punctured Text with Large Language Model-based Recovery

19 March 2025
Sojeong Park
Hyeonho Noh
Hyun Jong Yang
Abstract

With the recent advancements in deep learning, semantic communication, which transmits only task-oriented features, has rapidly emerged. However, since feature extraction relies on learning-based models, its performance fundamentally depends on the training dataset or task. For practical scenarios, it is essential to design a model that performs robustly regardless of dataset or task. In this correspondence, we propose a novel text transmission model that selects and transmits only a few characters and recovers the missing characters at the receiver using a large language model (LLM). Additionally, we propose a novel importance character extractor (ICE), which selects the characters to be transmitted so as to enhance LLM recovery performance. Simulations demonstrate that the proposed filter selection by ICE outperforms random filter selection, which selects the transmitted characters at random. Moreover, the proposed model exhibits robust performance across different datasets and tasks and outperforms traditional bit-based communication in low signal-to-noise-ratio conditions.
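The core idea above can be illustrated with a minimal sketch. The function names, the `_` placeholder convention, and the prompt wording are assumptions for illustration; the sketch implements only the random-selection baseline mentioned in the abstract, since ICE itself is a learned model whose details live in the paper.

```python
import random


def puncture(text: str, keep_ratio: float = 0.5, seed: int = 0) -> str:
    """Keep a random fraction of characters and mask the rest with '_'.

    This mirrors the random filter selection baseline from the abstract;
    the proposed ICE would instead rank characters by how much they help
    the LLM reconstruct the original text.
    """
    rng = random.Random(seed)
    n_keep = max(1, int(len(text) * keep_ratio))
    kept = set(rng.sample(range(len(text)), n_keep))
    return "".join(c if i in kept else "_" for i, c in enumerate(text))


def recovery_prompt(punctured: str) -> str:
    """Build a hypothetical prompt asking an LLM to restore masked characters."""
    return (
        "Reconstruct the original sentence; each '_' marks one removed "
        f"character:\n{punctured}"
    )


punctured = puncture("semantic communication", keep_ratio=0.5)
# The punctured string has the same length as the input, so character
# positions are preserved for the receiver-side LLM to fill in.
print(recovery_prompt(punctured))
```

At the receiver, the prompt would be sent to an LLM whose completion replaces the masked positions; the paper's contribution is choosing *which* characters to keep so that this recovery succeeds even across unseen datasets and tasks.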

@article{park2025_2503.14831,
  title={Robust Transmission of Punctured Text with Large Language Model-based Recovery},
  author={Sojeong Park and Hyeonho Noh and Hyun Jong Yang},
  journal={arXiv preprint arXiv:2503.14831},
  year={2025}
}