
Learning Robust Representations for Communications over Noisy Channels

Main: 5 pages
11 figures
Bibliography: 1 page
1 table
Abstract

We explore the use of fully connected neural networks (FCNNs) for designing end-to-end communication systems without drawing on existing classical communication models or error control coding. This work relies solely on the tools of information theory and machine learning. We investigate the impact of various cost functions based on mutual information and pairwise distances between codewords on generating robust representations for transmission under strict power constraints. Additionally, we introduce a novel encoder structure inspired by the Barlow Twins framework. Our results show that iterative training with randomly chosen noise power levels, while minimizing block error rate, provides the best error performance.
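The training regime the abstract describes can be illustrated with a minimal NumPy sketch. This is not the authors' FCNN implementation: a fixed random codebook stands in for the trained encoder, the power constraint is enforced by normalizing each codeword's energy, and block error rate is estimated over an AWGN channel whose noise power is drawn at random per evaluation, mirroring the paper's iterative training with random noise levels. All parameter values (`M`, `n`, `P`, the sigma range) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M, n = 16, 7   # M candidate messages, n channel uses per block (assumed values)
P = 1.0        # average power constraint per channel use (assumed)

# Stand-in "encoder": a random codebook in place of the paper's trained FCNN.
codebook = rng.standard_normal((M, n))
# Enforce the power constraint: normalize each codeword to energy n * P.
codebook *= np.sqrt(n * P) / np.linalg.norm(codebook, axis=1, keepdims=True)

def channel(x, sigma):
    """AWGN channel with noise standard deviation sigma."""
    return x + sigma * rng.standard_normal(x.shape)

def decode(y):
    """Nearest-codeword (minimum-distance) decoding."""
    d = np.linalg.norm(y[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1)

def block_error_rate(sigma, trials=2000):
    """Monte Carlo estimate of BLER at a given noise level."""
    msgs = rng.integers(0, M, size=trials)
    y = channel(codebook[msgs], sigma)
    return np.mean(decode(y) != msgs)

# Randomly chosen noise power per evaluation, echoing the training regime.
for sigma in rng.uniform(0.1, 1.0, size=3):
    print(f"sigma={sigma:.2f}  BLER={block_error_rate(sigma):.3f}")
```

In the paper's setting, the random codebook would be replaced by an FCNN encoder-decoder pair trained end to end against cost functions such as mutual information estimates or pairwise codeword distances; the sketch only shows the channel simulation and BLER evaluation loop around it.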
