
DRF codes: Deep SNR-robust feedback codes

Authors: Mahdi Boloursaz Mashhadi, Deniz Gunduz, Alberto Perotti, Branislav M. Popovic
Status: Final
Date of publication: 5 September 2023
Published in: ITU Journal on Future and Evolving Technologies, Volume 4 (2023), Issue 3, Pages 447-460
Article DOI: https://doi.org/10.52953/DAPE6014
Abstract:
We present a new Deep Neural Network (DNN)-based error correction code for fading channels with output feedback, called the Deep SNR-Robust Feedback (DRF) code. At the encoder, parity symbols are generated by a Long Short-Term Memory (LSTM) network based on the message, as well as the past forward channel outputs observed by the transmitter in a noisy fashion. The decoder uses a bidirectional LSTM architecture along with a Signal-to-Noise Ratio (SNR)-aware attention NN to decode the message. The proposed code overcomes two major shortcomings of DNN-based codes over channels with passive output feedback: (i) the SNR-aware attention mechanism at the decoder enables reliable application of the same trained NN over a wide range of SNR values; (ii) curriculum training with batch size scheduling is used to speed up and stabilize training while improving the SNR-robustness of the resulting code. We show that the DRF codes outperform the existing DNN-based codes in terms of both the SNR-robustness and the error rate in an Additive White Gaussian Noise (AWGN) channel with noisy output feedback. In fading channels with perfect phase compensation at the receiver, DRF codes learn to efficiently exploit knowledge of the instantaneous fading amplitude (which is available to the encoder through feedback) to reduce the overhead and complexity associated with channel estimation at the decoder. Finally, we show the effectiveness of DRF codes in multicast channels with feedback, where linear feedback codes are known to be strictly suboptimal. These results show the feasibility of automatic design of new channel codes using DNN-based language models.
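The transmission setup the abstract describes — an encoder that emits each parity symbol as a function of the message and the noisy feedback of past forward channel outputs, over an AWGN channel with passive output feedback — can be sketched as follows. This is a minimal illustrative simulation, not the authors' code: a fixed random linear map stands in for the trained LSTM encoder, and the function and parameter names (`transmit_with_feedback`, `n_rounds`, `fwd_snr_db`, `fb_snr_db`) are assumptions for illustration.

```python
import numpy as np

def snr_to_sigma(snr_db):
    """Noise standard deviation for unit-power signals at the given SNR (dB)."""
    return 10 ** (-snr_db / 20)

def transmit_with_feedback(message_bits, n_rounds=6, fwd_snr_db=0.0,
                           fb_snr_db=10.0, seed=0):
    """Simulate n_rounds of coded transmission over AWGN with noisy
    passive output feedback. A random linear map plus tanh is a stand-in
    for the LSTM encoder of the paper (illustrative only)."""
    rng = np.random.default_rng(seed)
    m = 2.0 * np.asarray(message_bits, dtype=float) - 1.0  # BPSK-map the bits
    # Placeholder "encoder weights": maps the message to a scalar per round.
    w_m = rng.standard_normal(len(m)) / np.sqrt(len(m))
    received, feedback = [], []
    for t in range(n_rounds):
        # Encoder sees the message and the most recent noisy feedback symbol.
        fb_prev = feedback[-1] if feedback else 0.0
        x = np.tanh(w_m @ m + fb_prev)                 # bounded-power parity symbol
        y = x + snr_to_sigma(fwd_snr_db) * rng.standard_normal()  # forward AWGN
        z = y + snr_to_sigma(fb_snr_db) * rng.standard_normal()   # noisy feedback link
        received.append(y)
        feedback.append(z)
    return np.array(received), np.array(feedback)
```

The decoder (not shown) would recover the message from the full received sequence; in the paper this is a bidirectional LSTM with SNR-aware attention rather than a classical detector.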

Keywords: Attention neural networks, channel coding, communication with feedback, curriculum training, LSTM language models
Rights: © International Telecommunication Union, available under the CC BY-NC-ND 3.0 IGO license.
Full article (PDF, English) available free of charge.