Neural network compression with feedback magnitude pruning for automatic modulation classification
Authors: Jakob Krzyston, Rajib Bhattacharjea, Andrew Stark
Status: Final
Date of publication: 13 July 2022
Published in: ITU Journal on Future and Evolving Technologies, Volume 3 (2022), Issue 2, Pages 157-164
Article DOI: https://doi.org/10.52953/EUJF4214
Abstract: In the past few years, there have been numerous demonstrations of neural networks outperforming traditional signal processing methods in communications, notably for Automatic Modulation Classification (AMC). Despite the increase in accuracy, these algorithms are notoriously infeasible to integrate into edge computing applications. In this work, we propose an enhanced version of a simple neural network pruning technique, Iterative Magnitude Pruning (IMP), called Feedback Magnitude Pruning (FMP), and demonstrate its effectiveness for the "Lightning-Fast Modulation Classification with Hardware-Efficient Neural Network" 2021 AI for Good: Machine Learning in 5G Challenge hosted by the International Telecommunication Union (ITU) and Xilinx. IMP achieved a compression ratio of 9.313, while our proposed FMP achieved a compression ratio of 831 and a normalized cost of 0.0419. Our FMP result was awarded second place, demonstrating the compression and classification accuracy benefits of pruning with feedback.
Keywords: Artificial intelligence, automatic modulation classification, feedback magnitude pruning, neural network compression
Rights: © International Telecommunication Union, available under the CC BY-NC-ND 3.0 IGO license.
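The abstract's core idea, magnitude pruning driven by an accuracy-feedback loop, can be sketched as follows. This is a minimal illustration assuming a NumPy weight array and a user-supplied `evaluate` accuracy function; the helper names, the pruning schedule, and the tolerance-based stopping rule are illustrative assumptions, not the authors' actual FMP implementation.

```python
import numpy as np

def magnitude_prune(weights, fraction):
    """Zero out the smallest-magnitude `fraction` of weights (hypothetical helper)."""
    flat = np.abs(weights).ravel()
    k = int(flat.size * fraction)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def prune_with_feedback(weights, evaluate, step=0.2, tolerance=0.01, max_rounds=20):
    """Sketch of pruning with feedback: increase the pruned fraction each round,
    but only accept a round if accuracy stays within `tolerance` of the
    unpruned baseline; otherwise stop and keep the last accepted weights."""
    baseline = evaluate(weights)
    current = weights.copy()
    fraction = step
    for _ in range(max_rounds):
        candidate = magnitude_prune(weights, fraction)
        if baseline - evaluate(candidate) <= tolerance:
            current = candidate                    # feedback says the drop is acceptable
            fraction = min(1.0, fraction + step)   # prune more aggressively next round
        else:
            break                                  # accuracy fell too far: revert and stop
    return current
```

In practice `evaluate` would run validation inference on the pruned network; here any callable returning a scalar accuracy works, which is what makes the feedback loop generic.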