Abstract
The implementation of deep learning-based automatic modulation recognition (AMR) on resource-constrained edge devices calls for efficient networks. Unfortunately, existing AMR networks fail to balance parameter scale, inference speed, and recognition accuracy, which hinders their edge deployment. In view of this, we propose an expert-assistant network for efficient AMR with a small parameter scale and short inference time. To exploit the advantages of both convolutional neural networks (CNNs) and recurrent neural networks (RNNs) for parameter-scale reduction, we build a lightweight AMR network with a CNN-RNN hybrid architecture. Given the speed-and-accuracy dilemma faced by existing CNN-RNN hybrid networks, we introduce a small-scale plug-in, called the assistant, as well as a temporal shuffling scheme to enable fast and accurate AMR at small parameter scales. Besides, techniques such as non-recurrent dropout for the gated recurrent unit (GRU) layer, the parameter estimator and transformer (PET), and model pruning are applied to further enhance the performance of the networks. Extensive experimental results demonstrate that the proposed Expert-Assistant (E-A) network achieves the best overall performance in terms of parameter scale, computational efficiency, and recognition accuracy. Our model performs especially well at extremely small parameter scales, achieving an average accuracy near 60% and a peak accuracy near 90% with just 3.5K non-zero parameters on RML2016.10a.
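The exact temporal shuffling scheme is not specified in this abstract. As a rough illustration only, the sketch below applies a ShuffleNet-style group shuffle along the time axis of an I/Q sequence; the grouping factor `groups`, the function name, and its placement in the network are assumptions for illustration, not the authors' design:

```python
import numpy as np

def temporal_shuffle(x, groups):
    """Interleave time steps across `groups` contiguous segments.

    x: array of shape (batch, time, channels); `time` must be
    divisible by `groups`. Analogous to ShuffleNet's channel
    shuffle, but applied along the temporal axis (illustrative only).
    """
    b, t, c = x.shape
    assert t % groups == 0, "time length must be divisible by groups"
    # split the time axis into `groups` segments, then swap the
    # segment and within-segment axes to interleave their steps
    x = x.reshape(b, groups, t // groups, c)
    x = x.transpose(0, 2, 1, 3)
    return x.reshape(b, t, c)

# toy I/Q sequence: batch=1, 8 time steps, 2 channels (I and Q)
sig = np.arange(16).reshape(1, 8, 2)
out = temporal_shuffle(sig, groups=2)
# time steps are reordered as t0, t4, t1, t5, t2, t6, t3, t7
```

Such a shuffle is parameter-free, so it mixes information across time without adding to the model's parameter count, which is consistent with the abstract's emphasis on lightweight design.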
| Original language | English |
|---|---|
| Pages (from-to) | 9046-9061 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Wireless Communications |
| Volume | 25 |
| DOIs | |
| Publication status | Published - 2026 |
| Externally published | Yes |
Keywords
- Automatic modulation recognition
- deep learning
- efficient network architecture
- temporal shuffling
Title: An Expert-Assistant Network With Temporal Shuffling for Efficient Automatic Modulation Recognition