Abstract
TeraHertz wireless communication has been regarded as an effective technology to satisfy the ever-increasing demand for high-rate services, where the imperfections of TeraHertz devices, including in-phase/quadrature imbalance, phase noise and nonlinearity, collectively regarded as hybrid distortions, have to be investigated. Owing to these hybrid distortions, an explicit system model cannot be derived mathematically, which limits the performance of state-of-the-art demodulation methods. Here, a deep learning-assisted demodulation methodology is proposed to improve the bit error rate performance. Specifically, a multiple-output deep feedforward neural network is designed to fit the mapping between the received signal and the likelihood information, where the number of outputs equals the size of the modulation set, thus enabling demodulation of overlapped received signals corresponding to different constellation points. In addition, a training set generation method is proposed to produce training examples without prior knowledge of the likelihood information. Simulation results validate that the proposed learning-assisted methodology improves the demodulation performance of TeraHertz wireless systems under severe hybrid distortions.
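As a rough illustration of the idea summarized above (not the authors' exact architecture or training procedure), the following PyTorch-style sketch shows a multiple-output feedforward network whose output dimension equals the modulation order, producing per-constellation likelihood scores from a received I/Q sample. The modulation order, layer widths, and the use of cross-entropy training on transmitted symbol indices are assumptions introduced here for illustration; training on symbol labels rather than analytic likelihoods is one plausible way to build a training set when the likelihood information is not available in closed form.

```python
# Hedged sketch, assuming an M-ary modulation and a small fully connected
# network; layer sizes and M are illustrative, not taken from the paper.
import torch
import torch.nn as nn

M = 16  # assumed modulation order (e.g., 16-QAM)

class SoftDemodulatorNet(nn.Module):
    def __init__(self, modulation_order: int = M, hidden: int = 64):
        super().__init__()
        # Input: real and imaginary parts of one received sample.
        # Output: one score per constellation point (M outputs).
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, modulation_order),
        )

    def forward(self, rx_iq: torch.Tensor) -> torch.Tensor:
        # Log-probabilities over constellation points, serving as the
        # likelihood information used for (soft) demodulation.
        return torch.log_softmax(self.net(rx_iq), dim=-1)

# Training on (received sample, transmitted symbol index) pairs:
# cross-entropy with symbol-index targets avoids needing an explicit
# likelihood model of the hybrid distortions.
model = SoftDemodulatorNet()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

rx = torch.randn(256, 2)              # placeholder received I/Q samples
tx_idx = torch.randint(0, M, (256,))  # placeholder transmitted symbol indices
logits = model.net(rx)                # raw scores; CrossEntropyLoss applies softmax
loss = loss_fn(logits, tx_idx)
loss.backward()
optim.step()
```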
| Original language | English |
| --- | --- |
| Pages (from-to) | 325-329 |
| Number of pages | 5 |
| Journal | IEEE Communications Letters |
| Volume | 26 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 1 Feb 2022 |
| Externally published | Yes |
Keywords
- TeraHertz wireless communication
- deep feedforward neural network
- hybrid distortion
- signal demodulation