A Deep Learning-Aided Detection Method for FTN-Based NOMA

Jianxiong Pan, Neng Ye, Aihua Wang*, Xiangming Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

14 Citations (Scopus)

Abstract

The rapid growth of future smart city applications and the Internet of things (IoT) has raised demands on next-generation radio access technologies with respect to connection density, spectral efficiency (SE), transmission accuracy, and detection latency. Recently, faster-than-Nyquist (FTN) signaling and non-orthogonal multiple access (NOMA) have been regarded as promising technologies for achieving higher SE and massive connectivity, respectively. In this paper, we aim to exploit the joint benefits of FTN and NOMA by superimposing multiple FTN-based transmission signals on the same physical resources. Because of the complicated intra- and inter-user interference introduced by the proposed transmission scheme, conventional detection methods suffer from high computational complexity. To this end, we develop a novel sliding-window detection method that incorporates state-of-the-art deep learning (DL) technology. Data-driven offline training is first applied to derive a near-optimal receiver for FTN-based NOMA, which is then deployed online to achieve high detection accuracy as well as low latency. Monte Carlo simulation results validate that the proposed detector achieves higher detection accuracy than minimum mean squared error frequency-domain equalization (MMSE-FDE) and can even approach the performance of the maximum likelihood-based receiver with greatly reduced computational complexity, making it suitable for IoT applications in smart cities with low-latency and high-reliability requirements.
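The transmission scheme the abstract describes, pulse-shaped symbols spaced closer than the Nyquist interval and several users' streams superimposed with different powers, can be illustrated in a few lines. The NumPy sketch below is a minimal illustration only, assuming root-raised-cosine pulses, a compression factor tau < 1, and power-domain superposition; the function names (`rrc_pulse`, `ftn_noma_superpose`) and all parameter choices are hypothetical and are not taken from the paper.

```python
import numpy as np

def rrc_pulse(t, beta, T):
    """Root-raised-cosine pulse with roll-off beta (> 0) and symbol period T."""
    t = np.asarray(t, dtype=float)
    out = np.empty_like(t)
    eps = 1e-8
    at_zero = np.abs(t) < eps
    at_sing = np.abs(np.abs(t) - T / (4 * beta)) < eps   # singular points of the formula
    rest = ~(at_zero | at_sing)
    out[at_zero] = 1.0 + beta * (4.0 / np.pi - 1.0)
    out[at_sing] = (beta / np.sqrt(2)) * (
        (1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
        + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
    x = t[rest] / T
    out[rest] = (np.sin(np.pi * x * (1 - beta))
                 + 4 * beta * x * np.cos(np.pi * x * (1 + beta))) / (
                 np.pi * x * (1 - (4 * beta * x) ** 2))
    return out

def ftn_noma_superpose(symbols, tau=0.8, beta=0.3, sps=10, span=8, powers=None):
    """Superimpose K users' FTN streams.
    symbols: (K, N) array of unit-energy constellation points (e.g. BPSK/QPSK).
    tau < 1 compresses the symbol spacing to tau*T, so neighbouring pulses
    overlap (intra-user ISI); summing the K users' waveforms with different
    powers adds the inter-user interference of power-domain NOMA."""
    K, N = symbols.shape
    powers = np.full(K, 1.0 / K) if powers is None else np.asarray(powers)
    step = int(round(tau * sps))                  # samples between FTN symbol instants
    t = np.arange(-span * sps, span * sps + 1)    # truncated pulse support
    g = rrc_pulse(t, beta, T=sps)
    g /= np.linalg.norm(g)                        # unit-energy pulse
    s = np.zeros((N - 1) * step + len(g), dtype=complex)
    for k in range(K):
        for n in range(N):
            s[n * step : n * step + len(g)] += np.sqrt(powers[k]) * symbols[k, n] * g
    return s

# Example: two BPSK users, 80%/20% power split, 20% FTN compression.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(2, 64))
s = ftn_noma_superpose((2 * bits - 1).astype(complex), tau=0.8, powers=[0.8, 0.2])
```

With tau = 1 this degenerates to ordinary Nyquist-spaced NOMA; pushing tau below 1 buys spectral efficiency at the cost of the intentional ISI that the paper's detector must resolve.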
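The abstract specifies the detector only as "sliding-window" and DL-based, so the PyTorch sketch below merely illustrates the general pattern under stated assumptions: each FTN symbol instant gets a fixed-length window of received samples (real and imaginary parts stacked), and a small fully connected network, trained offline on labeled superimposed transmissions, emits one soft bit per user; at inference the window simply slides along the received stream. The class and function names (`SlidingWindowDetector`, `make_windows`), the architecture, and the window alignment are hypothetical, not the paper's design.

```python
import numpy as np
import torch
import torch.nn as nn

class SlidingWindowDetector(nn.Module):
    """Hypothetical window-to-bits mapper: one forward pass per FTN
    symbol instant, emitting one logit per superimposed user."""
    def __init__(self, window_len, num_users, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * window_len, hidden),   # real + imag parts stacked
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_users),        # logits; pair with BCEWithLogitsLoss
        )

    def forward(self, windows):                  # windows: (batch, 2*window_len)
        return self.net(windows)

def make_windows(rx, step, window_len, num_symbols):
    """Cut the received complex sample stream into one zero-padded window
    per FTN symbol instant (spaced `step` samples apart, as in the
    transmit sketch) and stack real/imag parts for the network."""
    half = window_len // 2
    padded = np.pad(rx, half)                    # zero-pad the stream edges
    win = np.stack([padded[n * step : n * step + window_len]
                    for n in range(num_symbols)])
    feats = np.concatenate([win.real, win.imag], axis=1)
    return torch.tensor(feats, dtype=torch.float32)
```

Offline, pairs of windows and the known transmitted bits (generated, for instance, with the superposition sketch above plus channel noise) would drive standard supervised training with `nn.BCEWithLogitsLoss` and `torch.optim.Adam`; online, detection is one forward pass per window, which is where a fixed-complexity learned detector can undercut the latency of trellis- or search-based maximum likelihood detection.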

Original language: English
Article number: 5684851
Journal: Wireless Communications and Mobile Computing
Volume: 2020
Publication status: Published - 2020
