1-bit Precoding Algorithm for Massive MIMO OFDM Downlink Systems with Deep Learning
The base station of a massive Multiple-Input Multiple-Output (MIMO) system is equipped with hundreds of antennas, which enhances the spectral efficiency of the system but also increases system cost. To address this problem, our research group previously proposed a Convergence-Guaranteed Multi-Carrier one-bit precoding (CG-MC1bit) iterative algorithm for Orthogonal Frequency-Division Multiplexing (OFDM) downlink massive MIMO systems, which achieves superior system performance. However, its computational complexity is high, hindering the practical application of the algorithm in real-time systems. To address this issue, we propose a model-driven unfolding neural network based on the CG-MC1bit iterative algorithm: each iteration is unfolded into a network layer, and trainable parameters are introduced to replace the high-complexity operations in forward propagation. Simulation results show that the proposed network can update its trainable parameters automatically and, compared with traditional precoding algorithms, achieves a higher convergence speed and lower computational complexity.
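To make the unfolding idea concrete, the following is a minimal, hypothetical PyTorch sketch; it is not the actual CG-MC1bit network. It unrolls a generic gradient-style iteration for a simplified, real-valued objective min_x ||Hx - s||^2 into layers, each carrying a trainable step size, and relaxes the 1-bit constraint with tanh during training. All names (`UnfoldedLayer`, `UnfoldedPrecoder`, `alpha`) and the objective itself are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class UnfoldedLayer(nn.Module):
    """One unrolled iteration of a generic gradient-style precoding update.

    The step size alpha, which a classical iterative algorithm would have to
    compute at every iteration, is a trainable parameter here -- the core idea
    of replacing high-complexity operations with learned ones.
    """
    def __init__(self):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(0.1))  # learned step size

    def forward(self, x, H, s):
        grad = H.T @ (H @ x - s)    # gradient of ||Hx - s||^2 (up to a factor)
        x = x - self.alpha * grad   # learned gradient step
        return torch.tanh(x)        # smooth surrogate for the 1-bit constraint sign(x)

class UnfoldedPrecoder(nn.Module):
    """Stack of L identical layers = L unrolled iterations of the base algorithm."""
    def __init__(self, num_layers: int = 8):
        super().__init__()
        self.layers = nn.ModuleList([UnfoldedLayer() for _ in range(num_layers)])

    def forward(self, H, s):
        x = torch.zeros(H.shape[1], 1)  # initial precoded vector
        for layer in self.layers:
            x = layer(x, H, s)
        return x  # soft output in (-1, 1); apply torch.sign() at inference

# Toy usage: 16 base-station antennas serving 4 single-antenna users.
H = torch.randn(4, 16)                 # real-valued toy channel matrix
s = torch.randn(4, 1)                  # desired receive symbols
model = UnfoldedPrecoder(num_layers=8)
x_soft = model(H, s)                   # trained end-to-end, e.g. on ||H x - s||^2
x_1bit = torch.sign(x_soft.detach())   # hard 1-bit transmit signal
```

The tanh relaxation is a common design choice in such sketches because the hard sign() operation has zero gradient almost everywhere and would block end-to-end training; the hard quantization is applied only at inference.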