
Tensor-based Model Reduction and Identification for Generalized Memory Polynomial

2025-02-28

Yuchao Wang, Yimin Wei


Abstract

Power amplifiers (PAs) are essential components in wireless communication systems, and the design of their behavioral models has been an important research topic for many years. The widely used generalized memory polynomial (GMP) model suffers from rapid growth in the number of parameters as memory depth and nonlinearity order increase, which leads to a significant increase in model complexity and the risk of overfitting. In this study, we introduce tensor networks to compress the unknown coefficient tensor of the GMP model, resulting in three novel tensor-based GMP models. These models achieve performance comparable to the GMP model, but with far fewer parameters and lower complexity. For the identification of these models, we derive the alternating least-squares (ALS) method to ensure rapid updates and convergence of the model parameters in an iterative manner. In addition, we observe that the horizontal slices of the third-order data tensor constructed from the input signals are Vandermonde matrices, which have a numerically low-rank structure. Hence, we further propose the RP-ALS algorithm, which first performs a truncated higher-order singular value decomposition (HOSVD) on the data tensor to generate random projections, then runs the ALS algorithm to identify the projected models with downscaled dimensions, thus reducing the computational effort of the iterative process. The experimental results show that the proposed models outperform the full GMP model and a sparse GMP model obtained via LASSO regression in terms of parameter count and runtime complexity.
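The projection stage described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the tensor sizes, truncation ranks, and the exact Vandermonde construction below are assumptions made for illustration. It builds a third-order data tensor whose horizontal slices are Vandermonde matrices in the delayed inputs, then compresses it with a truncated HOSVD so that a subsequent ALS iteration would operate on much smaller dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Assumed sizes: N samples, memory depth M, nonlinearity order K.
N, M, K = 200, 8, 6
x = rng.standard_normal(N) * 0.5  # stand-in input signal

# Horizontal slice X[n, :, :] is a Vandermonde matrix in the delayed
# inputs x[n], x[n-1], ..., x[n-M+1] (assumed form of the data tensor).
X = np.empty((N, M, K))
for n in range(N):
    for m in range(M):
        X[n, m, :] = x[max(n - m, 0)] ** np.arange(K)

def unfold(T, mode):
    """Mode-`mode` unfolding: move that axis first, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# Truncated HOSVD: keep the leading left singular vectors of each
# mode unfolding. Truncation ranks here are arbitrary choices.
ranks = (20, M, 3)
factors = []
for mode, r in enumerate(ranks):
    U, _, _ = np.linalg.svd(unfold(X, mode), full_matrices=False)
    factors.append(U[:, :r])

# Project the data tensor onto the reduced subspaces; ALS would then
# iterate on this small core instead of the full N x M x K tensor.
core = X
for mode, U in enumerate(factors):
    core = np.moveaxis(
        np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode
    )

print(core.shape)  # downscaled dimensions compared to (N, M, K)
```

The numerically low-rank structure of the Vandermonde slices is what makes aggressive truncation in the sample mode cheap: most singular values of the mode-0 unfolding decay quickly, so the projected least-squares problems seen by ALS are far smaller than the originals.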
