Trainable Fractional Fourier Transform
Emirhan Koç, Tuna Alikaşifoğlu, Arda Can Aras, Aykut Koç
Code
- github.com/tunakasif/torch-frft (official, PyTorch)
- github.com/koc-lab/TrainableFrFT (PyTorch)
Abstract
Recently, the fractional Fourier transform (FrFT) has been integrated into distinct deep neural network (DNN) models such as transformers, sequence models, and convolutional neural networks (CNNs). In these studies, the fraction order is treated as a hyperparameter and tuned manually to find suitable values. Taking these studies one step further, we extend the scope of the FrFT and introduce it as a trainable layer in various neural network architectures, where the fraction order is learned during training along with the network weights. First, we show mathematically that the fraction order can be updated through backpropagation during network training. To support this formulation, we conduct extensive experiments on image classification and time series prediction tasks over benchmark datasets. Our results show that trainable FrFT layers alleviate the need to search for suitable fraction orders and improve performance over time-domain and Fourier-domain approaches.
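The core idea of the abstract, making the fraction order a learnable parameter updated by backpropagation, can be sketched in a few lines of PyTorch. The sketch below is not the authors' implementation (that lives in the linked repositories); it is a minimal illustration using a Weyl-type fractionalization of the unitary DFT matrix F, which exploits F⁴ = I so that Fᵃ is a cubic polynomial in F with order-dependent complex coefficients. The class name `FrFTLayer` and the initializer `a_init` are names invented here for the example.

```python
import torch


def dft_matrix(n: int) -> torch.Tensor:
    """Unitary n-point DFT matrix."""
    k = torch.arange(n, dtype=torch.float32)
    return torch.exp(-2j * torch.pi * torch.outer(k, k) / n) / n**0.5


class FrFTLayer(torch.nn.Module):
    """Fractional DFT layer with a trainable fraction order `a`.

    Since the unitary DFT F satisfies F^4 = I, F^a can be written as
    sum_n c_n(a) F^n for n = 0..3, with
    c_n(a) = (1/4) * sum_m exp(-i*pi*m*a/2) * exp(i*pi*m*n/2),
    which is differentiable in `a`, so autograd can update the order.
    """

    def __init__(self, n: int, a_init: float = 1.0):
        super().__init__()
        # The fraction order is a trainable parameter, not a hyperparameter.
        self.a = torch.nn.Parameter(torch.tensor(a_init))
        F = dft_matrix(n)
        powers = torch.stack([torch.matrix_power(F, p) for p in range(4)])
        self.register_buffer("powers", powers)  # I, F, F^2, F^3

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        m = torch.arange(4, dtype=torch.float32)
        n_ = torch.arange(4, dtype=torch.float32)
        # Order-dependent mixing coefficients c_n(a); .mean(0) supplies the 1/4.
        phase = torch.exp(-1j * torch.pi * m * self.a / 2)
        coeff = (phase[:, None] * torch.exp(1j * torch.pi * torch.outer(m, n_) / 2)).mean(0)
        Fa = (coeff[:, None, None] * self.powers).sum(0)
        return x.to(Fa.dtype) @ Fa.T  # apply F^a along the last axis
```

At a = 1 the layer reduces to the ordinary (unitary) DFT and at a = 0 to the identity, while non-integer orders interpolate between them; calling `loss.backward()` on any real-valued loss produces a gradient for `layer.a`, which is the trainability property the abstract claims.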