
GFT: Graph Feature Tuning for Efficient Point Cloud Analysis

2025-12-01

Manish Dhakal, Venkat R. Dasari, Rajshekhar Sunderraman, Yi Ding

Abstract

Parameter-efficient fine-tuning (PEFT) significantly reduces computational and memory costs by updating only a small subset of a model's parameters, enabling faster adaptation to new tasks with minimal loss in performance. Previous studies have introduced PEFTs tailored to point cloud data, as general-purpose approaches are suboptimal. To further reduce the number of trainable parameters, we propose a point-cloud-specific PEFT, termed Graph Feature Tuning (GFT), which learns a dynamic graph from the initial tokenized inputs of the transformer using a lightweight graph convolution network and passes these graph features to deeper layers via skip connections and efficient cross-attention modules. Extensive experiments on object classification and segmentation tasks show that GFT rivals existing methods while reducing the number of trainable parameters. Code is available at https://github.com/manishdhakal/GFT.
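The abstract describes three ingredients: a dynamic k-NN graph built from the transformer's tokenized inputs, a lightweight graph convolution over that graph, and a cross-attention module that injects the resulting graph features into deeper layers with a skip connection. The sketch below illustrates this pipeline in plain NumPy under stated assumptions; the function names, weight shapes, and the EdgeConv-style aggregation are hypothetical simplifications, not the paper's actual implementation (see the linked repository for that).

```python
import numpy as np

def knn_graph(x, k):
    # x: (N, D) token features; returns (N, k) indices of each token's
    # k nearest neighbors in feature space ("dynamic" graph: it depends
    # on the current features, not on fixed point coordinates).
    d = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)          # exclude self-loops
    return np.argsort(d, axis=1)[:, :k]

def edge_conv(x, idx, w):
    # EdgeConv-style graph convolution (assumption: the paper's
    # "lightweight graph convolution" is of this general family):
    # per edge (i, j), embed [x_i, x_j - x_i], then max-pool over neighbors.
    n, k = idx.shape
    xi = np.repeat(x[:, None, :], k, axis=1)       # (N, k, D)
    xj = x[idx]                                    # (N, k, D)
    e = np.concatenate([xi, xj - xi], axis=-1)     # (N, k, 2D)
    h = np.maximum(e @ w, 0.0)                     # shared linear + ReLU
    return h.max(axis=1)                           # (N, D_out)

def cross_attend(tokens, graph_feats, wq, wk, wv):
    # Deeper-layer tokens query the graph features; the residual add
    # plays the role of the skip connection mentioned in the abstract.
    q = tokens @ wq
    k = graph_feats @ wk
    a = q @ k.T / np.sqrt(q.shape[-1])
    a = np.exp(a - a.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)             # row-wise softmax
    return tokens + a @ (graph_feats @ wv)         # skip connection

# Toy usage: 8 tokens of dimension 4, a 3-NN graph, 6-dim graph features.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
idx = knn_graph(x, k=3)
g = edge_conv(x, idx, rng.normal(size=(8, 6)))     # (N, 6) graph features
out = cross_attend(x, g,
                   rng.normal(size=(4, 4)),        # wq
                   rng.normal(size=(6, 4)),        # wk
                   rng.normal(size=(6, 4)))        # wv
```

In a PEFT setting, only the small weights of `edge_conv` and `cross_attend` would be trained while the backbone transformer stays frozen, which is where the parameter savings come from.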