PCT: Point cloud transformer

2020-12-17

Meng-Hao Guo, Jun-Xiong Cai, Zheng-Ning Liu, Tai-Jiang Mu, Ralph R. Martin, Shi-Min Hu


Abstract

The irregular domain and lack of ordering make it challenging to design deep neural networks for point cloud processing. This paper presents a novel framework named Point Cloud Transformer (PCT) for point cloud learning. PCT is based on the Transformer, which has achieved great success in natural language processing and shows great potential in image processing. It is inherently permutation invariant when processing a sequence of points, making it well-suited for point cloud learning. To better capture local context within the point cloud, we enhance input embedding with the support of farthest point sampling and nearest neighbor search. Extensive experiments demonstrate that PCT achieves state-of-the-art performance on shape classification, part segmentation and normal estimation tasks.
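The abstract's input-embedding step combines farthest point sampling (to pick well-spread center points) with nearest-neighbor search (to gather each center's local neighborhood). A minimal NumPy sketch of those two building blocks is below; function names, shapes, and the starting-point choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def farthest_point_sampling(points, n_samples):
    """Greedily pick points: each new pick is the point farthest
    from everything already chosen (illustrative sketch)."""
    n = points.shape[0]
    chosen = np.zeros(n_samples, dtype=int)
    min_dist = np.full(n, np.inf)  # distance to nearest chosen point
    chosen[0] = 0  # arbitrary starting point (an assumption here)
    for i in range(1, n_samples):
        d = np.linalg.norm(points - points[chosen[i - 1]], axis=1)
        min_dist = np.minimum(min_dist, d)
        chosen[i] = int(np.argmax(min_dist))
    return points[chosen]

def knn_group(points, centers, k):
    """For each sampled center, gather its k nearest neighbors
    to form a local patch of shape (n_centers, k, 3)."""
    d = np.linalg.norm(centers[:, None, :] - points[None, :, :], axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return points[idx]
```

In a PCT-style embedding, features of each local patch would then be aggregated (e.g. via shared MLPs and max-pooling) to give the sampled points neighborhood-aware features before the attention layers.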

Benchmark Results

Dataset      | Model                   | Metric            | Claimed | Verified | Status
IntrA        | PCT                     | F1 score (5-fold) | 0.91    | —        | Unverified
ModelNet40   | Point Cloud Transformer | Overall Accuracy  | 93.2    | —        | Unverified
ModelNet40-C | PCT                     | Error Rate        | 0.26    | —        | Unverified