When Counting Meets HMER: Counting-Aware Network for Handwritten Mathematical Expression Recognition

2022-07-23 · Code Available

Bohan Li, Ye Yuan, Dingkang Liang, Xiao Liu, Zhilong Ji, Jinfeng Bai, Wenyu Liu, Xiang Bai


Abstract

Recently, most handwritten mathematical expression recognition (HMER) methods have adopted encoder-decoder networks, which directly predict markup sequences from formula images with an attention mechanism. However, such methods may fail to accurately read formulas with complicated structures or to generate long markup sequences, as the attention results are often inaccurate due to the large variance of writing styles and spatial layouts. To alleviate this problem, we propose an unconventional network for HMER named Counting-Aware Network (CAN), which jointly optimizes two tasks: HMER and symbol counting. Specifically, we design a weakly-supervised counting module that can predict the number of occurrences of each symbol class without symbol-level position annotations, and then plug it into a typical attention-based encoder-decoder model for HMER. Experiments on the benchmark datasets for HMER validate that both the joint optimization and the counting results are beneficial for correcting the prediction errors of encoder-decoder models, and that CAN consistently outperforms state-of-the-art methods. In particular, compared with an encoder-decoder model for HMER, the extra time cost caused by the proposed counting module is marginal. The source code is available at https://github.com/LBH1024/CAN.
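The core idea of the counting module can be sketched in a few lines: project the encoder's feature map to one channel per symbol class, squash to a pseudo density map, and sum spatially to get per-class counts; supervision then only needs the class counts derivable from the LaTeX label, not symbol positions. The following is a minimal NumPy illustration under assumed shapes, not the authors' implementation (the module in the paper is more elaborate, and the smooth-L1 counting loss here is an assumption):

```python
import numpy as np

def counting_head(features, proj):
    """Toy counting head: 1x1 projection + sigmoid + global sum-pooling.

    features: (H, W, D) encoder feature map (assumed shape for illustration)
    proj:     (D, C) projection weights, C = number of symbol classes
    Returns a length-C vector of predicted symbol counts.
    """
    logits = features @ proj                  # (H, W, C) per-pixel class scores
    density = 1.0 / (1.0 + np.exp(-logits))   # pseudo density map in [0, 1]
    return density.sum(axis=(0, 1))           # spatial sum -> count per class

def counting_loss(pred_counts, gt_counts):
    """Smooth-L1 between predicted and ground-truth per-class counts.
    Weakly supervised: gt_counts comes from the LaTeX markup alone."""
    d = np.abs(pred_counts - gt_counts)
    return np.where(d < 1.0, 0.5 * d * d, d - 0.5).mean()
```

In training, this counting loss would be added to the usual sequence cross-entropy so the encoder learns count-aware features, and the predicted counts can additionally be fed to the decoder as a global prior.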

Benchmark Results

| Dataset     | Model    | Metric  | Claimed | Verified | Status     |
|-------------|----------|---------|---------|----------|------------|
| CROHME 2014 | CAN-DWAP | ExpRate | 57      |          | Unverified |
| CROHME 2014 | CAN-ABM  | ExpRate | 57.26   |          | Unverified |
| CROHME 2016 | CAN-ABM  | ExpRate | 56.15   |          | Unverified |
| CROHME 2016 | CAN-DWAP | ExpRate | 56.06   |          | Unverified |
| CROHME 2019 | CAN-ABM  | ExpRate | 55.96   |          | Unverified |
| CROHME 2019 | CAN-DWAP | ExpRate | 54.88   |          | Unverified |
| HME100K     | CAN-DWAP | ExpRate | 67.31   |          | Unverified |
| HME100K     | CAN-ABM  | ExpRate | 68.09   |          | Unverified |