Sign Language Recognition
Sign Language Recognition is a computer vision and natural language processing task: automatically recognizing sign language gestures in video and translating them into written or spoken language. The goal is to develop algorithms that understand and interpret sign language, enabling people who use it as their primary mode of communication to communicate more easily with non-signers.
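As a purely illustrative sketch of the recognition setting (not any method from the benchmarks below, which use deep video models), one can classify an isolated sign by matching its hand-keypoint trajectory against per-class templates; the nearest-template approach and all names here are assumptions for illustration only:

```python
import numpy as np

def resample(traj, n=16):
    """Linearly resample a (T, D) keypoint trajectory to n time steps,
    so clips of different lengths become comparable."""
    t = np.linspace(0, len(traj) - 1, n)
    i = np.floor(t).astype(int)
    j = np.minimum(i + 1, len(traj) - 1)
    w = (t - i)[:, None]
    return (1 - w) * traj[i] + w * traj[j]

def classify(traj, templates):
    """Return the label of the nearest template by mean L2 distance."""
    q = resample(np.asarray(traj, dtype=float))
    dists = {
        label: np.linalg.norm(q - resample(np.asarray(tpl, dtype=float)), axis=1).mean()
        for label, tpl in templates.items()
    }
    return min(dists, key=dists.get)
```

For example, a straight upward trajectory is matched to an "up" template even when the query has a different number of frames, because both are resampled to a common length first.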
(Image credit: Word-level Deep Sign Language Recognition from Video: A New Large-scale Dataset and Methods Comparison)
Papers
297 papers address this task.
Datasets

RWTH-PHOENIX-Weather 2014, RWTH-PHOENIX-Weather 2014 T, CSL-Daily, AUTSL, WLASL-2000, WLASL100, ChicagoFSWild, LSA64, Znaki, MSASL-1000, WLASL, BOBSL
Benchmark Results
RWTH-PHOENIX-Weather 2014

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SubUNets | Word Error Rate (WER) | 40.7 | — | Unverified |
| 2 | CTF-MM | Word Error Rate (WER) | 37.8 | — | Unverified |
| 3 | DTN | Word Error Rate (WER) | 36.5 | — | Unverified |
| 4 | SAN | Word Error Rate (WER) | 29.7 | — | Unverified |
| 5 | Stochastic CSLR | Word Error Rate (WER) | 25.3 | — | Unverified |
| 6 | CrossModal | Word Error Rate (WER) | 24 | — | Unverified |
| 7 | SLRGAN | Word Error Rate (WER) | 23.4 | — | Unverified |
| 8 | DNF | Word Error Rate (WER) | 22.86 | — | Unverified |
| 9 | VAC | Word Error Rate (WER) | 22.1 | — | Unverified |
| 10 | MSKA-SLR | Word Error Rate (WER) | 22.1 | — | Unverified |
RWTH-PHOENIX-Weather 2014 T

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Stochastic CSLR | Word Error Rate (WER) | 26.1 | — | Unverified |
| 2 | CrossModal | Word Error Rate (WER) | 24.3 | — | Unverified |
| 3 | SignBT | Word Error Rate (WER) | 23.9 | — | Unverified |
| 4 | MMTLB | Word Error Rate (WER) | 22.45 | — | Unverified |
| 5 | SMKD | Word Error Rate (WER) | 22.4 | — | Unverified |
| 6 | STMC | Word Error Rate (WER) | 21 | — | Unverified |
| 7 | WRNN + LET | Word Error Rate (WER) | 20.73 | — | Unverified |
| 8 | MSKA-SLR | Word Error Rate (WER) | 20.5 | — | Unverified |
| 9 | C2SLR | Word Error Rate (WER) | 20.4 | — | Unverified |
| 10 | SignBERT+ | Word Error Rate (WER) | 19.9 | — | Unverified |
CSL-Daily

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BN-TIN+Transf. | Word Error Rate (WER) | 33.1 | — | Unverified |
| 2 | C2SLR | Word Error Rate (WER) | 31 | — | Unverified |
| 3 | SEN | Word Error Rate (WER) | 30.7 | — | Unverified |
| 4 | AdaBrowse | Word Error Rate (WER) | 30.6 | — | Unverified |
| 5 | CorrNet | Word Error Rate (WER) | 30.1 | — | Unverified |
| 6 | CTCA | Word Error Rate (WER) | 29.4 | — | Unverified |
| 7 | TCNet | Word Error Rate (WER) | 29.3 | — | Unverified |
| 8 | CorrNet+ACDR | Word Error Rate (WER) | 29 | — | Unverified |
| 9 | MSKA-SLR | Word Error Rate (WER) | 27.8 | — | Unverified |
| 10 | Swin-MSTP | Word Error Rate (WER) | 27.1 | — | Unverified |
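Word Error Rate (WER), the metric in the tables above, is the word-level edit distance (substitutions + insertions + deletions) between the recognized gloss sequence and the reference, divided by the reference length. A minimal sketch; the Character Error Rate (CER) tables further down use the same distance computed over characters instead of words:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (substitutions,
    insertions, and deletions each cost 1), via dynamic programming
    with a rolling row to keep memory at O(len(hyp))."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    """Word Error Rate in percent: edit distance over reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    return 100.0 * edit_distance(ref, hyp) / len(ref)
```

For instance, `wer("the weather is cold", "the weather cold")` is 25.0: one deleted word against a four-word reference.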
AUTSL

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | STF+LSTM | Rank-1 Recognition Rate | 0.99 | — | Unverified |
| 2 | SAM-SLR (RGB-D) | Rank-1 Recognition Rate | 0.99 | — | Unverified |
| 3 | 3D-DCNN + ST-MGCN | Rank-1 Recognition Rate | 0.98 | — | Unverified |
| 4 | Ensemble - NTIS | Rank-1 Recognition Rate | 0.96 | — | Unverified |
| 5 | HWGAT | Rank-1 Recognition Rate | 0.96 | — | Unverified |
| 6 | MViT-SLR | Rank-1 Recognition Rate | 0.96 | — | Unverified |
| 7 | FE+LSTM | Rank-1 Recognition Rate | 0.93 | — | Unverified |
| 8 | VTN-PF | Rank-1 Recognition Rate | 0.93 | — | Unverified |
| 9 | CNN+FPM+BLSTM+Attention (RGB-D) | Rank-1 Recognition Rate | 0.62 | — | Unverified |
WLASL-2000

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Logos-Pretraining | Top-1 Accuracy | 66.82 | — | Unverified |
| 2 | Uni-Sign | Top-1 Accuracy | 63.52 | — | Unverified |
| 3 | NLA-SLR | Top-1 Accuracy | 61.26 | — | Unverified |
| 4 | StepNet | Top-1 Accuracy | 61.17 | — | Unverified |
| 5 | SAM-SLR | Top-1 Accuracy | 58.73 | — | Unverified |
| 6 | SWIN-SLR | Top-1 Accuracy | 58.51 | — | Unverified |
| 7 | HWGAT | Top-1 Accuracy | 48.49 | — | Unverified |
| 8 | I3D (pretraining: BSL-1K) | Top-1 Accuracy | 46.82 | — | Unverified |
| 9 | I3D | Top-1 Accuracy | 32.48 | — | Unverified |
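Top-1 Accuracy (and, up to the 0–1 vs. percent scaling, the Rank-1 Recognition Rate above) is the fraction of test clips whose highest-scoring class is the true sign. A sketch over a model's class-score matrix, generalized to top-k:

```python
import numpy as np

def topk_accuracy(scores, labels, k=1):
    """Percent of samples whose true label is among the k highest-scoring
    classes. scores: (N, C) array of class scores; labels: N class ids."""
    topk = np.argsort(scores, axis=1)[:, ::-1][:, :k]   # k best classes per row
    hits = (topk == np.asarray(labels)[:, None]).any(axis=1)
    return 100.0 * hits.mean()
```

With `k=1` this is the Top-1 Accuracy of the table above; leaderboards for large-vocabulary datasets often also report `k=5` or `k=10`, since many signs have near-identical manual forms.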
ChicagoFSWild

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | HandReader_RGB | CER (%) | 30.7 | — | Unverified |
| 2 | HandReader_KP | CER (%) | 28 | — | Unverified |
| 3 | HandReader_RGB | CER (%) | 27.6 | — | Unverified |
| 4 | HandReader_RGB_KP | CER (%) | 27.1 | — | Unverified |
| 5 | HandReader_KP | CER (%) | 26.2 | — | Unverified |
| 6 | HandReader_RGB+KP | CER (%) | 24.4 | — | Unverified |
Znaki

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | HandReader_RGB | CER (%) | 7.61 | — | Unverified |
| 2 | HandReader_KP | CER (%) | 7.35 | — | Unverified |
| 3 | HandReader_RGB_KP | CER (%) | 5.06 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | StepNet | Actions Top-1 | 77.1 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MobileNetV2_TSM | Accuracy (Top-1) | 83.6 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | HWGAT | Top-1 Accuracy | 93.86 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 3D-DCNN + ST-MGCN | Rank-1 Recognition Rate | 0.98 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Skeleton Image Representation | Accuracy | 82 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Skeleton Image Representation | Accuracy | 93 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | mVITv2-S | Mean Accuracy | 64.09 | — | Unverified |
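Mean Accuracy in the last table is presumably macro-averaged per-class accuracy (an assumption; the table itself does not define it), which weighs rare signs as much as frequent ones rather than letting head classes dominate:

```python
import numpy as np

def mean_class_accuracy(preds, labels):
    """Macro-averaged accuracy in percent: per-class accuracy (recall)
    computed separately for each class, then averaged over classes."""
    preds, labels = np.asarray(preds), np.asarray(labels)
    per_class = [(preds[labels == c] == c).mean() for c in np.unique(labels)]
    return 100.0 * float(np.mean(per_class))
```

On an imbalanced test set this can differ sharply from plain Top-1 Accuracy: getting 2 of 3 samples of class 0 right and the single sample of class 1 right gives 75% Top-1 but about 83.3% mean accuracy.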