| Title | Date | Tasks | Code | Stars |
| --- | --- | --- | --- | --- |
| MeLT: Message-Level Transformer with Masked Document Representations as Pre-Training for Stance Detection | Sep 16, 2021 | Attribute, Language Modeling | Code Available | 0 |
| Split-and-Rephrase in a Cross-Lingual Manner: A Complete Pipeline | Sep 1, 2021 | Language Modeling | Unverified | 0 |
| Domain-Specific Japanese ELECTRA Model Using a Small Corpus | Sep 1, 2021 | Articles, Computational Efficiency | Unverified | 0 |
| Prompt-Learning for Fine-Grained Entity Typing | Aug 24, 2021 | Entity Typing, Knowledge Probing | Unverified | 0 |
| Noobs at Semeval-2021 Task 4: Masked Language Modeling for abstract answer prediction | Aug 1, 2021 | Language Modeling | Unverified | 0 |
| Fine-Grained Emotion Prediction by Modeling Emotion Definitions | Jul 26, 2021 | Language Modeling | Code Available | 0 |
| Learning to Sample Replacements for ELECTRA Pre-Training | Jun 25, 2021 | Language Modeling | Unverified | 0 |
| Winner Team Mia at TextVQA Challenge 2021: Vision-and-Language Representation Learning with Pre-trained Sequence-to-Sequence Model | Jun 24, 2021 | Decoder, Language Modeling | Unverified | 0 |
| SAS: Self-Augmentation Strategy for Language Model Pre-training | Jun 14, 2021 | Data Augmentation, Language Modeling | Code Available | 0 |
| MST: Masked Self-Supervised Transformer for Visual Representation | Jun 10, 2021 | Language Modeling | Unverified | 0 |