| Paper | Date | Tasks | Code | # |
| --- | --- | --- | --- | --- |
| Finetuned Language Models Are Zero-Shot Learners | Sep 3, 2021 | ARC, Common Sense Reasoning | Code Available | 3 |
| W2v-BERT: Combining Contrastive Learning and Masked Language Modeling for Self-Supervised Speech Pre-Training | Aug 7, 2021 | Contrastive Learning, Language Modeling | Code Available | 3 |
| Evaluating Large Language Models Trained on Code | Jul 7, 2021 | Code Generation, HumanEval | Code Available | 3 |
| Multi-objective Asynchronous Successive Halving | Jun 23, 2021 | Fairness, Hyperparameter Optimization | Code Available | 3 |
| GLM: General Language Model Pretraining with Autoregressive Blank Infilling | Mar 18, 2021 | Abstractive Text Summarization, Classification | Code Available | 3 |
| Prefix-Tuning: Optimizing Continuous Prompts for Generation | Jan 1, 2021 | Language Modeling | Code Available | 3 |
| PGL at TextGraphs 2020 Shared Task: Explanation Regeneration using Language and Graph Learning Methods | Dec 1, 2020 | Graph Learning, Language Modeling | Code Available | 3 |
| ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding | Oct 23, 2020 | Language Modeling | Code Available | 3 |
| Language Models are Few-Shot Learners | May 28, 2020 | Answerability Prediction, Articles | Code Available | 3 |
| Conformer: Convolution-augmented Transformer for Speech Recognition | May 16, 2020 | Automatic Speech Recognition (ASR) | Code Available | 3 |