| Paper | Date | Tasks | Code | Count |
| --- | --- | --- | --- | --- |
| Neural Codec Language Models are Zero-Shot Text to Speech Synthesizers | Jan 5, 2023 | In-Context Learning, Language Modeling | Code Available | 7 |
| SegGPT: Towards Segmenting Everything in Context | Jan 1, 2023 | Few-Shot Semantic Segmentation, In-Context Learning | Unverified | 0 |
| A Survey on In-context Learning | Dec 31, 2022 | In-Context Learning, Survey | Code Available | 2 |
| Demonstrate-Search-Predict: Composing retrieval and language models for knowledge-intensive NLP | Dec 28, 2022 | In-Context Learning, Language Modelling | Code Available | 7 |
| Parallel Context Windows for Large Language Models | Dec 21, 2022 | In-Context Learning, Playing the Game of 2048 | Code Available | 1 |
| Prompt-Augmented Linear Probing: Scaling beyond the Limit of Few-shot In-Context Learners | Dec 21, 2022 | In-Context Learning, Language Modeling | Unverified | 0 |
| In-context Learning Distillation: Transferring Few-shot Learning Ability of Pre-trained Language Models | Dec 20, 2022 | Few-Shot Learning, In-Context Learning | Unverified | 0 |
| Ontologically Faithful Generation of Non-Player Character Dialogues | Dec 20, 2022 | Dialogue Generation, In-Context Learning | Unverified | 0 |
| Why Can GPT Learn In-Context? Language Models Implicitly Perform Gradient Descent as Meta-Optimizers | Dec 20, 2022 | In-Context Learning, Open-Ended Question Answering | Unverified | 0 |
| Data Curation Alone Can Stabilize In-context Learning | Dec 20, 2022 | Diversity, In-Context Learning | Code Available | 1 |