| Title | Date | Category | Code |
|---|---|---|---|
| Are More LLM Calls All You Need? Towards Scaling Laws of Compound Inference Systems | Mar 4, 2024 | Language Modelling | Unverified |
| Cognition is All You Need -- The Next Layer of AI Above Large Language Models | Mar 4, 2024 | World Knowledge | Unverified |
| Not All Layers of LLMs Are Necessary During Inference | Mar 4, 2024 | In-Context Learning | Unverified |
| Federated Linear Contextual Bandits with Heterogeneous Clients | Feb 29, 2024 | Federated Learning | Unverified |
| Tree-Averaging Algorithms for Ensemble-Based Unsupervised Discontinuous Constituency Parsing | Feb 29, 2024 | Constituency Parsing | Code Available |
| Utilizing Local Hierarchy with Adversarial Training for Hierarchical Text Classification | Feb 29, 2024 | Multi-Label Classification | Code Available |
| One model to use them all: Training a segmentation model with complementary datasets | Feb 29, 2024 | Anatomy | Code Available |
| Spatial Coherence Loss: All Objects Matter in Salient and Camouflaged Object Detection | Feb 28, 2024 | Object | Unverified |
| Why Attention Graphs Are All We Need: Pioneering Hierarchical Classification of Hematologic Cell Populations with LeukoGraph | Feb 28, 2024 | Graph Attention | Code Available |
| Quantum linear algebra is all you need for Transformer architectures | Feb 26, 2024 | | Unverified |