| Little Exploration is All You Need | Oct 26, 2023 | All, Thompson Sampling | Unverified | 0 |
| Enhancing Document Information Analysis with Multi-Task Pre-training: A Robust Approach for Information Extraction in Visually-Rich Documents | Oct 25, 2023 | All, Document Classification | Unverified | 0 |
| 19 Parameters Is All You Need: Tiny Neural Networks for Particle Physics | Oct 24, 2023 | All, 3D Assembly | Code Available | 0 |
| Neural Collapse in Multi-label Learning with Pick-all-label Loss | Oct 24, 2023 | All, Multi-class Classification | Code Available | 0 |
| Is Probing All You Need? Indicator Tasks as an Alternative to Probing Embedding Spaces | Oct 24, 2023 | All | Unverified | 0 |
| "One-Size-Fits-All"? Examining Expectations around What Constitute "Fair" or "Good" NLG System Behaviors | Oct 23, 2023 | All, Fairness | Unverified | 0 |
| Learning Fair Representations with High-Confidence Guarantees | Oct 23, 2023 | All, Fairness | Code Available | 0 |
| One Model for All: Large Language Models are Domain-Agnostic Recommendation Systems | Oct 22, 2023 | All, Language Modeling | Unverified | 0 |
| One-for-All: Towards Universal Domain Translation with a Single StyleGAN | Oct 22, 2023 | All, Translation | Unverified | 0 |
| Counterfactual Prediction Under Selective Confounding | Oct 21, 2023 | All, Causal Inference | Code Available | 0 |