What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias (Oct 10, 2024) [Age/Unbiased Fairness]
What is Lost in Knowledge Distillation? (Nov 7, 2023) [Knowledge Distillation, Model Compression]
What Knowledge Gets Distilled in Knowledge Distillation? (May 31, 2022) [Knowledge Distillation]
What Makes a Good Dataset for Knowledge Distillation? (Nov 19, 2024) [Continual Learning, Knowledge Distillation]
When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation (Nov 16, 2021) [Data Augmentation, HellaSwag]
When Gradient Descent Meets Derivative-Free Optimization: A Match Made in Black-Box Scenario (May 17, 2023) [Knowledge Distillation]
Which Student is Best? A Comprehensive Knowledge Distillation Exam for Task-Specific BERT Models (Jan 3, 2022) [CPU, Data Augmentation]
DQ-Whisper: Joint Distillation and Quantization for Efficient Multilingual Speech Recognition (May 18, 2023) [Knowledge Distillation, Quantization]
Whole-Slide Mitosis Detection in H&E Breast Histology Using PHH3 as a Reference to Train Distilled Stain-Invariant Convolutional Networks (Aug 17, 2018) [Data Augmentation, Knowledge Distillation]
Why distillation helps: a statistical perspective (May 21, 2020) [Knowledge Distillation, Retrieval]
Why Knowledge Distillation Amplifies Gender Bias and How to Mitigate from the Perspective of DistilBERT (Jul 1, 2022) [Knowledge Distillation]
Why Knowledge Distillation Works in Generative Models: A Minimal Working Explanation (May 19, 2025) [Knowledge Distillation, Language Modeling]
Winning Big with Small Models: Knowledge Distillation vs. Self-Training for Reducing Hallucination in QA Agents (Feb 26, 2025) [Hallucination, Knowledge Distillation]
Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning (Jan 30, 2022) [Knowledge Distillation, Network Pruning]
Wired Perspectives: Multi-View Wire Art Embraces Generative AI (Nov 26, 2023) [Knowledge Distillation]
Wisdom of Committee: Distilling from Foundation Model to Specialized Application Model (Feb 21, 2024) [Knowledge Distillation, Model]
WK-Pnet: FM-Based Positioning via Wavelet Packet Decomposition and Knowledge Distillation (Apr 10, 2025) [Knowledge Distillation, Position]
Word Sense Induction with Knowledge Distillation from BERT (Apr 20, 2023) [Knowledge Distillation, Language Modeling]
X^3KD: Knowledge Distillation Across Modalities, Tasks and Stages for Multi-Camera 3D Object Detection (Mar 3, 2023) [3D Object Detection, Instance Segmentation]
XCOMPS: A Multilingual Benchmark of Conceptual Minimal Pairs (Feb 27, 2025) [Knowledge Distillation]
XD: Cross-lingual Knowledge Distillation for Polyglot Sentence Embeddings (Sep 25, 2019) [Knowledge Distillation, Language Modeling]
X-Distill: Improving Self-Supervised Monocular Depth via Cross-Task Distillation (Oct 24, 2021) [Depth Estimation, Knowledge Distillation]
Xiaomi's Submissions for IWSLT 2020 Open Domain Translation Task (Jul 1, 2020) [Domain Adaptation, Knowledge Distillation]
X Modality Assisting RGBT Object Tracking (Dec 27, 2023) [Knowledge Distillation, Object]
xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation (Mar 12, 2025) [Knowledge Distillation, Language Modeling]
Yield Evaluation of Citrus Fruits based on the YoloV5 compressed by Knowledge Distillation (Nov 16, 2022) [Knowledge Distillation]
YOLO in the Dark - Domain Adaptation Method for Merging Multiple Models - (Aug 1, 2020) [Domain Adaptation, Knowledge Distillation]
You Can Have Your Data and Balance It Too: Towards Balanced and Efficient Multilingual Models (Oct 13, 2022) [Cross-Lingual Transfer, Knowledge Distillation]
You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement (Jan 1, 2023) [Contrastive Learning, Image Enhancement]
Zero shot framework for satellite image restoration (Jun 5, 2023) [Disentanglement, Image Restoration]
Zero-shot Slot Filling in the Age of LLMs for Dialogue Systems (Nov 28, 2024) [Knowledge Distillation, Natural Language Understanding]
Zoom-shot: Fast and Efficient Unsupervised Zero-Shot Transfer of CLIP to Vision Encoders with Multimodal Loss (Jan 22, 2024) [Knowledge Distillation, Zero-Shot Classification]
Deep Face Recognition Model Compression via Knowledge Transfer and Distillation (Jun 3, 2019) [Face Recognition, Knowledge Distillation]
1st Place Solution to the EPIC-Kitchens Action Anticipation Challenge 2022 (Jul 10, 2022) [Action Anticipation, Knowledge Distillation]
AdvFunMatch: When Consistent Teaching Meets Adversarial Robustness (May 24, 2023) [Adversarial Robustness, Knowledge Distillation]
VIPeR: Visual Incremental Place Recognition with Adaptive Mining and Continual Learning (Jul 31, 2024) [Continual Learning, Knowledge Distillation]
Learning Effective Representations for Retrieval Using Self-Distillation with Adaptive Relevance Margins (Jul 31, 2024) [Knowledge Distillation, Language Modeling]
Dynamic Object Queries for Transformer-based Incremental Object Detection (Jul 31, 2024) [Knowledge Distillation, Object]
Gemma 2: Improving Open Language Models at a Practical Size (Jul 31, 2024) [Knowledge Distillation]
StyleRF-VolVis: Style Transfer of Neural Radiance Fields for Expressive Volume Visualization (Jul 31, 2024) [Knowledge Distillation, NeRF]
Sentence-wise Speech Summarization: Task, Datasets, and End-to-End Modeling with LM Knowledge Distillation (Aug 1, 2024) [Automatic Speech Recognition (ASR)]
DistillGrasp: Integrating Features Correlation with Knowledge Distillation for Depth Completion of Transparent Objects (Aug 1, 2024) [Depth Completion, Feature Correlation]
An approach to optimize inference of the DIART speaker diarization pipeline (Aug 5, 2024) [Inference Optimization, Knowledge Distillation]
VizECGNet: Visual ECG Image Network for Cardiovascular Diseases Classification with Multi-Modal Training and Knowledge Distillation (Aug 6, 2024) [ECG Classification, Knowledge Distillation]
Inference Optimizations for Large Language Models: Effects, Challenges, and Practical Considerations (Aug 6, 2024) [Knowledge Distillation, Navigate]
On Importance of Pruning and Distillation for Efficient Low Resource NLP (Sep 21, 2024) [Document Classification, GPU]
ATLAS: Autoformalizing Theorems through Lifting, Augmentation, and Synthesis of Data (Feb 8, 2025) [Knowledge Distillation]
VRM: Knowledge Distillation via Virtual Relation Matching (Feb 28, 2025) [Knowledge Distillation, Relation]
Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging (May 15, 2025) [Continual Learning, Diagnostic]
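For orientation, nearly every entry above is tagged Knowledge Distillation, and the common starting point for these papers is the soft-target objective of Hinton et al. (2015). The sketch below is a minimal, generic PyTorch rendering of that objective, not the method of any paper listed here; the function name and the temperature and alpha defaults are illustrative choices.

```python
# Minimal sketch of the classic soft-target distillation loss
# (Hinton et al., 2015). All names and defaults are illustrative,
# not taken from any specific paper in the list above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with softened teacher matching."""
    # Soft targets: KL divergence between temperature-scaled distributions;
    # the T^2 factor rescales gradients to match the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    s = torch.randn(8, 10)          # student logits (batch=8, classes=10)
    t = torch.randn(8, 10)          # teacher logits
    y = torch.randint(0, 10, (8,))  # ground-truth class labels
    print(distillation_loss(s, t, y))
```

Most variants catalogued above keep this hard/soft blend and change what is matched (features, relations, modalities) rather than the blending itself.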
— Unverified 00