Wavelet Knowledge Distillation: Towards Efficient Image-to-Image Translation (Mar 12, 2022) [Image-to-Image Translation, Knowledge Distillation]
WAVE: Weight Template for Adaptive Initialization of Variable-sized Models (Jun 25, 2024) [Knowledge Distillation, Transfer Learning]
Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation (Nov 1, 2019) [Classification, Cross-Lingual Transfer]
Weakly Supervised Dense Video Captioning via Jointly Usage of Knowledge Distillation and Cross-modal Matching (May 18, 2021) [Caption Generation, Cross-Modal Retrieval]
Weakly-Supervised Domain Adaptation of Deep Regression Trackers via Reinforced Knowledge Distillation (Mar 26, 2021) [Domain Adaptation, Knowledge Distillation]
Weakly-supervised HOI Detection via Prior-guided Bi-level Representation Learning (Mar 2, 2023) [Human-Object Interaction Detection, Knowledge Distillation]
Weakly Supervised Monocular 3D Detection with a Single-View Image (Feb 29, 2024) [Knowledge Distillation, Object Localization]
Weakly Supervised Semantic Segmentation via Alternative Self-Dual Teaching (Dec 17, 2021) [Knowledge Distillation, Semantic Segmentation]
Weak-to-Strong Backdoor Attack for Large Language Models (Sep 26, 2024) [Backdoor Attack, Knowledge Distillation]
Wearable Accelerometer Foundation Models for Health via Knowledge Distillation (Dec 15, 2024) [Activity Recognition, Cross-Modal Alignment]
WebChild 2.0: Fine-Grained Commonsense Knowledge Distillation (Jul 1, 2017) [Knowledge Distillation, Semantic Parsing]
Web Content Filtering through Knowledge Distillation of Large Language Models (May 8, 2023) [Knowledge Distillation]
WebUOT-1M: Advancing Deep Underwater Object Tracking with A Million-Scale Benchmark (May 30, 2024) [Knowledge Distillation, Object Tracking]
WeChat Neural Machine Translation Systems for WMT20 (Oct 1, 2020) [Knowledge Distillation, Machine Translation]
WeChat Neural Machine Translation Systems for WMT21 (Aug 5, 2021) [Knowledge Distillation, Machine Translation]
WeClick: Weakly-Supervised Video Semantic Segmentation with Click Annotations (Jul 7, 2021) [Knowledge Distillation, Model Compression]
Weight Averaging: A Simple Yet Effective Method to Overcome Catastrophic Forgetting in Automatic Speech Recognition (Oct 27, 2022) [Automatic Speech Recognition (ASR)]
Weight Decay Scheduling and Knowledge Distillation for Active Learning (Aug 1, 2020) [Active Learning, Knowledge Distillation]
Weight Distillation: Transferring the Knowledge in Neural Network Parameters (Sep 19, 2020) [Knowledge Distillation, Machine Translation]
Weighted KL-Divergence for Document Ranking Model Refinement (Jun 10, 2024) [Contrastive Learning, Document Ranking]
Weight Squeezing: Reparameterization for Compression and Fast Inference (May 30, 2020) [Knowledge Distillation, Model Compression]
Robustness Challenges in Model Distillation and Pruning for Natural Language Understanding (Oct 16, 2021) [Knowledge Distillation, Model Compression]
What do larger image classifiers memorise? (Oct 9, 2023) [Image Classification]
What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models (Apr 6, 2024) [Knowledge Distillation, Language Modeling]
What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias (Oct 10, 2024) [Age/Unbiased, Fairness]
What is Lost in Knowledge Distillation? (Nov 7, 2023) [Knowledge Distillation, Model Compression]
What Knowledge Gets Distilled in Knowledge Distillation? (May 31, 2022) [Knowledge Distillation]
What Makes a Good Dataset for Knowledge Distillation? (Nov 19, 2024) [Continual Learning, Knowledge Distillation]
When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation (Nov 16, 2021) [Data Augmentation, HellaSwag]
When Gradient Descent Meets Derivative-Free Optimization: A Match Made in Black-Box Scenario (May 17, 2023) [Knowledge Distillation]
Which Student is Best? A Comprehensive Knowledge Distillation Exam for Task-Specific BERT Models (Jan 3, 2022) [CPU, Data Augmentation]
DQ-Whisper: Joint Distillation and Quantization for Efficient Multilingual Speech Recognition (May 18, 2023) [Knowledge Distillation, Quantization]
Whole-Slide Mitosis Detection in H&E Breast Histology Using PHH3 as a Reference to Train Distilled Stain-Invariant Convolutional Networks (Aug 17, 2018) [Data Augmentation, Knowledge Distillation]
Why distillation helps: a statistical perspective (May 21, 2020) [Knowledge Distillation, Retrieval]
Why Knowledge Distillation Amplifies Gender Bias and How to Mitigate from the Perspective of DistilBERT (Jul 1, 2022) [Knowledge Distillation]
Why Knowledge Distillation Works in Generative Models: A Minimal Working Explanation (May 19, 2025) [Knowledge Distillation, Language Modeling]
Winning Big with Small Models: Knowledge Distillation vs. Self-Training for Reducing Hallucination in QA Agents (Feb 26, 2025) [Hallucination, Knowledge Distillation]
Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning (Jan 30, 2022) [Knowledge Distillation, Network Pruning]
Wired Perspectives: Multi-View Wire Art Embraces Generative AI (Nov 26, 2023) [Knowledge Distillation]
Wisdom of Committee: Distilling from Foundation Model to Specialized Application Model (Feb 21, 2024) [Knowledge Distillation, Model]
WK-Pnet: FM-Based Positioning via Wavelet Packet Decomposition and Knowledge Distillation (Apr 10, 2025) [Knowledge Distillation, Position]
Word Sense Induction with Knowledge Distillation from BERT (Apr 20, 2023) [Knowledge Distillation, Language Modeling]
X^3KD: Knowledge Distillation Across Modalities, Tasks and Stages for Multi-Camera 3D Object Detection (Mar 3, 2023) [3D Object Detection, Instance Segmentation]
X^3KD: Knowledge Distillation Across Modalities, Tasks and Stages for Multi-Camera 3D Object Detection (Jan 1, 2023) [3D Object Detection, Instance Segmentation]
XCOMPS: A Multilingual Benchmark of Conceptual Minimal Pairs (Feb 27, 2025) [Knowledge Distillation]
XD: Cross-lingual Knowledge Distillation for Polyglot Sentence Embeddings (Sep 25, 2019) [Knowledge Distillation, Language Modeling]
X-Distill: Improving Self-Supervised Monocular Depth via Cross-Task Distillation (Oct 24, 2021) [Depth Estimation, Knowledge Distillation]
Xiaomi's Submissions for IWSLT 2020 Open Domain Translation Task (Jul 1, 2020) [Domain Adaptation, Knowledge Distillation]
X Modality Assisting RGBT Object Tracking (Dec 27, 2023) [Knowledge Distillation, Object]
xVLM2Vec: Adapting LVLM-based embedding models to multilinguality using Self-Knowledge Distillation (Mar 12, 2025) [Knowledge Distillation, Language Modeling]