- Understanding and Improving Knowledge Distillation (Feb 10, 2020): Knowledge Distillation, Model Compression
- Understanding and Improving Lexical Choice in Non-Autoregressive Translation (Dec 29, 2020): Knowledge Distillation, Translation
- Understanding Knowledge Distillation (Jan 1, 2021): Knowledge Distillation
- Understanding Knowledge Distillation in Non-autoregressive Machine Translation (Nov 7, 2019): Knowledge Distillation, Machine Translation
- Understanding the Effect of Data Augmentation on Knowledge Distillation (May 21, 2023): Data Augmentation, Knowledge Distillation
- Understanding the Gains from Repeated Self-Distillation (Jul 5, 2024): Knowledge Distillation, Regression
- Understanding the Overfitting of the Episodic Meta-training (Jun 29, 2023): Knowledge Distillation
- Understanding the Success of Knowledge Distillation -- A Data Augmentation Perspective (Sep 29, 2021): Active Learning, Data Augmentation
- UNDO: Understanding Distillation as Optimization (Apr 3, 2025): Knowledge Distillation
- UniCompress: Enhancing Multi-Data Medical Image Compression with Knowledge Distillation (May 27, 2024): Image Compression, Knowledge Distillation
- UNIDEAL: Curriculum Knowledge Distillation Federated Learning (Sep 16, 2023): Federated Learning, Knowledge Distillation
- Unified and Effective Ensemble Knowledge Distillation (Apr 1, 2022): Knowledge Distillation, Transfer Learning
- Unified Anomaly Detection methods on Edge Device using Knowledge Distillation and Quantization (Jul 3, 2024): Anomaly Detection, CPU
- Unified Attacks to Large Language Model Watermarks: Spoofing and Scrubbing in Unauthorized Knowledge Distillation (Apr 24, 2025): Knowledge Distillation, Language Modeling
- Unified Locomotion Transformer with Simultaneous Sim-to-Real Transfer for Quadrupeds (Mar 12, 2025): Deep Reinforcement Learning, Knowledge Distillation
- UniKD: Universal Knowledge Distillation for Mimicking Homogeneous or Heterogeneous Object Detectors (Jan 1, 2023): Knowledge Distillation
- Unimodal-driven Distillation in Multimodal Emotion Recognition with Dynamic Fusion (Mar 31, 2025): Emotion Recognition, Knowledge Distillation
- UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation (Sep 13, 2021): Abstractive Text Summarization, Decoder
- Uni-Retriever: Towards Learning The Unified Embedding Based Retriever in Bing Sponsored Search (Feb 13, 2022): Contrastive Learning, Knowledge Distillation
- Dual-mode ASR: Unify and Improve Streaming ASR with Full-context Modeling (Oct 12, 2020): Automatic Speech Recognition (ASR)
- Universal-KD: Attention-based Output-Grounded Intermediate Layer Knowledge Distillation (Nov 1, 2021): Knowledge Distillation
- Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer (Feb 9, 2020): General Classification, Knowledge Distillation
- Unlearning Clients, Features and Samples in Vertical Federated Learning (Jan 23, 2025): Federated Learning, Inference Attack
- Unlearning via Sparse Representations (Nov 26, 2023): Knowledge Distillation
- Unleashing the Potential of Mamba: Boosting a LiDAR 3D Sparse Detector by Using Cross-Model Knowledge Distillation (Sep 17, 2024): 3D Object Detection, Autonomous Driving
- Unlimited Knowledge Distillation for Action Recognition in the Dark (Aug 18, 2023): Action Recognition, GPU
- Unlocking Real-Time Fluorescence Lifetime Imaging: Multi-Pixel Parallelism for FPGA-Accelerated Processing (Oct 9, 2024): Knowledge Distillation, Scheduling
- Unlock the Power: Competitive Distillation for Multi-Modal Large Language Models (Nov 14, 2023): Knowledge Distillation, Transfer Learning
- Unpaired Learning for Deep Image Deraining With Rain Direction Regularizer (Jan 1, 2021): Knowledge Distillation, Rain Removal
- Unraveling Key Factors of Knowledge Distillation (Dec 14, 2023): Knowledge Distillation, Machine Translation
- Unseen Object Instance Segmentation with Fully Test-time RGB-D Embeddings Adaptation (Apr 21, 2022): Instance Segmentation, Knowledge Distillation
- Unsupervised 3D Perception with 2D Vision-Language Distillation for Autonomous Driving (Sep 25, 2023): Autonomous Driving, Knowledge Distillation
- Unsupervised Deep Digital Staining For Microscopic Cell Images Via Knowledge Distillation (Mar 3, 2023): Colorization, Knowledge Distillation
- Unsupervised Domain Adaptation for Segmentation with Black-box Source Model (Aug 16, 2022): Domain Adaptation, Knowledge Distillation
- Unsupervised Learning of Neural Networks to Explain Neural Networks (extended abstract) (Jan 21, 2019): Knowledge Distillation, Object
- Unsupervised Representation Transfer for Small Networks: I Believe I Can Distill On-the-Fly (Dec 1, 2021): Knowledge Distillation, Linear Evaluation
- Unsupervised Text Style Transfer via LLMs and Attention Masking with Multi-way Interactions (Feb 21, 2024): In-Context Learning, Knowledge Distillation
- Unveiling Context-Aware Criteria in Self-Assessing LLMs (Oct 28, 2024): Knowledge Distillation
- Unveiling Incomplete Modality Brain Tumor Segmentation: Leveraging Masked Predicted Auto-Encoder and Divergence Learning (Jun 12, 2024): Brain Tumor Segmentation, Knowledge Distillation
- Unveiling the Unseen Potential of Graph Learning through MLPs: Effective Graph Learners Using Propagation-Embracing MLPs (Nov 20, 2023): Graph Learning, Graph Neural Network
- Uplifting Range-View-based 3D Semantic Segmentation in Real-Time with Multi-Sensor Fusion (Jul 12, 2024): 3D Semantic Segmentation, Autonomous Driving
- Using Advanced LLMs to Enhance Smaller LLMs: An Interpretable Knowledge Distillation Approach (Aug 13, 2024): Knowledge Distillation
- Using a GAN to Generate Adversarial Examples to Facial Image Recognition (Nov 30, 2021): Face Recognition, Generative Adversarial Network
- Using Explainable Boosting Machine to Compare Idiographic and Nomothetic Approaches for Ecological Momentary Assessment Data (Apr 4, 2022): Interpretable Machine Learning, Knowledge Distillation
- Using Knowledge Distillation to Improve Interpretable Models in a Retail Banking Context (Sep 30, 2022): Data Augmentation, Knowledge Distillation
- Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation (Jul 29, 2021): Knowledge Distillation, Machine Translation
- Using the Past Knowledge to Improve Sentiment Classification (Nov 1, 2020): Classification, Knowledge Distillation
- V2X-VLM: End-to-End V2X Cooperative Autonomous Driving Through Large Vision-Language Models (Aug 17, 2024): Autonomous Driving, Contrastive Learning
- Vanilla Feature Distillation for Improving the Accuracy-Robustness Trade-Off in Adversarial Training (Jun 5, 2022): Knowledge Distillation
- Variational Information Distillation for Knowledge Transfer (Apr 11, 2019): Knowledge Distillation, Transfer Learning