IL-NeRF: Incremental Learning for Neural Radiance Fields with Camera Pose Alignment (Dec 10, 2023) [Incremental Learning, Knowledge Distillation]
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks (Apr 29, 2025) [Knowledge Distillation, Transfer Learning]
Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Sep 1, 2021) [Knowledge Distillation, Machine Reading Comprehension]
Image-to-Video Re-Identification via Mutual Discriminative Knowledge Transfer (Jan 21, 2022) [Knowledge Distillation, Transfer Learning]
Headache to Overstock? Promoting Long-tail Items through Debiased Product Bundling (Nov 28, 2024) [Knowledge Distillation, Navigate]
Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks (Apr 19, 2023) [Knowledge Distillation]
Impossible Triangle: What's Next for Pre-trained Language Models? (Apr 13, 2022) [Data Augmentation, Few-Shot Learning]
AMD: Automatic Multi-step Distillation of Large-scale Vision Models (Jul 5, 2024) [Image Classification]
hdl2v: A Code Translation Dataset for Enhanced LLM Verilog Generation (Jun 5, 2025) [Code Generation, Code Translation]
Spectral Maps for Learning on Subgraphs (May 30, 2022) [Graph Learning, Knowledge Distillation]
Harnessing Increased Client Participation with Cohort-Parallel Federated Learning (May 24, 2024) [Federated Learning, Image Classification]
Harmonizing knowledge Transfer in Neural Network with Unified Distillation (Sep 27, 2024) [Knowledge Distillation, Transfer Learning]
Improved implicit diffusion model with knowledge distillation to estimate the spatial distribution density of carbon stock in remote sensing imagery (Nov 27, 2024) [Knowledge Distillation]
HARD: Hard Augmentations for Robust Distillation (May 24, 2023) [Data Augmentation, Domain Generalization]
Hard Gate Knowledge Distillation -- Leverage Calibration for Robust and Reliable Language Model (Oct 22, 2022) [Knowledge Distillation, Language Modeling]
BiM-VFI: Bidirectional Motion Field-Guided Frame Interpolation for Video with Non-uniform Motions (Jan 1, 2025) [Knowledge Distillation, Motion Estimation]
Improved Knowledge Distillation via Adversarial Collaboration (Nov 29, 2021) [Knowledge Distillation]
AMD: Adaptive Masked Distillation for Object Detection (Jan 31, 2023) [Knowledge Distillation, Model Compression]
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (Jul 15, 2025) [Cross-Lingual Transfer, Knowledge Distillation]
Hands-on Guidance for Distilling Object Detectors (Mar 26, 2021) [Knowledge Distillation, Object]
Decoupled Alignment for Robust Plug-and-Play Adaptation (Jun 3, 2024) [Knowledge Distillation]
Handling Long-tailed Feature Distribution in AdderNets (Dec 1, 2021) [Knowledge Distillation]
Improve Knowledge Distillation via Label Revision and Data Selection (Apr 3, 2024) [Knowledge Distillation, Model Compression]
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts (Mar 28, 2024) [Causal Inference, Data-free Knowledge Distillation]
Improving Acoustic Scene Classification in Low-Resource Conditions (Dec 30, 2024) [Acoustic Scene Classification, Classification]
GVP: Generative Volumetric Primitives (Mar 31, 2023) [Image Generation, Knowledge Distillation]
Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation (Jun 12, 2021) [Decoder, Knowledge Distillation]
Improving Autoregressive NMT with Non-Autoregressive Model (Jul 1, 2020) [Decoder, de-en]
Improving CLIP Robustness with Knowledge Distillation and Self-Training (Sep 19, 2023) [Knowledge Distillation]
Bilateral Memory Consolidation for Continual Learning (Jan 1, 2023) [Continual Learning, Knowledge Distillation]
Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation (Apr 17, 2019) [Automatic Speech Recognition (ASR)]
Guided Deep Metric Learning (Jun 4, 2022) [Few-Shot Learning, Knowledge Distillation]
GTCOM Neural Machine Translation Systems for WMT19 (Aug 1, 2019) [Knowledge Distillation, Language Modeling]
Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner (Jun 5, 2024) [Class Incremental Learning]
Improving De-Raining Generalization via Neural Reorganization (Jan 1, 2021) [Knowledge Distillation]
Growing Deep Neural Network Considering with Similarity between Neurons (Aug 23, 2024) [Decision Making, Knowledge Distillation]
Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space (Apr 1, 2021) [Federated Learning, Knowledge Distillation]
Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation (Aug 24, 2022) [Fairness, Information Retrieval]
Improving Facial Landmark Detection Accuracy and Efficiency with Knowledge Distillation (Apr 9, 2024) [Emotion Recognition, Facial Landmark Detection]
Improving Feature Generalizability with Multitask Learning in Class Incremental Learning (Apr 26, 2022) [Class Incremental Learning]
Improving Frame-level Classifier for Word Timings with Non-peaky CTC in End-to-End Automatic Speech Recognition (Jun 9, 2023) [Automatic Speech Recognition (ASR)]
Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction (Apr 17, 2023) [Click-Through Rate Prediction, Diversity]
Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning (Jan 18, 2023) [Continual Learning, Knowledge Distillation]
Improving Generalization of Pre-trained Language Models via Stochastic Weight Averaging (Dec 12, 2022) [Knowledge Distillation, Question Answering]
Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning (Aug 26, 2023) [Knowledge Distillation, Model Compression]
A Closer Look at Knowledge Distillation with Features, Logits, and Gradients (Mar 18, 2022) [Incremental Learning, Knowledge Distillation]
Sentence-wise Speech Summarization: Task, Datasets, and End-to-End Modeling with LM Knowledge Distillation (Aug 1, 2024) [Automatic Speech Recognition (ASR)]
AdvFunMatch: When Consistent Teaching Meets Adversarial Robustness (May 24, 2023) [Adversarial Robustness, Knowledge Distillation]
Group-Mix SAM: Lightweight Solution for Industrial Assembly Line Applications (Mar 15, 2024) [Knowledge Distillation]
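Nearly every entry in this list builds on the same core objective: matching a student model's softened output distribution to a teacher's. As a reference point only (this is the classic temperature-scaled formulation from Hinton et al., 2015, not code from any listed paper; the function names are illustrative), a minimal pure-Python sketch:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution
    # that exposes the teacher's "dark knowledge" about non-target classes.
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, T=2.0):
    # Distillation term: KL(p_teacher || p_student) on softened
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T * T * sum(pi * (math.log(pi) - math.log(qi))
                       for pi, qi in zip(p, q))
```

When the student's logits match the teacher's exactly, `kd_loss` is zero; any mismatch gives a positive penalty. In practice this term is mixed with the ordinary cross-entropy on hard labels, and several papers above (e.g. the head-tail-aware KL work) modify precisely this KL term.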