Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation (Apr 2, 2022) | Tags: Class-Incremental Learning | Code available
A Dual-Contrastive Framework for Low-Resource Cross-Lingual Named Entity Recognition (Apr 2, 2022) | Tags: Contrastive Learning, Cross-Lingual NER | Code available
Fast Real-time Personalized Speech Enhancement: End-to-End Enhancement Network (E3Net) and Knowledge Distillation (Apr 2, 2022) | Tags: Automatic Speech Recognition (ASR) | Code available
Feature Structure Distillation with Centered Kernel Alignment in BERT Transferring (Apr 1, 2022) | Tags: Knowledge Distillation, Language Modeling | Code unverified
End-to-End Zero-Shot HOI Detection via Vision and Language Knowledge Distillation (Apr 1, 2022) | Tags: Human-Object Interaction Detection, Knowledge Distillation | Code available
Knowledge distillation with error-correcting transfer learning for wind power prediction (Apr 1, 2022) | Tags: Knowledge Distillation, Transfer Learning | Code available
Unified and Effective Ensemble Knowledge Distillation (Apr 1, 2022) | Tags: Knowledge Distillation, Transfer Learning | Code unverified
Rethinking Position Bias Modeling with Knowledge Distillation for CTR Prediction (Apr 1, 2022) | Tags: Click-Through Rate Prediction, Knowledge Distillation | Code unverified
Preventing Distillation-based Attacks on Neural Network IP (Apr 1, 2022) | Tags: Knowledge Distillation | Code unverified
Distill-VQ: Learning Retrieval Oriented Vector Quantization By Distilling Knowledge from Dense Embeddings (Apr 1, 2022) | Tags: Contrastive Learning, Knowledge Distillation | Code unverified
Conditional Autoregressors are Interpretable Classifiers (Mar 31, 2022) | Tags: Classification, Image Classification | Code available
A Closer Look at Rehearsal-Free Continual Learning (Mar 31, 2022) | Tags: Continual Learning, Knowledge Distillation | Code unverified
It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher (Mar 31, 2022) | Tags: Data-Free Quantization | Code unverified
Adversarial Speaker Distillation for Countermeasure Model on Automatic Speaker Verification (Mar 31, 2022) | Tags: Knowledge Distillation, Speaker Verification | Code available
Rainbow Keywords: Efficient Incremental Learning for Online Spoken Keyword Spotting (Mar 30, 2022) | Tags: Data Augmentation, Diversity | Code unverified
Device-Directed Speech Detection: Regularization via Distillation for Weakly-Supervised Models (Mar 30, 2022) | Tags: Knowledge Distillation | Code available
Monitored Distillation for Positive Congruent Depth Completion (Mar 30, 2022) | Tags: Depth Completion, Image Reconstruction | Code unverified
Self-Distillation from the Last Mini-Batch for Consistency Regularization (Mar 30, 2022) | Tags: Knowledge Distillation | Code available
Nix-TTS: Lightweight and End-to-End Text-to-Speech via Module-wise Distillation (Mar 29, 2022) | Tags: CPU, Decoder | Code available
Instance Relation Graph Guided Source-Free Domain Adaptive Object Detection (Mar 29, 2022) | Tags: Domain Adaptation, Knowledge Distillation | Code available
Knowledge Distillation: Bad Models Can Be Good Role Models (Mar 28, 2022) | Tags: Knowledge Distillation, Learning Theory | Code available
RAVIR: A Dataset and Methodology for the Semantic Segmentation and Quantitative Analysis of Retinal Arteries and Veins in Infrared Reflectance Imaging (Mar 28, 2022) | Tags: Domain Adaptation, Knowledge Distillation | Code unverified
Doodle It Yourself: Class Incremental Learning by Drawing a Few Sketches (Mar 28, 2022) | Tags: Class-Incremental Learning | Code unverified
Uncertainty-aware Contrastive Distillation for Incremental Semantic Segmentation (Mar 26, 2022) | Tags: Contrastive Learning, Image Classification | Code unverified
Knowledge Distillation with the Reused Teacher Classifier (Mar 26, 2022) | Tags: Knowledge Distillation | Code available
Model LEGO: Creating Models Like Disassembling and Assembling Building Blocks (Mar 25, 2022) | Tags: Incremental Learning, Knowledge Distillation | Code available
PCA-Based Knowledge Distillation Towards Lightweight and Content-Style Balanced Photorealistic Style Transfer Models (Mar 25, 2022) | Tags: Knowledge Distillation, Style Transfer | Code available
A Cross-Domain Approach for Continuous Impression Recognition from Dyadic Audio-Visual-Physio Signals (Mar 25, 2022) | Tags: Knowledge Distillation, Spoken Dialogue Systems | Code available
Class-Incremental Learning for Action Recognition in Videos (Mar 25, 2022) | Tags: Action Recognition, Action Recognition in Videos | Code unverified
Rich Feature Construction for the Optimization-Generalization Dilemma (Mar 24, 2022) | Tags: Inductive Bias, Knowledge Distillation | Code unverified
Ensembling and Knowledge Distilling of Large Sequence Taggers for Grammatical Error Correction (Mar 24, 2022) | Tags: Grammatical Error Correction, Knowledge Distillation | Code available
R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning (Mar 24, 2022) | Tags: Class-Incremental Learning | Code available
Multitask Emotion Recognition Model with Knowledge Distillation and Task Discriminator (Mar 24, 2022) | Tags: Emotion Recognition, Knowledge Distillation | Code available
Mitigating Gender Bias in Distilled Language Models via Counterfactual Role Reversal (Mar 23, 2022) | Tags: Counterfactual, Fairness | Code unverified
Towards Expressive Speaking Style Modelling with Hierarchical Context Information for Mandarin Speech Synthesis (Mar 23, 2022) | Tags: Expressive Speech Synthesis, Knowledge Distillation | Code unverified
Scale-Equivalent Distillation for Semi-Supervised Object Detection (Mar 23, 2022) | Tags: Knowledge Distillation, Object Detection | Code unverified
On Neural Network Equivalence Checking using SMT Solvers (Mar 22, 2022) | Tags: Knowledge Distillation | Code unverified
Channel Self-Supervision for Online Knowledge Distillation (Mar 22, 2022) | Tags: Diversity, Knowledge Distillation | Code unverified
SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images (Mar 22, 2022) | Tags: Knowledge Distillation, Lesion Classification | Code unverified
DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization (Mar 21, 2022) | Tags: Knowledge Distillation, Model Compression | Code available
Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation (Mar 21, 2022) | Tags: Document-Level Relation Extraction, Knowledge Distillation | Code available
Open-Vocabulary One-Stage Detection with Hierarchical Visual-Language Knowledge Distillation (Mar 20, 2022) | Tags: Knowledge Distillation, Language Modelling | Code available
Emulating Quantum Dynamics with Neural Networks via Knowledge Distillation (Mar 19, 2022) | Tags: Knowledge Distillation | Code available
A Closer Look at Knowledge Distillation with Features, Logits, and Gradients (Mar 18, 2022) | Tags: Incremental Learning, Knowledge Distillation | Code available
Delta Distillation for Efficient Video Processing (Mar 17, 2022) | Tags: Knowledge Distillation, Object Detection | Code unverified
When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation (Mar 17, 2022) | Tags: Data Augmentation, HellaSwag | Code available
Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning (Mar 17, 2022) | Tags: Data-Free Knowledge Distillation, Federated Learning | Code available
Sample, Translate, Recombine: Leveraging Audio Alignments for Data Augmentation in End-to-end Speech Translation (Mar 16, 2022) | Tags: Data Augmentation, Knowledge Distillation | Code available
Domain Adaptive Hand Keypoint and Pixel Localization in the Wild (Mar 16, 2022) | Tags: Domain Adaptation, Knowledge Distillation | Code unverified
Decoupled Knowledge Distillation (Mar 16, 2022) | Tags: Image Classification | Code available
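Since Knowledge Distillation is the recurring tag across the entries above, here is a minimal background sketch of the standard distillation objective (the temperature-scaled KL divergence between teacher and student logits, in the style of Hinton et al.); the function names and the temperature value are illustrative and not taken from any listed paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)  # teacher "soft targets"
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient; identical student and teacher logits yield zero distillation loss.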