Better Supervisory Signals by Observing Learning Paths Mar 4, 2022 Knowledge Distillation
Code Code Available 0MIAShield: Defending Membership Inference Attacks via Preemptive Exclusion of Members Mar 2, 2022 image-classification Image Classification
— Unverified 0Dual Embodied-Symbolic Concept Representations for Deep Learning Mar 1, 2022 class-incremental learning Class Incremental Learning
— Unverified 0TRILLsson: Distilled Universal Paralinguistic Speech Representations Mar 1, 2022 Emotion Recognition Knowledge Distillation
— Unverified 0Confidence Based Bidirectional Global Context Aware Training Framework for Neural Machine Translation Feb 28, 2022 Decoder Knowledge Distillation
— Unverified 0Joint Answering and Explanation for Visual Commonsense Reasoning Feb 25, 2022 Knowledge Distillation Question Answering
Code Code Available 0Learn From the Past: Experience Ensemble Knowledge Distillation Feb 25, 2022 Knowledge Distillation Transfer Learning
— Unverified 0Bridging the Gap Between Patient-specific and Patient-independent Seizure Prediction via Knowledge Distillation Feb 25, 2022 Knowledge Distillation Prediction
— Unverified 0Efficient Video Segmentation Models with Per-frame Inference Feb 24, 2022 Image Matting Instance Segmentation
— Unverified 0Are All Linear Regions Created Equal? Feb 23, 2022 All Knowledge Distillation
Code Code Available 0Multi-Teacher Knowledge Distillation for Incremental Implicitly-Refined Classification Feb 23, 2022 Classification Incremental Learning
— Unverified 0Distilled Neural Networks for Efficient Learning to Rank Feb 22, 2022 CPU Information Retrieval
Code Code Available 0Learning Bayesian Sparse Networks with Full Experience Replay for Continual Learning Feb 21, 2022 Continual Learning Knowledge Distillation
— Unverified 0A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation Feb 21, 2022 Knowledge Distillation Model Compression
— Unverified 0Cross-Task Knowledge Distillation in Multi-Task Recommendation Feb 20, 2022 Knowledge Distillation Multi-Task Learning
— Unverified 0Knowledge Distillation with Deep Supervision Feb 16, 2022 Knowledge Distillation Transfer Learning
Code Code Available 0No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices Feb 16, 2022 Federated Learning Knowledge Distillation
— Unverified 0EdgeFormer: A Parameter-Efficient Transformer for On-Device Seq2seq Generation Feb 16, 2022 Grammatical Error Correction Knowledge Distillation
— Unverified 0Meta Knowledge Distillation Feb 16, 2022 Data Augmentation Image Classification
— Unverified 0Uni-Retriever: Towards Learning The Unified Embedding Based Retriever in Bing Sponsored Search Feb 13, 2022 Contrastive Learning Knowledge Distillation
— Unverified 0AI can evolve without labels: self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation Feb 13, 2022 Deep Learning Diagnostic
— Unverified 0Distillation with Contrast is All You Need for Self-Supervised Point Cloud Representation Learning Feb 9, 2022 All Contrastive Learning
— Unverified 0Locally Differentially Private Distributed Deep Learning via Knowledge Distillation Feb 7, 2022 Deep Learning Knowledge Distillation
Code Code Available 0Adaptive Mixing of Auxiliary Losses in Supervised Learning Feb 7, 2022 Denoising Knowledge Distillation
Code Code Available 0Measuring and Reducing Model Update Regression in Structured Prediction for NLP Feb 7, 2022 Dependency Parsing Knowledge Distillation
— Unverified 0Cross domain knowledge compression in realtime optical flow prediction on ultrasound sequences Feb 4, 2022 Knowledge Distillation Optical Flow Estimation
— Unverified 0Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition Feb 4, 2022 Classification Knowledge Distillation
— Unverified 0Bootstrapped Representation Learning for Skeleton-Based Action Recognition Feb 4, 2022 Action Recognition Data Augmentation
— Unverified 0Deep-Disaster: Unsupervised Disaster Detection and Localization Using Visual Data Jan 31, 2022 Humanitarian Knowledge Distillation
Code Code Available 0Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning Jan 30, 2022 Knowledge Distillation Network Pruning
— Unverified 0Improving Robustness by Enhancing Weak Subnets Jan 30, 2022 Adversarial Robustness Data Augmentation
Code Code Available 0AutoDistil: Few-shot Task-agnostic Neural Architecture Search for Distilling Large Language Models Jan 29, 2022 Inductive Bias Knowledge Distillation
— Unverified 0Dynamic Rectification Knowledge Distillation Jan 27, 2022 Edge-computing Knowledge Distillation
Code Code Available 0Adaptive Instance Distillation for Object Detection in Autonomous Driving Jan 26, 2022 Autonomous Driving Knowledge Distillation
— Unverified 0TrustAL: Trustworthy Active Learning using Knowledge Distillation Jan 26, 2022 Active Learning Diversity
— Unverified 0One Student Knows All Experts Know: From Sparse to Dense Jan 26, 2022 All Knowledge Distillation
— Unverified 0Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment Jan 25, 2022 Benchmarking Entity Alignment
— Unverified 0Attentive Task Interaction Network for Multi-Task Learning Jan 25, 2022 Decoder Knowledge Distillation
Code Code Available 0Federated Unlearning with Knowledge Distillation Jan 24, 2022 Federated Learning Knowledge Distillation
— Unverified 0Can Model Compression Improve NLP Fairness Jan 21, 2022 Fairness Knowledge Distillation
— Unverified 0AutoDistill: an End-to-End Framework to Explore and Distill Hardware-Efficient Language Models Jan 21, 2022 Bayesian Optimization Knowledge Distillation
— Unverified 0Image-to-Video Re-Identification via Mutual Discriminative Knowledge Transfer Jan 21, 2022 Knowledge Distillation Transfer Learning
— Unverified 0UKD: Debiasing Conversion Rate Estimation via Uncertainty-regularized Knowledge Distillation Jan 20, 2022 Knowledge Distillation Selection bias
— Unverified 0Improving Neural Machine Translation by Denoising Training Jan 19, 2022 Denoising Knowledge Distillation
— Unverified 0Continual Coarse-to-Fine Domain Adaptation in Semantic Segmentation Jan 18, 2022 Domain Adaptation Knowledge Distillation
Code Code Available 0Cross-modal Contrastive Distillation for Instructional Activity Anticipation Jan 18, 2022 Knowledge Distillation
— Unverified 0Knowledge Distillation as Self-Supervised Learning Jan 17, 2022 Knowledge Distillation Self-Supervised Learning
— Unverified 0KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation Jan 16, 2022 cross-modal alignment Knowledge Distillation
— Unverified 0Re2G: Retrieve, Rerank, Generate Jan 16, 2022 Fact Checking GPU
— Unverified 0Learning Cross-Lingual IR from an English Retriever Jan 16, 2022 Cross-Lingual Information Retrieval Information Retrieval
— Unverified 0