- Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation (Dec 9, 2021) [Image Classification]
- Boosting Contrastive Learning with Relation Knowledge Distillation (Dec 8, 2021) [Contrastive Learning, Knowledge Distillation]
- Auxiliary Learning for Self-Supervised Video Representation via Similarity-based Knowledge Distillation (Dec 7, 2021) [Auxiliary Learning, Knowledge Distillation]
- [code] A Contrastive Distillation Approach for Incremental Semantic Segmentation in Aerial Images (Dec 7, 2021) [Image Classification]
- [code] Improving Neural Cross-Lingual Summarization via Employing Optimal Transport Distance for Knowledge Distillation (Dec 7, 2021) [Knowledge Distillation, Multi-Task Learning]
- [code] ADD: Frequency Attention and Multi-View based Knowledge Distillation to Detect Low-Quality Compressed Deepfake Images (Dec 7, 2021) [DeepFake Detection, Face Swapping]
- [code] Safe Distillation Box (Dec 5, 2021) [Knowledge Distillation]
- CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks (Dec 5, 2021) [Classification, Continual Learning]
- Extracting Knowledge from Features with Multilevel Abstraction (Dec 4, 2021) [Data Augmentation, Knowledge Distillation]
- KDCTime: Knowledge Distillation with Calibration on InceptionTime for Time-series Classification (Dec 4, 2021) [Knowledge Distillation, Time Series]
- Tiny-NewsRec: Effective and Efficient PLM-based News Recommendation (Dec 2, 2021) [Knowledge Distillation, Natural Language Understanding]
- [code] FedRAD: Federated Robust Adaptive Distillation (Dec 2, 2021) [Federated Learning, Knowledge Distillation]
- A Fast Knowledge Distillation Framework for Visual Recognition (Dec 2, 2021) [Image Classification]
- [code] Information Theoretic Representation Distillation (Dec 1, 2021) [Classification with Binary Weight Network, Knowledge Distillation]
- [code] The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image (Dec 1, 2021) [Knowledge Distillation]
- [code] Distilling Meta Knowledge on Heterogeneous Graph for Illicit Drug Trafficker Detection on Social Media (Dec 1, 2021) [Knowledge Distillation, Marketing]
- [code] Aligned Structured Sparsity Learning for Efficient Image Super-Resolution (Dec 1, 2021) [Image Super-Resolution, Knowledge Distillation]
- [code] Shapeshifter: a Parameter-efficient Transformer using Factorized Reshaped Matrices (Dec 1, 2021) [Knowledge Distillation, Model Compression]
- [code] Handling Long-tailed Feature Distribution in AdderNets (Dec 1, 2021) [Knowledge Distillation]
- Slow Learning and Fast Inference: Efficient Graph Similarity Computation via Knowledge Distillation (Dec 1, 2021) [Anomaly Detection, Graph Neural Network]
- [code] Comprehensive Knowledge Distillation with Causal Intervention (Dec 1, 2021) [Causal Inference, Knowledge Distillation]
- [code] Analyzing the Confidentiality of Undistillable Teachers in Knowledge Distillation (Dec 1, 2021) [Knowledge Distillation]
- [code] Adversarial Teacher-Student Representation Learning for Domain Generalization (Dec 1, 2021) [Data Augmentation, Domain Generalization]
- [code] Unsupervised Representation Transfer for Small Networks: I Believe I Can Distill On-the-Fly (Dec 1, 2021) [Knowledge Distillation, Linear Evaluation]
- Using a GAN to Generate Adversarial Examples to Facial Image Recognition (Nov 30, 2021) [Face Recognition, Generative Adversarial Network]
- Improved Knowledge Distillation via Adversarial Collaboration (Nov 29, 2021) [Knowledge Distillation]
- Efficient Federated Learning for AIoT Applications Using Knowledge Distillation (Nov 29, 2021) [Federated Learning, Knowledge Distillation]
- ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection (Nov 28, 2021) [3D Object Detection, Knowledge Distillation]
- WiFi-based Multi-task Sensing (Nov 26, 2021) [Gesture Recognition, Indoor Localization]
- [code] Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs (Nov 26, 2021) [Knowledge Distillation, Translation]
- EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation (Nov 24, 2021) [Event-based Object Segmentation, Knowledge Distillation]
- [code] Self-slimmed Vision Transformer (Nov 24, 2021) [Knowledge Distillation]
- [code] Domain-Agnostic Clustering with Self-Distillation (Nov 23, 2021) [Clustering, Data Augmentation]
- Semi-Online Knowledge Distillation (Nov 23, 2021) [Knowledge Distillation, Model Compression]
- [code] Focal and Global Knowledge Distillation for Detectors (Nov 23, 2021) [Image Classification]
- [code] Hierarchical Knowledge Distillation for Dialogue Sequence Labeling (Nov 22, 2021) [Knowledge Distillation, Scene Segmentation]
- Contrast-reconstruction Representation Learning for Self-supervised Skeleton-based Action Recognition (Nov 22, 2021) [Action Recognition, Contrastive Learning]
- Local-Selective Feature Distillation for Single Image Super-Resolution (Nov 22, 2021) [Image Super-Resolution, Knowledge Distillation]
- Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion (Nov 20, 2021) [Age Estimation, Facial Expression Recognition]
- Toxicity Detection can be Sensitive to the Conversational Context (Nov 19, 2021) [Data Augmentation, Knowledge Distillation]
- Dynamically Pruning SegFormer for Efficient Semantic Segmentation (Nov 18, 2021) [Knowledge Distillation, Segmentation]
- Hierarchical Knowledge Guided Learning for Real-world Retinal Diseases Recognition (Nov 17, 2021) [Knowledge Distillation]
- An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition (Nov 16, 2021) [Cross-Lingual NER, Knowledge Distillation]
- [code] Redistributing Low-Frequency Words: Making the Most of Monolingual Data in Non-Autoregressive Translation (Nov 16, 2021) [Knowledge Distillation, Translation]
- When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation (Nov 16, 2021) [Data Augmentation, HellaSwag]
- Multi-Granularity Contrastive Knowledge Distillation for Multimodal Named Entity Recognition (Nov 16, 2021) [Knowledge Distillation, Multi-modal Named Entity Recognition]
- Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation (Nov 16, 2021) [Image Captioning, Knowledge Distillation]
- A Flexible Multi-Task Model for BERT Serving (Nov 16, 2021) [Knowledge Distillation]
- Compositional Data Augmentation for Abstractive Conversation Summarization (Nov 16, 2021) [Conversation Summarization, Data Augmentation]
- Deep-to-bottom Weights Decay: A Systemic Knowledge Review Learning Technique for Transformer Layers in Knowledge Distillation (Nov 16, 2021) [Knowledge Distillation]