Consensual Collaborative Training And Knowledge Distillation Based Facial Expression Recognition Under Noisy Annotations Jul 10, 2021 Facial Expression Recognition Facial Expression Recognition (FER)
Code Code Available 1Lifelong Twin Generative Adversarial Networks Jul 9, 2021 Knowledge Distillation
— Unverified 0WeClick: Weakly-Supervised Video Semantic Segmentation with Click Annotations Jul 7, 2021 Knowledge Distillation Model Compression
— Unverified 0Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification Jul 7, 2021 Classification image-classification
Code Code Available 1Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation Jul 7, 2021 Fine-Grained Visual Recognition Knowledge Distillation
— Unverified 0Confidence Conditioned Knowledge Distillation Jul 6, 2021 Knowledge Distillation
— Unverified 0A Light-weight Deep Human Activity Recognition Algorithm Using Multi-knowledge Distillation Jul 6, 2021 Activity Recognition Classification
— Unverified 0Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation Jul 6, 2021 Domain Generalization image-classification
— Unverified 0VidLanKD: Improving Language Understanding via Video-Distilled Knowledge Transfer Jul 6, 2021 Image Retrieval Knowledge Distillation
Code Code Available 1CoReD: Generalizing Fake Media Detection with Continual Representation using Distillation Jul 6, 2021 Continual Learning Domain Adaptation
Code Code Available 0On The Distribution of Penultimate Activations of Classification Networks Jul 5, 2021 Classification Conditional Image Generation
— Unverified 0Continual Contrastive Learning for Image Classification Jul 5, 2021 Classification Continual Learning
Code Code Available 0Audio-Oriented Multimodal Machine Comprehension: Task, Dataset and Model Jul 4, 2021 Knowledge Distillation Machine Reading Comprehension
— Unverified 0Split-and-Bridge: Adaptable Class Incremental Learning within a Single Neural Network Jul 3, 2021 class-incremental learning Class Incremental Learning
Code Code Available 1Pool of Experts: Realtime Querying Specialized Knowledge in Massive Neural Networks Jul 3, 2021 Knowledge Distillation Model Compression
Code Code Available 0Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation Jul 3, 2021 Knowledge Distillation Model Compression
Code Code Available 1Isotonic Data Augmentation for Knowledge Distillation Jul 3, 2021 Attribute Data Augmentation
— Unverified 0ESPnet-ST IWSLT 2021 Offline Speech Translation System Jul 1, 2021 Decoder Knowledge Distillation
— Unverified 0Revisiting Knowledge Distillation: An Inheritance and Exploration Framework Jul 1, 2021 Knowledge Distillation
Code Code Available 0Knowledge Distillation for Quality Estimation Jul 1, 2021 Data Augmentation Knowledge Distillation
Code Code Available 0Local-Global Knowledge Distillation in Heterogeneous Federated Learning with Non-IID Data Jun 30, 2021 Federated Learning Knowledge Distillation
— Unverified 0Learning without Forgetting for 3D Point Cloud Objects Jun 27, 2021 Knowledge Distillation
Code Code Available 0Reward-Based 1-bit Compressed Federated Distillation on Blockchain Jun 27, 2021 Federated Learning Knowledge Distillation
— Unverified 0PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation Jun 25, 2021 Keyword Spotting Knowledge Distillation
— Unverified 0Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains Jun 25, 2021 Knowledge Distillation
— Unverified 0DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval Jun 24, 2021 Computational Efficiency Knowledge Distillation
Code Code Available 1Dealing with training and test segmentation mismatch: FBK@IWSLT2021 Jun 23, 2021 Action Detection Activity Detection
— Unverified 0SSUL: Semantic Segmentation with Unknown Label for Exemplar-based Class-Incremental Learning Jun 22, 2021 class-incremental learning Class Incremental Learning
Code Code Available 1Efficient Inference via Universal LSH Kernel Jun 21, 2021 Knowledge Distillation Quantization
— Unverified 0Structured Sparse R-CNN for Direct Scene Graph Generation Jun 21, 2021 graph construction Graph Generation
Code Code Available 1Knowledge Distillation via Instance-level Sequence Learning Jun 21, 2021 General Knowledge Knowledge Distillation
— Unverified 0Minimally Invasive Surgery for Sparse Neural Networks in Contrastive Manner Jun 19, 2021 Knowledge Distillation Model Compression
— Unverified 0Tree-Like Decision Distillation Jun 19, 2021 Decision Making Knowledge Distillation
— Unverified 0Data-Free Knowledge Distillation for Image Super-Resolution Jun 19, 2021 Data-free Knowledge Distillation Image Super-Resolution
Code Code Available 0Learning Student Networks in the Wild Jun 19, 2021 Knowledge Distillation Model Compression
Code Code Available 2Positive-Unlabeled Data Purification in the Wild for Object Detection Jun 19, 2021 Knowledge Distillation object-detection
— Unverified 0CapsuleRRT: Relationships-Aware Regression Tracking via Capsules Jun 19, 2021 image-classification Image Classification
— Unverified 0Space-Time Distillation for Video Super-Resolution Jun 19, 2021 Knowledge Distillation Super-Resolution
— Unverified 0Teacher's pet: understanding and mitigating biases in distillation Jun 19, 2021 image-classification Image Classification
— Unverified 0Cross Modality Knowledge Distillation for Multi-Modal Aerial View Object Classification Jun 19, 2021 Image Classification Knowledge Distillation
Code Code Available 0Recurrent Stacking of Layers in Neural Networks: An Application to Neural Machine Translation Jun 18, 2021 Knowledge Distillation Machine Translation
— Unverified 0Dual-Teacher Class-Incremental Learning With Data-Free Generative Replay Jun 17, 2021 class-incremental learning Class Incremental Learning
— Unverified 0Dynamic Knowledge Distillation With Noise Elimination for RGB-D Salient Object Detection Jun 17, 2021 Knowledge Distillation object-detection
— Unverified 0Knowledge distillation from multi-modal to mono-modal segmentation networks Jun 17, 2021 Brain Tumor Segmentation Image Segmentation
— Unverified 0Topology Distillation for Recommender System Jun 16, 2021 Knowledge Distillation Model Compression
— Unverified 0Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation Jun 15, 2021 Fairness Knowledge Distillation
Code Code Available 0CoDERT: Distilling Encoder Representations with Co-learning for Transducer-based Speech Recognition Jun 14, 2021 Decoder Knowledge Distillation
— Unverified 0Context-Aware Image Inpainting with Learned Semantic Priors Jun 14, 2021 Image Inpainting Knowledge Distillation
Code Code Available 1Energy-efficient Knowledge Distillation for Spiking Neural Networks Jun 14, 2021 Knowledge Distillation Model Compression
— Unverified 0Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation Jun 12, 2021 Decoder Knowledge Distillation
— Unverified 0