Learning an Augmented RGB Representation with Cross-Modal Knowledge Distillation for Action Detection. Aug 8, 2021. Tags: Action Detection, Knowledge Distillation.
A distillation based approach for the diagnosis of diseases. Aug 7, 2021. Tags: Knowledge Distillation.
Spatio-Temporal Attention Mechanism and Knowledge Distillation for Lip Reading. Aug 7, 2021. Tags: Audio-Visual Speech Recognition, Knowledge Distillation.
Decoupled Transformer for Scalable Inference in Open-domain Question Answering. Aug 5, 2021. Tags: Knowledge Distillation, Machine Reading Comprehension.
MS-KD: Multi-Organ Segmentation with Multiple Binary-Labeled Datasets. Aug 5, 2021. Tags: Knowledge Distillation, Organ Segmentation.
WeChat Neural Machine Translation Systems for WMT21. Aug 5, 2021. Tags: Knowledge Distillation, Machine Translation.
Semi-Supervising Learning, Transfer Learning, and Knowledge Distillation with SimCLR. Aug 2, 2021. Tags: Data Augmentation, Knowledge Distillation.
On Knowledge Distillation for Translating Erroneous Speech Transcriptions. Aug 1, 2021. Tags: Automatic Speech Recognition (ASR).
In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval. Aug 1, 2021. Tags: Document Ranking, Knowledge Distillation.
Multi-Strategy Knowledge Distillation Based Teacher-Student Framework for Machine Reading Comprehension. Aug 1, 2021. Tags: Knowledge Distillation, Machine Reading Comprehension.
Samsung R&D Institute Poland submission to WAT 2021 Indic Language Multilingual Task. Aug 1, 2021. Tags: Domain Adaptation, Knowledge Distillation.
The USYD-JD Speech Translation System for IWSLT2021. Aug 1, 2021. Tags: Knowledge Distillation, NMT.
NAIST English-to-Japanese Simultaneous Translation System for IWSLT 2021 Simultaneous Text-to-text Task. Aug 1, 2021. Tags: Knowledge Distillation, Machine Translation.
Trigger is Not Sufficient: Exploiting Frame-aware Knowledge for Implicit Event Argument Extraction. Aug 1, 2021. Tags: Event Argument Extraction, Knowledge Distillation.
Inter-layer Knowledge Distillation for Neural Machine Translation (基于层间知识蒸馏的神经机器翻译). Aug 1, 2021. Tags: Knowledge Distillation, Machine Translation.
Matching Distributions between Model and Data: Cross-domain Knowledge Distillation for Unsupervised Domain Adaptation. Aug 1, 2021. Tags: Cross-Domain Text Classification, Domain Adaptation.
POS-Constrained Parallel Decoding for Non-autoregressive Generation. Aug 1, 2021. Tags: Knowledge Distillation, POS.
PRAL: A Tailored Pre-Training Model for Task-Oriented Dialog Generation. Aug 1, 2021. Tags: Knowledge Distillation, Language Modeling. Code available.
Pose-Guided Feature Learning with Knowledge Distillation for Occluded Person Re-Identification. Jul 31, 2021. Tags: Knowledge Distillation, Occluded Person Re-Identification.
On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals. Jul 30, 2021. Tags: Clustering, Contrastive Learning.
QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning. Jul 29, 2021. Tags: Federated Learning, Knowledge Distillation. Code available.
Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation. Jul 29, 2021. Tags: Knowledge Distillation, Machine Translation.
In Defense of the Learning Without Forgetting for Task Incremental Learning. Jul 26, 2021. Tags: Continual Learning, Incremental Learning.
Text is Text, No Matter What: Unifying Text Recognition using Knowledge Distillation. Jul 26, 2021. Tags: Handwriting Recognition, HTR.
ROD: Reception-aware Online Distillation for Sparse Graphs. Jul 25, 2021. Tags: Clustering, Graph Learning.
IE-GAN: An Improved Evolutionary Generative Adversarial Network Using a New Fitness Function and a Generic Crossover Operator. Jul 25, 2021. Tags: Evolutionary Algorithms, Generative Adversarial Network. Code available.
The USYD-JD Speech Translation System for IWSLT 2021. Jul 24, 2021. Tags: Knowledge Distillation, NMT. Code available.
Learning ULMFiT and Self-Distillation with Calibration for Medical Dialogue System. Jul 20, 2021. Tags: Decision Making, Knowledge Distillation.
Follow Your Path: a Progressive Method for Knowledge Distillation. Jul 20, 2021. Tags: Knowledge Distillation.
Double Similarity Distillation for Semantic Image Segmentation. Jul 19, 2021. Tags: Image Segmentation, Knowledge Distillation.
Federated Action Recognition on Heterogeneous Embedded Devices. Jul 18, 2021. Tags: Action Recognition, Federated Learning.
Scene-adaptive Knowledge Distillation for Sequential Recommendation via Differentiable Architecture Search. Jul 15, 2021. Tags: Knowledge Distillation, Neural Architecture Search.
Technical Report of Team GraphMIRAcles in the WikiKG90M-LSC Track of OGB-LSC @ KDD Cup 2021. Jul 12, 2021. Tags: Knowledge Distillation, Knowledge Graphs.
Improving Speech Translation by Understanding and Learning from the Auxiliary Text Translation Task. Jul 12, 2021. Tags: Decoder, Knowledge Distillation.
A Flexible Multi-Task Model for BERT Serving. Jul 12, 2021. Tags: Knowledge Distillation, Model.
Contrast R-CNN for Continual Learning in Object Detection. Jul 11, 2021. Tags: Continual Learning, Image Classification. Code available.
Lifelong Twin Generative Adversarial Networks. Jul 9, 2021. Tags: Knowledge Distillation.
Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation. Jul 7, 2021. Tags: Fine-Grained Visual Recognition, Knowledge Distillation.
WeClick: Weakly-Supervised Video Semantic Segmentation with Click Annotations. Jul 7, 2021. Tags: Knowledge Distillation, Model Compression.
Confidence Conditioned Knowledge Distillation. Jul 6, 2021. Tags: Knowledge Distillation.
CoReD: Generalizing Fake Media Detection with Continual Representation using Distillation. Jul 6, 2021. Tags: Continual Learning, Domain Adaptation.
Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation. Jul 6, 2021. Tags: Domain Generalization, Image Classification. Code available.
A Light-weight Deep Human Activity Recognition Algorithm Using Multi-knowledge Distillation. Jul 6, 2021. Tags: Activity Recognition, Classification.
On The Distribution of Penultimate Activations of Classification Networks. Jul 5, 2021. Tags: Classification, Conditional Image Generation.
Continual Contrastive Learning for Image Classification. Jul 5, 2021. Tags: Classification, Continual Learning.
Audio-Oriented Multimodal Machine Comprehension: Task, Dataset and Model. Jul 4, 2021. Tags: Knowledge Distillation, Machine Reading Comprehension. Code available.
Isotonic Data Augmentation for Knowledge Distillation. Jul 3, 2021. Tags: Attribute, Data Augmentation.
Pool of Experts: Realtime Querying Specialized Knowledge in Massive Neural Networks. Jul 3, 2021. Tags: Knowledge Distillation, Model Compression.
Revisiting Knowledge Distillation: An Inheritance and Exploration Framework. Jul 1, 2021. Tags: Knowledge Distillation. Code available.
Knowledge Distillation for Quality Estimation. Jul 1, 2021. Tags: Data Augmentation, Knowledge Distillation. Code available.