Multi-granularity for knowledge distillation Aug 15, 2021 Knowledge Distillation Person Re-Identification
Code Code Available 0Online Continual Learning For Visual Food Classification Aug 15, 2021 Classification Continual Learning
— Unverified 0AGKD-BML: Defense Against Adversarial Attack by Attention Guided Knowledge Distillation and Bi-directional Metric Learning Aug 13, 2021 Adversarial Attack Adversarial Robustness
Code Code Available 1PAIR: Leveraging Passage-Centric Similarity Relation for Improving Dense Passage Retrieval Aug 13, 2021 Knowledge Distillation Natural Questions
— Unverified 0Learning from Matured Dumb Teacher for Fine Generalization Aug 12, 2021 image-classification Image Classification
— Unverified 0Distilling Holistic Knowledge with Graph Neural Networks Aug 12, 2021 Knowledge Distillation
Code Code Available 1Semi-Supervised Domain Generalizable Person Re-Identification Aug 11, 2021 Generalizable Person Re-identification Knowledge Distillation
Code Code Available 2Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data Aug 11, 2021 Knowledge Distillation Model Compression
— Unverified 0Lifelong Intent Detection via Multi-Strategy Rebalancing Aug 10, 2021 Intent Detection Knowledge Distillation
— Unverified 0Learning an Augmented RGB Representation with Cross-Modal Knowledge Distillation for Action Detection Aug 8, 2021 Action Detection Knowledge Distillation
— Unverified 0A distillation based approach for the diagnosis of diseases Aug 7, 2021 Knowledge Distillation
— Unverified 0Spatio-Temporal Attention Mechanism and Knowledge Distillation for Lip Reading Aug 7, 2021 Audio-Visual Speech Recognition Knowledge Distillation
— Unverified 0Transferring Knowledge Distillation for Multilingual Social Event Detection Aug 6, 2021 Cross-Lingual Word Embeddings Event Detection
Code Code Available 1Decoupled Transformer for Scalable Inference in Open-domain Question Answering Aug 5, 2021 Knowledge Distillation Machine Reading Comprehension
— Unverified 0Knowledge Distillation from BERT Transformer to Speech Transformer for Intent Classification Aug 5, 2021 Automatic Speech Recognition Automatic Speech Recognition (ASR)
Code Code Available 1WeChat Neural Machine Translation Systems for WMT21 Aug 5, 2021 Knowledge Distillation Machine Translation
— Unverified 0MS-KD: Multi-Organ Segmentation with Multiple Binary-Labeled Datasets Aug 5, 2021 Knowledge Distillation Organ Segmentation
— Unverified 0Online Knowledge Distillation for Efficient Pose Estimation Aug 4, 2021 Knowledge Distillation Pose Estimation
Code Code Available 1Learning Compatible Embeddings Aug 4, 2021 Knowledge Distillation Retrieval
Code Code Available 1Semi-Supervising Learning, Transfer Learning, and Knowledge Distillation with SimCLR Aug 2, 2021 Data Augmentation Knowledge Distillation
— Unverified 0In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval Aug 1, 2021 Document Ranking Knowledge Distillation
— Unverified 0On Knowledge Distillation for Translating Erroneous Speech Transcriptions Aug 1, 2021 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0NAIST English-to-Japanese Simultaneous Translation System for IWSLT 2021 Simultaneous Text-to-text Task Aug 1, 2021 Knowledge Distillation Machine Translation
— Unverified 0The USYD-JD Speech Translation System for IWSLT2021 Aug 1, 2021 Knowledge Distillation NMT
— Unverified 0基于层间知识蒸馏的神经机器翻译(Inter-layer Knowledge Distillation for Neural Machine Translation) Aug 1, 2021 Knowledge Distillation Machine Translation
— Unverified 0Multi-Strategy Knowledge Distillation Based Teacher-Student Framework for Machine Reading Comprehension Aug 1, 2021 Knowledge Distillation Machine Reading Comprehension
— Unverified 0Samsung R&D Institute Poland submission to WAT 2021 Indic Language Multilingual Task Aug 1, 2021 Domain Adaptation Knowledge Distillation
— Unverified 0POS-Constrained Parallel Decoding for Non-autoregressive Generation Aug 1, 2021 Knowledge Distillation POS
Code Code Available 0Trigger is Not Sufficient: Exploiting Frame-aware Knowledge for Implicit Event Argument Extraction Aug 1, 2021 Event Argument Extraction Knowledge Distillation
— Unverified 0PRAL: A Tailored Pre-Training Model for Task-Oriented Dialog Generation Aug 1, 2021 Knowledge Distillation Language Modeling
— Unverified 0Matching Distributions between Model and Data: Cross-domain Knowledge Distillation for Unsupervised Domain Adaptation Aug 1, 2021 Cross-Domain Text Classification Domain Adaptation
— Unverified 0Pose-Guided Feature Learning with Knowledge Distillation for Occluded Person Re-Identification Jul 31, 2021 Knowledge Distillation Occluded Person Re-Identification
— Unverified 0On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals Jul 30, 2021 Clustering Contrastive Learning
Code Code Available 0QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning Jul 29, 2021 Federated Learning Knowledge Distillation
— Unverified 0Hierarchical Self-supervised Augmented Knowledge Distillation Jul 29, 2021 Knowledge Distillation Representation Learning
Code Code Available 1Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation Jul 29, 2021 Knowledge Distillation Machine Translation
— Unverified 0In Defense of the Learning Without Forgetting for Task Incremental Learning Jul 26, 2021 Continual Learning Incremental Learning
— Unverified 0Text is Text, No Matter What: Unifying Text Recognition using Knowledge Distillation Jul 26, 2021 Handwriting Recognition HTR
— Unverified 0IE-GAN: An Improved Evolutionary Generative Adversarial Network Using a New Fitness Function and a Generic Crossover Operator Jul 25, 2021 Evolutionary Algorithms Generative Adversarial Network
Code Code Available 0ROD: Reception-aware Online Distillation for Sparse Graphs Jul 25, 2021 Clustering Graph Learning
Code Code Available 0The USYD-JD Speech Translation System for IWSLT 2021 Jul 24, 2021 Knowledge Distillation NMT
— Unverified 0Follow Your Path: a Progressive Method for Knowledge Distillation Jul 20, 2021 Knowledge Distillation
— Unverified 0Learning ULMFiT and Self-Distillation with Calibration for Medical Dialogue System Jul 20, 2021 Decision Making Knowledge Distillation
— Unverified 0Double Similarity Distillation for Semantic Image Segmentation Jul 19, 2021 Image Segmentation Knowledge Distillation
— Unverified 0Federated Action Recognition on Heterogeneous Embedded Devices Jul 18, 2021 Action Recognition Federated Learning
— Unverified 0Scene-adaptive Knowledge Distillation for Sequential Recommendation via Differentiable Architecture Search Jul 15, 2021 Knowledge Distillation Neural Architecture Search
— Unverified 0Improving Speech Translation by Understanding and Learning from the Auxiliary Text Translation Task Jul 12, 2021 Decoder Knowledge Distillation
— Unverified 0A Flexible Multi-Task Model for BERT Serving Jul 12, 2021 Knowledge Distillation model
Code Code Available 0Technical Report of Team GraphMIRAcles in the WikiKG90M-LSC Track of OGB-LSC @ KDD Cup 2021 Jul 12, 2021 Knowledge Distillation Knowledge Graphs
— Unverified 0Contrast R-CNN for Continual Learning in Object Detection Jul 11, 2021 Continual Learning image-classification
— Unverified 0