- SA-MLP: Distilling Graph Knowledge from GNNs into Structure-Aware MLP (Oct 18, 2022): Knowledge Distillation, Node Classification
- Distilling Object Detectors With Global Knowledge (Oct 17, 2022) [Code Available]: Knowledge Distillation, Object
- Federated Learning with Privacy-Preserving Ensemble Attention Distillation (Oct 16, 2022) [Code Available]: Federated Learning, Image Classification
- RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging (Oct 15, 2022) [Unverified]: Classification, Knowledge Distillation
- Improving generalizability of distilled self-supervised speech processing models under distorted settings (Oct 14, 2022) [Unverified]: Knowledge Distillation
- Knowledge Distillation approach towards Melanoma Detection (Oct 14, 2022) [Code Available]: Knowledge Distillation
- You Can Have Your Data and Balance It Too: Towards Balanced and Efficient Multilingual Models (Oct 13, 2022) [Code Available]: Cross-Lingual Transfer, Knowledge Distillation
- Probabilistic Integration of Object Level Annotations in Chest X-ray Classification (Oct 13, 2022) [Unverified]: Knowledge Distillation, Variational Inference
- Boosting Graph Neural Networks via Adaptive Knowledge Distillation (Oct 12, 2022) [Unverified]: Graph Classification, Graph Mining
- Integrating Translation Memories into Non-Autoregressive Machine Translation (Oct 12, 2022) [Unverified]: Knowledge Distillation, Machine Translation
- SaiT: Sparse Vision Transformers through Adaptive Token Pruning (Oct 11, 2022) [Code Available]: Knowledge Distillation
- Comparison of Soft and Hard Target RNN-T Distillation for Large-scale ASR (Oct 11, 2022) [Code Available]: Automatic Speech Recognition (ASR)
- The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes (Oct 11, 2022) [Unverified]: Active Learning, Knowledge Distillation
- Detect, Distill and Update: Learned DB Systems Facing Out of Distribution Data (Oct 11, 2022) [Unverified]: Knowledge Distillation, Synthetic Data Generation
- Linkless Link Prediction via Relational Distillation (Oct 11, 2022) [Code Available]: Knowledge Distillation, Link Prediction
- PP-StructureV2: A Stronger Document Analysis System (Oct 11, 2022) [Unverified]: Key Information Extraction, Knowledge Distillation
- Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again (Oct 10, 2022) [Unverified]: Knowledge Distillation
- Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks (Oct 10, 2022) [Unverified]: Domain Classification, Intent Classification
- Students taught by multimodal teachers are superior action recognizers (Oct 9, 2022) [Unverified]: Action Recognition, Knowledge Distillation
- Mutual Learning of Single- and Multi-Channel End-to-End Neural Diarization (Oct 7, 2022) [Unverified]: Knowledge Distillation, Speaker Diarization
- Automated Graph Self-supervised Learning via Multi-teacher Knowledge Distillation (Oct 5, 2022) [Unverified]: Graph Representation Learning, Knowledge Distillation
- Meta-Ensemble Parameter Learning (Oct 5, 2022) [Unverified]: Knowledge Distillation, Meta-Learning
- A Study on the Efficiency and Generalization of Light Hybrid Retrievers (Oct 4, 2022) [Unverified]: Adversarial Attack, Contrastive Learning
- Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning (Oct 4, 2022) [Unverified]: Federated Learning, Knowledge Distillation
- Positive Pair Distillation Considered Harmful: Continual Meta Metric Learning for Lifelong Object Re-Identification (Oct 4, 2022) [Unverified]: Knowledge Distillation, Metric Learning
- Knowledge Distillation based Contextual Relevance Matching for E-commerce Product Search (Oct 4, 2022) [Code Available]: Knowledge Distillation
- Robust Active Distillation (Oct 3, 2022) [Unverified]: Active Learning, Informativeness
- One-Teacher and Multiple-Student Knowledge Distillation on Sentiment Classification (Oct 1, 2022) [Unverified]: Ensemble Learning, Knowledge Distillation
- Improving Zero-Shot Multilingual Text Generation via Iterative Distillation (Oct 1, 2022) [Code Available]: Knowledge Distillation, Text Generation
- Sentiment Interpretable Logic Tensor Network for Aspect-Term Sentiment Analysis (Oct 1, 2022) [Unverified]: Computational Efficiency, Knowledge Distillation
- Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression (Oct 1, 2022) [Unverified]: Knowledge Distillation, Language Modeling
- Knowledge Transfer with Visual Prompt in multi-modal Dialogue Understanding and Generation (Oct 1, 2022) [Code Available]: Dialogue Understanding, Knowledge Distillation
- Transferring Knowledge from Structure-aware Self-attention Language Model to Sequence-to-Sequence Semantic Parsing (Oct 1, 2022) [Unverified]: Code Generation, Knowledge Distillation
- Multi-stage Progressive Compression of Conformer Transducer for On-device Speech Recognition (Oct 1, 2022) [Unverified]: Automatic Speech Recognition (ASR)
- TAKE: Topic-shift Aware Knowledge sElection for Dialogue Generation (Oct 1, 2022) [Unverified]: Dialogue Generation, Knowledge Distillation
- F-VLM: Open-Vocabulary Object Detection upon Frozen Vision and Language Models (Sep 30, 2022) [Code Available]: Knowledge Distillation, Object Detection
- Towards a Unified View of Affinity-Based Knowledge Distillation (Sep 30, 2022) [Code Available]: Image Classification
- Slimmable Networks for Contrastive Self-supervised Learning (Sep 30, 2022) [Unverified]: Contrastive Learning, Knowledge Distillation
- Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation (Sep 30, 2022) [Code Available]: Knowledge Distillation
- Using Knowledge Distillation to improve interpretable models in a retail banking context (Sep 30, 2022) [Unverified]: Data Augmentation, Knowledge Distillation
- Label driven Knowledge Distillation for Federated Learning with non-IID Data (Sep 29, 2022) [Unverified]: Federated Learning, Knowledge Distillation
- Towards Explaining Autonomy with Verbalised Decision Tree States (Sep 28, 2022) [Unverified]: Knowledge Distillation
- PROD: Progressive Distillation for Dense Retrieval (Sep 27, 2022) [Unverified]: Knowledge Distillation, Natural Questions
- Knowledge Distillation to Ensemble Global and Interpretable Prototype-Based Mammogram Classification Models (Sep 26, 2022) [Unverified]: Diversity, Knowledge Distillation
- Joint Speech Activity and Overlap Detection with Multi-Exit Architecture (Sep 24, 2022) [Unverified]: Action Detection, Activity Detection
- DRKF: Distilled Rotated Kernel Fusion for Efficient Rotation Invariant Descriptors in Local Feature Matching (Sep 22, 2022) [Unverified]: Knowledge Distillation
- Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation (Sep 21, 2022) [Unverified]: Data-Free Knowledge Distillation, Knowledge Distillation
- Exploring Inconsistent Knowledge Distillation for Object Detection with Data Augmentation (Sep 20, 2022) [Unverified]: Data Augmentation, Knowledge Distillation
- Parameter-Efficient Conformers via Sharing Sparsely-Gated Experts for End-to-End Speech Recognition (Sep 17, 2022) [Code Available]: Knowledge Distillation, Mixture-of-Experts
- Causes of Catastrophic Forgetting in Class-Incremental Semantic Segmentation (Sep 16, 2022) [Unverified]: Class-Incremental Learning