Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation · Oct 12, 2022 · Class-Incremental Semantic Segmentation, Knowledge Distillation · Code Available (1)
SaiT: Sparse Vision Transformers through Adaptive Token Pruning · Oct 11, 2022 · Knowledge Distillation · Code Available (0)
Linkless Link Prediction via Relational Distillation · Oct 11, 2022 · Knowledge Distillation, Link Prediction · Unverified (0)
Comparison of Soft and Hard Target RNN-T Distillation for Large-scale ASR · Oct 11, 2022 · Automatic Speech Recognition (ASR) · Unverified (0)
Detect, Distill and Update: Learned DB Systems Facing Out of Distribution Data · Oct 11, 2022 · Knowledge Distillation, Synthetic Data Generation · Code Available (0)
Hybrid Inverted Index Is a Robust Accelerator for Dense Retrieval · Oct 11, 2022 · Knowledge Distillation, Quantization · Code Available (1)
APSNet: Attention Based Point Cloud Sampling · Oct 11, 2022 · 3D Point Cloud Classification, Knowledge Distillation · Code Available (1)
PP-StructureV2: A Stronger Document Analysis System · Oct 11, 2022 · Key Information Extraction, Knowledge Distillation · Unverified (0)
Meta-Learning with Self-Improving Momentum Target · Oct 11, 2022 · Knowledge Distillation, Meta-Learning · Code Available (1)
ME-D2N: Multi-Expert Domain Decompositional Network for Cross-Domain Few-Shot Learning · Oct 11, 2022 · Cross-Domain Few-Shot Learning · Code Available (1)
The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes · Oct 11, 2022 · Active Learning, Knowledge Distillation · Unverified (0)
Patch-based Knowledge Distillation for Lifelong Person Re-Identification · Oct 10, 2022 · Continual Learning, Knowledge Distillation · Code Available (1)
Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again · Oct 10, 2022 · Knowledge Distillation · Unverified (0)
Distill the Image to Nowhere: Inversion Knowledge Distillation for Multimodal Machine Translation · Oct 10, 2022 · Knowledge Distillation, Machine Translation · Code Available (1)
Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks · Oct 10, 2022 · Domain Classification, Intent Classification · Unverified (0)
Let Images Give You More: Point Cloud Cross-Modal Training for Shape Analysis · Oct 9, 2022 · 3D Point Cloud Classification, Knowledge Distillation · Code Available (2)
Students taught by multimodal teachers are superior action recognizers · Oct 9, 2022 · Action Recognition, Knowledge Distillation · Unverified (0)
Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts · Oct 8, 2022 · Domain Generalization, Knowledge Distillation · Code Available (1)
Mutual Learning of Single- and Multi-Channel End-to-End Neural Diarization · Oct 7, 2022 · Knowledge Distillation, Speaker Diarization · Unverified (0)
C2KD: Cross-Lingual Cross-Modal Knowledge Distillation for Multilingual Text-Video Retrieval · Oct 7, 2022 · Knowledge Distillation, Retrieval · Code Available (1)
Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification · Oct 7, 2022 · Image Classification · Code Available (1)
IDa-Det: An Information Discrepancy-aware Distillation for 1-bit Detectors · Oct 7, 2022 · Knowledge Distillation, Object Detection · Code Available (1)
CLIP model is an Efficient Continual Learner · Oct 6, 2022 · Continual Learning, Incremental Learning · Code Available (1)
Effective Self-supervised Pre-training on Low-compute Networks without Distillation · Oct 6, 2022 · Attribute, Instance Segmentation · Code Available (1)
AlphaFold Distillation for Protein Design · Oct 5, 2022 · Diversity, Drug Discovery · Code Available (1)
Meta-Ensemble Parameter Learning · Oct 5, 2022 · Knowledge Distillation, Meta-Learning · Unverified (0)
Automated Graph Self-supervised Learning via Multi-teacher Knowledge Distillation · Oct 5, 2022 · Graph Representation Learning, Knowledge Distillation · Unverified (0)
Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning · Oct 4, 2022 · Federated Learning, Knowledge Distillation · Unverified (0)
Positive Pair Distillation Considered Harmful: Continual Meta Metric Learning for Lifelong Object Re-Identification · Oct 4, 2022 · Knowledge Distillation, Metric Learning · Code Available (0)
Knowledge Distillation based Contextual Relevance Matching for E-commerce Product Search · Oct 4, 2022 · Knowledge Distillation · Unverified (0)
A Study on the Efficiency and Generalization of Light Hybrid Retrievers · Oct 4, 2022 · Adversarial Attack, Contrastive Learning · Unverified (0)
Robust Active Distillation · Oct 3, 2022 · Active Learning, Informativeness · Unverified (0)
Attention Distillation: self-supervised vision transformer students need more guidance · Oct 3, 2022 · Knowledge Distillation, Self-Supervised Learning · Code Available (1)
Knowledge Transfer with Visual Prompt in multi-modal Dialogue Understanding and Generation · Oct 1, 2022 · Dialogue Understanding, Knowledge Distillation · Unverified (0)
Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression · Oct 1, 2022 · Knowledge Distillation, Language Modeling · Code Available (0)
Improving Zero-Shot Multilingual Text Generation via Iterative Distillation · Oct 1, 2022 · Knowledge Distillation, Text Generation · Unverified (0)
TAKE: Topic-shift Aware Knowledge sElection for Dialogue Generation · Oct 1, 2022 · Dialogue Generation, Knowledge Distillation · Code Available (0)
Transferring Knowledge from Structure-aware Self-attention Language Model to Sequence-to-Sequence Semantic Parsing · Oct 1, 2022 · Code Generation, Knowledge Distillation · Unverified (0)
One-Teacher and Multiple-Student Knowledge Distillation on Sentiment Classification · Oct 1, 2022 · Ensemble Learning, Knowledge Distillation · Code Available (0)
Sentiment Interpretable Logic Tensor Network for Aspect-Term Sentiment Analysis · Oct 1, 2022 · Computational Efficiency, Knowledge Distillation · Unverified (0)
Multi-stage Progressive Compression of Conformer Transducer for On-device Speech Recognition · Oct 1, 2022 · Automatic Speech Recognition (ASR) · Unverified (0)
Multimodality Multi-Lead ECG Arrhythmia Classification using Self-Supervised Learning · Sep 30, 2022 · ECG Classification, Knowledge Distillation · Code Available (1)
F-VLM: Open-Vocabulary Object Detection upon Frozen Vision and Language Models · Sep 30, 2022 · Knowledge Distillation, Object Detection · Code Available (0)
Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation · Sep 30, 2022 · Knowledge Distillation · Unverified (0)
Slimmable Networks for Contrastive Self-supervised Learning · Sep 30, 2022 · Contrastive Learning, Knowledge Distillation · Code Available (0)
Using Knowledge Distillation to improve interpretable models in a retail banking context · Sep 30, 2022 · Data Augmentation, Knowledge Distillation · Unverified (0)
Towards a Unified View of Affinity-Based Knowledge Distillation · Sep 30, 2022 · Image Classification · Unverified (0)
Label driven Knowledge Distillation for Federated Learning with non-IID Data · Sep 29, 2022 · Federated Learning, Knowledge Distillation · Unverified (0)
Teaching Where to Look: Attention Similarity Knowledge Distillation for Low Resolution Face Recognition · Sep 29, 2022 · Face Recognition, Knowledge Distillation · Code Available (1)
Hyper-Representations as Generative Models: Sampling Unseen Neural Network Weights · Sep 29, 2022 · Knowledge Distillation, Neural Architecture Search · Code Available (1)