Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages (May 25, 2023). Tags: Knowledge Distillation, Machine Translation. Code: unverified.
Fairness Continual Learning Approach to Semantic Scene Understanding in Open-World Environments (May 25, 2023). Tags: Continual Learning, Continual Semantic Segmentation. Code: unverified.
Triplet Knowledge Distillation (May 25, 2023). Tags: Face Recognition, Image Classification. Code: unverified.
Collective Knowledge Graph Completion with Mutual Knowledge Distillation (May 25, 2023). Tags: Knowledge Distillation, Knowledge Graph Completion. Code: unverified.
On the Impact of Knowledge Distillation for Model Interpretability (May 25, 2023). Tags: Knowledge Distillation. Code: unverified.
PruMUX: Augmenting Data Multiplexing with Model Compression (May 24, 2023). Tags: Knowledge Distillation, Model Compression. Code: available.
Incorporating Ultrasound Tongue Images for Audio-Visual Speech Enhancement through Knowledge Distillation (May 24, 2023). Tags: Automatic Speech Recognition (ASR). Code: unverified.
Deakin RF-Sensing: Experiments on Correlated Knowledge Distillation for Monitoring Human Postures with Radios (May 24, 2023). Tags: Knowledge Distillation. Code: unverified.
HARD: Hard Augmentations for Robust Distillation (May 24, 2023). Tags: Data Augmentation, Domain Generalization. Code: unverified.
AdvFunMatch: When Consistent Teaching Meets Adversarial Robustness (May 24, 2023). Tags: Adversarial Robustness, Knowledge Distillation. Code: unverified.
Just CHOP: Embarrassingly Simple LLM Compression (May 24, 2023). Tags: Knowledge Distillation, Language Modeling. Code: unverified.
Masked Modeling Duo for Speech: Specializing General-Purpose Audio Representation to Speech using Denoising Distillation (May 23, 2023). Tags: Denoising, Knowledge Distillation. Code: unverified.
Sequence-Level Knowledge Distillation for Class-Incremental End-to-End Spoken Language Understanding (May 23, 2023). Tags: Continual Learning, Decoder. Code: unverified.
Transferring Learning Trajectories of Neural Networks (May 23, 2023). Tags: Knowledge Distillation. Code: unverified.
One-stop Training of Multiple Capacity Models (May 23, 2023). Tags: Knowledge Distillation, Machine Translation. Code: unverified.
D^2TV: Dual Knowledge Distillation and Target-oriented Vision Modeling for Many-to-Many Multimodal Summarization (May 22, 2023). Tags: Knowledge Distillation. Code: available.
EnSiam: Self-Supervised Learning With Ensemble Representations (May 22, 2023). Tags: Contrastive Learning, Knowledge Distillation. Code: unverified.
Distilling Robustness into Natural Language Inference Models with Domain-Targeted Augmentation (May 22, 2023). Tags: Data Augmentation, Knowledge Distillation. Code: unverified.
Revisiting Data Augmentation in Model Compression: An Empirical and Comprehensive Study (May 22, 2023). Tags: Data Augmentation, Knowledge Distillation. Code: unverified.
One-Shot Federated Learning for LEO Constellations that Reduces Convergence Time from Days to 90 Minutes (May 21, 2023). Tags: Federated Learning, Knowledge Distillation. Code: unverified.
DualVC: Dual-mode Voice Conversion using Intra-model Knowledge Distillation and Hybrid Predictive Coding (May 21, 2023). Tags: Data Augmentation, Decoder. Code: unverified.
Understanding the Effect of Data Augmentation on Knowledge Distillation (May 21, 2023). Tags: Data Augmentation, Knowledge Distillation. Code: unverified.
Accurate Knowledge Distillation with n-best Reranking (May 20, 2023). Tags: Knowledge Distillation, Reranking. Code: unverified.
Sentence Embedder Guided Utterance Encoder (SEGUE) for Spoken Language Understanding (May 20, 2023). Tags: Knowledge Distillation, Sentence. Code: available.
Pseudo-Label Training and Model Inertia in Neural Machine Translation (May 19, 2023). Tags: Knowledge Distillation, Machine Translation. Code: unverified.
Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling (May 18, 2023). Tags: Knowledge Distillation. Code: available.
BERM: Training the Balanced and Extractable Representation for Matching to Improve Generalization Ability of Dense Retrieval (May 18, 2023). Tags: Information Retrieval, Knowledge Distillation. Code: unverified.
DQ-Whisper: Joint Distillation and Quantization for Efficient Multilingual Speech Recognition (May 18, 2023). Tags: Knowledge Distillation, Quantization. Code: unverified.
Boost Vision Transformer with GPU-Friendly Sparsity and Quantization (May 18, 2023). Tags: Benchmarking, GPU. Code: unverified.
Student-friendly Knowledge Distillation (May 18, 2023). Tags: Knowledge Distillation. Code: unverified.
When Gradient Descent Meets Derivative-Free Optimization: A Match Made in Black-Box Scenario (May 17, 2023). Tags: Knowledge Distillation. Code: unverified.
Lightweight Self-Knowledge Distillation with Multi-source Information Fusion (May 16, 2023). Tags: Knowledge Distillation, Self-Knowledge Distillation. Code: available.
Weight-Inherited Distillation for Task-Agnostic BERT Compression (May 16, 2023). Tags: Knowledge Distillation. Code: available.
Distilling Knowledge for Short-to-Long Term Trajectory Prediction (May 15, 2023). Tags: Knowledge Distillation, Prediction. Code: unverified.
Soft Prompt Decoding for Multilingual Dense Retrieval (May 15, 2023). Tags: Cross-Lingual Information Retrieval, Information Retrieval. Code: unverified.
Improving Defensive Distillation using Teacher Assistant (May 14, 2023). Tags: Face Recognition, Knowledge Distillation. Code: unverified.
On enhancing the robustness of Vision Transformers: Defensive Diffusion (May 14, 2023). Tags: Computational Efficiency, Denoising. Code: available.
Towards Understanding and Improving Knowledge Distillation for Neural Machine Translation (May 14, 2023). Tags: Knowledge Distillation, Machine Translation. Code: available.
Analyzing Compression Techniques for Computer Vision (May 14, 2023). Tags: Knowledge Distillation, Quantization. Code: unverified.
GSB: Group Superposition Binarization for Vision Transformer with Limited Training Samples (May 13, 2023). Tags: Binarization, Knowledge Distillation. Code: available.
Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation (May 13, 2023). Tags: Domain Adaptation, Knowledge Distillation. Code: unverified.
AMTSS: An Adaptive Multi-Teacher Single-Student Knowledge Distillation Framework For Multilingual Language Inference (May 13, 2023). Tags: Knowledge Distillation. Code: unverified.
Knowledge distillation with Segment Anything (SAM) model for Planetary Geological Mapping (May 12, 2023). Tags: Decoder, Image Segmentation. Code: unverified.
A Lightweight Domain Adversarial Neural Network Based on Knowledge Distillation for EEG-based Cross-subject Emotion Recognition (May 12, 2023). Tags: Electroencephalogram (EEG). Code: unverified.
Long-Tailed Question Answering in an Open World (May 11, 2023). Tags: Knowledge Distillation, Language Modeling. Code: unverified.
A Survey on the Robustness of Computer Vision Models against Common Corruptions (May 10, 2023). Tags: Data Augmentation, Knowledge Distillation. Code: available.
Explainable Knowledge Distillation for On-device Chest X-Ray Classification (May 10, 2023). Tags: Explainable Artificial Intelligence (XAI). Code: unverified.
DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing (May 9, 2023). Tags: Knowledge Distillation. Code: unverified.
SRIL: Selective Regularization for Class-Incremental Learning (May 9, 2023). Tags: Class-Incremental Learning. Code: unverified.
Multi-Teacher Knowledge Distillation For Text Image Machine Translation (May 9, 2023). Tags: Decoder, Knowledge Distillation. Code: available.