- Boosting Summarization with Normalizing Flows and Aggressive Training (Nov 1, 2023). Tags: Decoder, Knowledge Distillation.
- Interactive Multi-fidelity Learning for Cost-effective Adaptation of Language Model with Sparse Human Supervision (Oct 31, 2023). Tags: Informativeness, Knowledge Distillation. [Code available]
- AMLNet: Adversarial Mutual Learning Neural Network for Non-AutoRegressive Multi-Horizon Time Series Forecasting (Oct 30, 2023). Tags: Decoder, Diversity.
- MUST: A Multilingual Student-Teacher Learning approach for low-resource speech recognition (Oct 29, 2023). Tags: Knowledge Distillation, Speech Recognition. [Code available]
- RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis (Oct 29, 2023). Tags: Image Classification, Knowledge Distillation.
- Ever Evolving Evaluator (EV3): Towards Flexible and Reliable Meta-Optimization for Knowledge Distillation (Oct 29, 2023). Tags: Diversity, Evolutionary Algorithms.
- ODM3D: Alleviating Foreground Sparsity for Semi-Supervised Monocular 3D Object Detection (Oct 28, 2023). Tags: 3D Object Detection, Autonomous Driving.
- Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation (Oct 28, 2023). Tags: Knowledge Distillation, Object. [Code available]
- Discourse Structures Guided Fine-grained Propaganda Identification (Oct 28, 2023). Tags: Attribute, Knowledge Distillation.
- Towards a Unified Conversational Recommendation System: Multi-task Learning via Contextualized Knowledge Distillation (Oct 27, 2023). Tags: Conversational Recommendation, Diversity. [Code available]
- Multi-label Emotion Analysis in Conversation via Multimodal Knowledge Distillation (Oct 27, 2023). Tags: Emotion Recognition, Knowledge Distillation. [Code available]
- torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP (Oct 26, 2023). Tags: Image Classification.
- Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model (Oct 26, 2023). Tags: Data Augmentation, General Knowledge.
- SonoSAMTrack -- Segment and Track Anything on Ultrasound Images (Oct 25, 2023). Tags: Knowledge Distillation. [Code available]
- TOP-Training: Target-Oriented Pretraining for Medical Extractive Question Answering (Oct 25, 2023). Tags: Domain Adaptation, Extractive Question-Answering.
- Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data (Oct 24, 2023). Tags: Data-free Knowledge Distillation, Knowledge Distillation. [Code available]
- Wakening Past Concepts without Past Data: Class-Incremental Learning from Online Placebos (Oct 24, 2023). Tags: Class-Incremental Learning. [Code available]
- ABKD: Graph Neural Network Compression with Attention-Based Knowledge Distillation (Oct 24, 2023). Tags: Drug Discovery, Fake News Detection.
- MCC-KD: Multi-CoT Consistent Knowledge Distillation (Oct 23, 2023). Tags: Diversity, Knowledge Distillation.
- Leveraging Complementary Attention maps in vision transformers for OCT image analysis (Oct 21, 2023). Tags: Knowledge Distillation. [Code available]
- Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images (Oct 20, 2023). Tags: Data Augmentation, Data-free Knowledge Distillation.
- DistillCSE: Distilled Contrastive Learning for Sentence Embeddings (Oct 20, 2023). Tags: Contrastive Learning, Knowledge Distillation.
- GenDistiller: Distilling Pre-trained Language Models based on Generative Models (Oct 20, 2023). Tags: Knowledge Distillation, Language Modeling. [Code available]
- Enhancing Abstractiveness of Summarization Models through Calibrated Distillation (Oct 20, 2023). Tags: Abstractive Text Summarization, Informativeness.
- Leveraging Knowledge Distillation for Efficient Deep Reinforcement Learning in Resource-Constrained Environments (Oct 16, 2023). Tags: Decision Making, Deep Reinforcement Learning.
- A Comparative Analysis of Task-Agnostic Distillation Methods for Compressing Transformer Language Models (Oct 13, 2023). Tags: Knowledge Distillation. [Code available]
- Revisiting Multi-modal 3D Semantic Segmentation in Real-world Autonomous Driving (Oct 13, 2023). Tags: 3D Semantic Segmentation, Autonomous Driving.
- DistillSpec: Improving Speculative Decoding via Knowledge Distillation (Oct 12, 2023). Tags: Knowledge Distillation, Language Modelling.
- Retrieve Anything To Augment Large Language Models (Oct 11, 2023). Tags: Knowledge Distillation, Retrieval.
- Distilling Efficient Vision Transformers from CNNs for Semantic Segmentation (Oct 11, 2023). Tags: Knowledge Distillation, Semantic Segmentation.
- Distillation Improves Visual Place Recognition for Low Quality Images (Oct 10, 2023). Tags: Knowledge Distillation, Quantization.
- Leveraging Diffusion-Based Image Variations for Robust Training on Poisoned Data (Oct 10, 2023). Tags: Knowledge Distillation. [Code available]
- Knowledge Distillation for Anomaly Detection (Oct 9, 2023). Tags: Anomaly Detection, Knowledge Distillation. [Code available]
- What do larger image classifiers memorise? (Oct 9, 2023). Tags: Image Classification.
- Applying Knowledge Distillation to Improve Weed Mapping With Drones (Oct 8, 2023). Tags: Knowledge Distillation, Management.
- Fair Feature Importance Scores for Interpreting Tree-Based Methods and Surrogates (Oct 6, 2023). Tags: Fairness, Feature Importance. [Code available]
- DED: Diagnostic Evidence Distillation for acne severity grading on face images (Oct 5, 2023). Tags: Acne Severity Grading, Diagnostic.
- Improving Knowledge Distillation with Teacher's Explanation (Oct 4, 2023). Tags: Knowledge Distillation. [Code available]
- Talking Models: Distill Pre-trained Knowledge to Downstream Models via Interactive Communication (Oct 4, 2023). Tags: Decoder, Knowledge Distillation.
- I^2KD-SLU: An Intra-Inter Knowledge Distillation Framework for Zero-Shot Cross-Lingual Spoken Language Understanding (Oct 4, 2023). Tags: Intent Detection, Knowledge Distillation.
- Heterogeneous Federated Learning Using Knowledge Codistillation (Oct 4, 2023). Tags: Federated Learning, Image Classification.
- Can a student Large Language Model perform as well as it's teacher? (Oct 3, 2023). Tags: Knowledge Distillation, Language Modeling.
- Towards LogiGLUE: A Brief Survey and A Benchmark for Analyzing Logical Reasoning Capabilities of Language Models (Oct 2, 2023). Tags: Knowledge Distillation, Language Modelling.
- KGEx: Explaining Knowledge Graph Embeddings via Subgraph Sampling and Knowledge Distillation (Oct 2, 2023). Tags: Knowledge Distillation, Knowledge Graph Embeddings.
- Learnable Cross-modal Knowledge Distillation for Multi-modal Learning with Missing Modality (Oct 2, 2023). Tags: Knowledge Distillation.
- Towards Fixing Clever-Hans Predictors with Counterfactual Knowledge Distillation (Oct 2, 2023). Tags: Counterfactual, Knowledge Distillation.
- Distilling Influences to Mitigate Prediction Churn in Graph Neural Networks (Oct 2, 2023). Tags: Knowledge Distillation, Node Classification.
- Adaptive Decoupled Pose Knowledge Distillation (Oct 1, 2023). Tags: Knowledge Distillation, Pose Estimation. [Code available]
- Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression (Sep 30, 2023). Tags: Inductive Bias, Knowledge Distillation. [Code available]
- Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation (Sep 29, 2023). Tags: Cross-Lingual Question Answering, Cross-Lingual Transfer.