Exploring compressibility of transformer based text-to-music (TTM) models (Jun 24, 2024). Tags: Decoder, FAD
[Unverified] Leveraging Knowledge Distillation for Lightweight Skin Cancer Classification: Balancing Accuracy and Computational Efficiency (Jun 24, 2024). Tags: Cancer Classification, Computational Efficiency
[Unverified] The Privileged Students: On the Value of Initialization in Multilingual Knowledge Distillation (Jun 24, 2024). Tags: Knowledge Distillation
[Unverified] Enhancing OOD Detection Using Latent Diffusion (Jun 24, 2024). Tags: Contrastive Learning, Knowledge Distillation
[Code available] Continual Learning with Diffusion-based Generative Replay for Industrial Streaming Data (Jun 22, 2024). Tags: Continual Learning, Knowledge Distillation
[Unverified] Reinforced Knowledge Distillation for Time Series Regression (Jun 21, 2024). Tags: Knowledge Distillation, Model Compression
[Code available] Fair Text to Medical Image Diffusion Model with Subgroup Distribution Aligned Tuning (Jun 21, 2024). Tags: Knowledge Distillation
[Unverified] Apprenticeship-Inspired Elegance: Synergistic Knowledge Distillation Empowers Spiking Neural Networks for Efficient Single-Eye Emotion Recognition (Jun 20, 2024). Tags: Emotion Recognition, Knowledge Distillation
[Unverified] Factual Dialogue Summarization via Learning from Large Language Models (Jun 20, 2024). Tags: Contrastive Learning, Data Augmentation
[Unverified] Can LLMs Learn by Teaching for Better Reasoning? A Preliminary Study (Jun 20, 2024). Tags: In-Context Learning, Knowledge Distillation
[Code available] SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots (Jun 20, 2024). Tags: In-Context Learning, Knowledge Distillation
[Unverified] Failure-Resilient Distributed Inference with Model Compression over Heterogeneous Edge Devices (Jun 20, 2024). Tags: Knowledge Distillation, Model Compression
[Unverified] Learning to Plan for Retrieval-Augmented Large Language Models from Knowledge Graphs (Jun 20, 2024). Tags: Knowledge Distillation, Knowledge Graphs
[Code available] BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation (Jun 19, 2024). Tags: Knowledge Distillation, Language Modeling
[Code available] WaterMono: Teacher-Guided Anomaly Masking and Enhancement Boosting for Robust Underwater Self-Supervised Monocular Depth Estimation (Jun 19, 2024). Tags: Depth Estimation, Image Enhancement
[Code available] Can Low-Rank Knowledge Distillation in LLMs be Useful for Microelectronic Reasoning? (Jun 19, 2024). Tags: Knowledge Distillation
[Unverified] Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation (Jun 19, 2024). Tags: Knowledge Distillation
[Code available] Intermediate Distillation: Data-Efficient Distillation from Black-Box LLMs for Information Retrieval (Jun 18, 2024). Tags: Information Retrieval, Knowledge Distillation
[Unverified] Vernacular? I Barely Know Her: Challenges with Style Control and Stereotyping (Jun 18, 2024). Tags: Knowledge Distillation
[Unverified] From Instance Training to Instruction Learning: Task Adapters Generation from Instructions (Jun 18, 2024). Tags: Knowledge Distillation
[Code available] Enhancing Single-Slice Segmentation with 3D-to-2D Unpaired Scan Distillation (Jun 18, 2024). Tags: Computed Tomography (CT), Knowledge Distillation
[Unverified] Federated Learning with a Single Shared Image (Jun 18, 2024). Tags: Federated Learning, Knowledge Distillation
[Code available] Mutual Learning for Finetuning Click-Through Rate Prediction Models (Jun 17, 2024). Tags: Click-Through Rate Prediction, Knowledge Distillation
[Unverified] Graph Knowledge Distillation to Mixture of Experts (Jun 17, 2024). Tags: Knowledge Distillation, Mixture-of-Experts
[Code available] Lightweight Model Pre-training via Language Guided Knowledge Distillation (Jun 17, 2024). Tags: Knowledge Distillation
[Code available] STEVE Series: Step-by-Step Construction of Agent Systems in Minecraft (Jun 17, 2024). Tags: Knowledge Distillation, Language Modeling
[Unverified] NLDF: Neural Light Dynamic Fields for Efficient 3D Talking Head Generation (Jun 17, 2024). Tags: Knowledge Distillation, NeRF
[Unverified] Knowledge Distillation in Federated Learning: A Survey on Long Lasting Challenges and New Solutions (Jun 16, 2024). Tags: Federated Learning, Knowledge Distillation
[Unverified] Self-Knowledge Distillation for Learning Ambiguity (Jun 14, 2024). Tags: Knowledge Distillation, Natural Language Understanding
[Unverified] Contextual Distillation Model for Diversified Recommendation (Jun 13, 2024). Tags: Diversity, Knowledge Distillation
[Unverified] PC-LoRA: Low-Rank Adaptation for Progressive Model Compression with Knowledge Distillation (Jun 13, 2024). Tags: Knowledge Distillation, Model Compression
[Unverified] GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model (Jun 12, 2024). Tags: Knowledge Distillation, Self-Supervised Learning
[Unverified] Low-Complexity Acoustic Scene Classification Using Parallel Attention-Convolution Network (Jun 12, 2024). Tags: Acoustic Scene Classification, Data Augmentation
[Code available] Adaptive Teaching with Shared Classifier for Knowledge Distillation (Jun 12, 2024). Tags: Knowledge Distillation
[Code available] Unveiling Incomplete Modality Brain Tumor Segmentation: Leveraging Masked Predicted Auto-Encoder and Divergence Learning (Jun 12, 2024). Tags: Brain Tumor Segmentation, Knowledge Distillation
[Unverified] Guiding Frame-Level CTC Alignments Using Self-Knowledge Distillation (Jun 12, 2024). Tags: Automatic Speech Recognition (ASR)
[Code available] DistilDoc: Knowledge Distillation for Visually-Rich Document Applications (Jun 12, 2024). Tags: Document Image Classification
[Unverified] Self-Distillation Learning Based on Temporal-Spatial Consistency for Spiking Neural Networks (Jun 12, 2024). Tags: Knowledge Distillation
[Unverified] Small Scale Data-Free Knowledge Distillation (Jun 12, 2024). Tags: Data-Free Knowledge Distillation, Generative Adversarial Network
[Code available] FastAST: Accelerating Audio Spectrogram Transformer via Token Merging and Cross-Model Knowledge Distillation (Jun 11, 2024). Tags: Audio Classification, Knowledge Distillation
[Code available] CTC-based Non-autoregressive Textless Speech-to-Speech Translation (Jun 11, 2024). Tags: Knowledge Distillation, Machine Translation
[Code available] TernaryLLM: Ternarized Large Language Model (Jun 11, 2024). Tags: Knowledge Distillation, Language Modeling
[Unverified] Hydra-MDP: End-to-end Multimodal Planning with Multi-target Hydra-Distillation (Jun 11, 2024). Tags: Decoder, Knowledge Distillation
[Code available] Teaching with Uncertainty: Unleashing the Potential of Knowledge Distillation in Object Detection (Jun 11, 2024). Tags: Knowledge Distillation, Object Detection
[Unverified] BS-PLCNet 2: Two-stage Band-split Packet Loss Concealment Network with Intra-model Knowledge Distillation (Jun 10, 2024). Tags: Knowledge Distillation, Packet Loss Concealment
[Unverified] DKDL-Net: A Lightweight Bearing Fault Detection Model via Decoupled Knowledge Distillation and Low-Rank Adaptation Fine-tuning (Jun 10, 2024). Tags: Fault Detection, Fault Diagnosis
[Code available] Weighted KL-Divergence for Document Ranking Model Refinement (Jun 10, 2024). Tags: Contrastive Learning, Document Ranking
[Unverified] Online Policy Distillation with Decision-Attention (Jun 8, 2024). Tags: Deep Reinforcement Learning, Knowledge Distillation
[Unverified] Teaching-Assistant-in-the-Loop: Improving Knowledge Distillation from Imperfect Teacher Models in Low-Budget Scenarios (Jun 8, 2024). Tags: Knowledge Distillation
[Unverified] Data-Free Generative Replay for Class-Incremental Learning on Imbalanced Data (Jun 7, 2024). Tags: Class-Incremental Learning