- FedDW: Distilling Weights through Consistency Optimization in Heterogeneous Federated Learning (Dec 5, 2024). Tags: Federated Learning, Knowledge Distillation
- Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation (Dec 5, 2024) [Code Available]. Tags: Bilevel Optimization, Computational Efficiency
- Expanding Deep Learning-based Sensing Systems with Multi-Source Knowledge Transfer (Dec 5, 2024). Tags: Deep Learning, Knowledge Distillation
- Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification (Dec 4, 2024). Tags: Electroencephalogram (EEG)
- Distillation of Diffusion Features for Semantic Correspondence (Dec 4, 2024). Tags: 3D Reconstruction, Data Augmentation
- Enhancing CLIP Conceptual Embedding through Knowledge Distillation (Dec 4, 2024). Tags: Contrastive Learning, Knowledge Distillation
- Align-KD: Distilling Cross-Modal Alignment Knowledge for Mobile Vision-Language Model (Dec 2, 2024). Tags: Cross-Modal Alignment, Knowledge Distillation
- Multi-View 3D Reconstruction using Knowledge Distillation (Dec 2, 2024) [Code Available]. Tags: 3D Reconstruction, Depth Estimation
- QABISAR: Query-Article Bipartite Interactions for Statutory Article Retrieval (Dec 1, 2024) [Code Available]. Tags: Articles, Knowledge Distillation
- Local vs. Global: Local Land-Use and Land-Cover Models Deliver Higher Quality Maps (Dec 1, 2024). Tags: Earth Observation, Knowledge Distillation
- Continuous Concepts Removal in Text-to-image Diffusion Models (Nov 30, 2024). Tags: Knowledge Distillation
- Toward Fair Graph Neural Networks Via Dual-Teacher Knowledge Distillation (Nov 30, 2024). Tags: Fairness, Graph Representation Learning
- Reverse Thinking Makes LLMs Stronger Reasoners (Nov 29, 2024). Tags: Data Augmentation, Knowledge Distillation
- Headache to Overstock? Promoting Long-tail Items through Debiased Product Bundling (Nov 28, 2024). Tags: Knowledge Distillation, Navigate
- Puzzle: Distillation-Based NAS for Inference-Optimized LLMs (Nov 28, 2024). Tags: GPU, Knowledge Distillation
- Zero-shot Slot Filling in the Age of LLMs for Dialogue Systems (Nov 28, 2024). Tags: Knowledge Distillation, Natural Language Understanding
- Pre-Training Graph Contrastive Masked Autoencoders are Strong Distillers for EEG (Nov 28, 2024). Tags: EEG, Knowledge Distillation
- Active Data Curation Effectively Distills Large-Scale Multimodal Models (Nov 27, 2024). Tags: Decoder, Image Captioning
- Vision Mamba Distillation for Low-resolution Fine-grained Image Classification (Nov 27, 2024). Tags: Classification, Fine-Grained Image Classification
- Improved Implicit Diffusion Model with Knowledge Distillation to Estimate the Spatial Distribution Density of Carbon Stock in Remote Sensing Imagery (Nov 27, 2024) [Code Available]. Tags: Knowledge Distillation
- Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation (Nov 26, 2024). Tags: Data-free Knowledge Distillation, Diversity
- Words Matter: Leveraging Individual Text Embeddings for Code Generation in CLIP Test-Time Adaptation (Nov 26, 2024) [Code Available]. Tags: Code Generation, Contrastive Learning
- Leveraging Foundation Models To Learn the Shape of Semi-Fluid Deformable Objects (Nov 25, 2024) [Code Available]. Tags: Knowledge Distillation, Object
- Dynamic Self-Distillation via Previous Mini-batches for Fine-tuning Small Language Models (Nov 25, 2024). Tags: Knowledge Distillation, Natural Language Understanding
- Ensemble Learning via Knowledge Transfer for CTR Prediction (Nov 25, 2024). Tags: Click-Through Rate Prediction, Ensemble Learning
- Beyond Task Vectors: Selective Task Arithmetic Based on Importance Metrics (Nov 25, 2024) [Code Available]. Tags: Knowledge Distillation, Multi-Task Learning
- O1 Replication Journey -- Part 2: Surpassing O1-preview through Simple Distillation, Big Progress or Bitter Lesson? (Nov 25, 2024). Tags: Hallucination, Knowledge Distillation
- When Babies Teach Babies: Can Student Knowledge Sharing Outperform Teacher-Guided Distillation on Small Datasets? (Nov 25, 2024) [Code Available]. Tags: Knowledge Distillation, Language Modeling
- Learn from Foundation Model: Fruit Detection Model without Manual Annotation (Nov 25, 2024) [Code Available]. Tags: Instance Segmentation, Knowledge Distillation
- TransFair: Transferring Fairness from Ocular Disease Classification to Progression Prediction (Nov 24, 2024) [Code Available]. Tags: Classification, Fairness
- Efficient Ternary Weight Embedding Model: Bridging Scalability and Performance (Nov 23, 2024). Tags: Computational Efficiency, Knowledge Distillation
- Partial Knowledge Distillation for Alleviating the Inherent Inter-Class Discrepancy in Federated Learning (Nov 23, 2024) [Code Available]. Tags: Federated Learning, Knowledge Distillation
- Faithful Label-free Knowledge Distillation (Nov 22, 2024). Tags: Inductive Bias, Knowledge Distillation
- BanglaEmbed: Efficient Sentence Embedding Models for a Low-Resource Language Using Cross-Lingual Distillation Techniques (Nov 22, 2024) [Code Available]. Tags: Hate Speech Detection, Knowledge Distillation
- Adversarial Prompt Distillation for Vision-Language Models (Nov 22, 2024). Tags: Adversarial Robustness, Autonomous Driving
- RankByGene: Gene-Guided Histopathology Representation Learning Through Cross-Modal Ranking Consistency (Nov 22, 2024). Tags: Knowledge Distillation, Representation Learning
- Simplifying CLIP: Unleashing the Power of Large-Scale Models on Consumer-level Computers (Nov 22, 2024). Tags: Data Augmentation, GPU
- Adaptive Group Robust Ensemble Knowledge Distillation (Nov 22, 2024). Tags: Knowledge Distillation
- Improving Mathematical Reasoning Capabilities of Small Language Models via Feedback-Driven Distillation (Nov 22, 2024). Tags: Knowledge Distillation, Mathematical Reasoning
- Information Extraction from Heterogeneous Documents without Ground Truth Labels using Synthetic Label Generation and Knowledge Distillation (Nov 22, 2024). Tags: Anomaly Detection, Document Understanding
- BiomedCoOp: Learning to Prompt for Biomedical Vision-Language Models (Nov 21, 2024). Tags: Image Classification
- WARLearn: Weather-Adaptive Representation Learning (Nov 21, 2024) [Code Available]. Tags: 2D Object Detection, Adversarial Robustness
- Teaching MLPs to Master Heterogeneous Graph-Structured Knowledge for Efficient and Accurate Inference (Nov 21, 2024) [Code Available]. Tags: Graph Learning, Knowledge Distillation
- CLFace: A Scalable and Resource-Efficient Continual Learning Framework for Lifelong Face Recognition (Nov 21, 2024) [Code Available]. Tags: Continual Learning, Face Recognition
- Explainable LLM-driven Multi-dimensional Distillation for E-Commerce Relevance Learning (Nov 20, 2024). Tags: Knowledge Distillation, Large Language Model
- RTSR: A Real-Time Super-Resolution Model for AV1 Compressed Content (Nov 20, 2024). Tags: 4K, Knowledge Distillation
- What Makes a Good Dataset for Knowledge Distillation? (Nov 19, 2024). Tags: Continual Learning, Knowledge Distillation
- Just KIDDIN: Knowledge Infusion and Distillation for Detection of INdecent Memes (Nov 19, 2024). Tags: Knowledge Distillation, Knowledge Graphs
- Reward Modeling with Ordinal Feedback: Wisdom of the Crowd (Nov 19, 2024). Tags: Knowledge Distillation
- KDC-MAE: Knowledge Distilled Contrastive Mask Auto-Encoder (Nov 19, 2024). Tags: Contrastive Learning, Knowledge Distillation