Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head | Nov 13, 2024 | Attribute, Knowledge Distillation | Unverified
UIFormer: A Unified Transformer-based Framework for Incremental Few-Shot Object Detection and Instance Segmentation | Nov 13, 2024 | Decoder, Few-Shot Object Detection | Unverified
Feature Interaction Fusion Self-Distillation Network For CTR Prediction | Nov 12, 2024 | Click-Through Rate Prediction, Knowledge Distillation | Unverified
Query Optimization for Parametric Knowledge Refinement in Retrieval-Augmented Large Language Models | Nov 12, 2024 | Knowledge Distillation, Question Answering | Unverified
Learning with Less: Knowledge Distillation from Large Language Models via Unlabeled Data | Nov 12, 2024 | Knowledge Distillation | Unverified
Joint Diffusion models in Continual Learning | Nov 12, 2024 | Continual Learning, Knowledge Distillation | Unverified
Quantifying Knowledge Distillation Using Partial Information Decomposition | Nov 12, 2024 | Knowledge Distillation, Transfer Learning | Unverified
An Efficient Memory Module for Graph Few-Shot Class-Incremental Learning | Nov 11, 2024 | Class-Incremental Learning | Code Available
CULL-MT: Compression Using Language and Layer pruning for Machine Translation | Nov 10, 2024 | Knowledge Distillation, Machine Translation | Unverified
Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation | Nov 10, 2024 | Knowledge Distillation, Tensor Decomposition | Code Available
Dynamic Textual Prompt For Rehearsal-free Lifelong Person Re-identification | Nov 9, 2024 | Knowledge Distillation, Person Re-Identification | Unverified
Multi-Document Financial Question Answering using LLMs | Nov 8, 2024 | Knowledge Distillation, Knowledge Graphs | Unverified
Knowledge Distillation Neural Network for Predicting Car-following Behaviour of Human-driven and Autonomous Vehicles | Nov 8, 2024 | Autonomous Vehicles, Descriptive | Unverified
Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion | Nov 8, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified
Asterisk*: Keep it Simple | Nov 8, 2024 | Classification, Knowledge Distillation | Unverified
Mitigating Hallucination with ZeroG: An Advanced Knowledge Management Engine | Nov 8, 2024 | Computational Efficiency, Hallucination | Unverified
Performance-Guided LLM Knowledge Distillation for Efficient Text Classification at Scale | Nov 7, 2024 | Active Learning, Benchmarking | Unverified
GazeGen: Gaze-Driven User Interaction for Visual Content Generation | Nov 7, 2024 | Gaze Estimation, Knowledge Distillation | Unverified
Towards Personalized Federated Learning via Comprehensive Knowledge Distillation | Nov 6, 2024 | Federated Learning, Knowledge Distillation | Unverified
Multimodal Commonsense Knowledge Distillation for Visual Question Answering | Nov 5, 2024 | Knowledge Distillation, Question Answering | Unverified
Transformer-Based Fault-Tolerant Control for Fixed-Wing UAVs Using Knowledge Distillation and In-Context Adaptation | Nov 5, 2024 | Fault Detection, In-Context Learning | Unverified
Centerness-based Instance-aware Knowledge Distillation with Task-wise Mutual Lifting for Object Detection on Drone Imagery | Nov 5, 2024 | Knowledge Distillation, Object Detection | Unverified
Training on the Test Model: Contamination in Ranking Distillation | Nov 4, 2024 | Knowledge Distillation | Code Available
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment | Nov 3, 2024 | Knowledge Distillation, Philosophy | Unverified
Towards Building Secure UAV Navigation with FHE-aware Knowledge Distillation | Nov 1, 2024 | Knowledge Distillation, Reinforcement Learning (RL) | Unverified
Adapting While Learning: Grounding LLMs for Scientific Problems with Intelligent Tool Usage Adaptation | Nov 1, 2024 | Epidemiology, Knowledge Distillation | Unverified
On the Impact of White-box Deployment Strategies for Edge AI on Latency and Model Performance | Nov 1, 2024 | Knowledge Distillation | Unverified
Semantic Knowledge Distillation for Onboard Satellite Earth Observation Image Classification | Oct 31, 2024 | Earth Observation, Image Classification | Code Available
IP-MOT: Instance Prompt Learning for Cross-Domain Multi-Object Tracking | Oct 30, 2024 | Knowledge Distillation, Language Modelling | Unverified
The Graph's Apprentice: Teaching an LLM Low Level Knowledge for Circuit Quality Estimation | Oct 30, 2024 | Knowledge Distillation | Unverified
Unsupervised Training of a Dynamic Context-Aware Deep Denoising Framework for Low-Dose Fluoroscopic Imaging | Oct 29, 2024 | Denoising, Diagnostic | Code Available
Deep Learning for Medical Text Processing: BERT Model Fine-Tuning and Comparative Study | Oct 28, 2024 | Knowledge Distillation | Unverified
Knowledge Distillation for Real-Time Classification of Early Media in Voice Communications | Oct 28, 2024 | Audio Tagging, Classification | Unverified
Unveiling Context-Aware Criteria in Self-Assessing LLMs | Oct 28, 2024 | Knowledge Distillation | Unverified
Relaxed Recursive Transformers: Effective Parameter Sharing with Layer-wise LoRA | Oct 28, 2024 | Knowledge Distillation | Unverified
SWITCH: Studying with Teacher for Knowledge Distillation of Large Language Models | Oct 25, 2024 | Instruction Following, Knowledge Distillation | Unverified
Knowledge Distillation Using Frontier Open-source LLMs: Generalizability and the Role of Synthetic Data | Oct 24, 2024 | Knowledge Distillation, Natural Language Understanding | Unverified
AlignCap: Aligning Speech Emotion Captioning to Human Preferences | Oct 24, 2024 | Knowledge Distillation, Language Modeling | Unverified
SIKeD: Self-guided Iterative Knowledge Distillation for mathematical reasoning | Oct 24, 2024 | Knowledge Distillation, Mathematical Reasoning | Code Available
High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws | Oct 24, 2024 | Knowledge Distillation, Regression | Unverified
Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation | Oct 23, 2024 | Data-free Knowledge Distillation, Diversity | Code Available
ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams | Oct 23, 2024 | Automatic Speech Recognition (ASR) | Unverified
Towards Active Participant-Centric Vertical Federated Learning: Some Representations May Be All You Need | Oct 23, 2024 | Federated Learning | Unverified
AttriPrompter: Auto-Prompting with Attribute Semantics for Zero-shot Nuclei Detection via Visual-Language Pre-trained Models | Oct 22, 2024 | Attribute, Knowledge Distillation | Code Available
SafetyAnalyst: Interpretable, Transparent, and Steerable Safety Moderation for AI Behavior | Oct 22, 2024 | Knowledge Distillation | Unverified
CK4Gen: A Knowledge Distillation Framework for Generating High-Utility Synthetic Survival Datasets in Healthcare | Oct 22, 2024 | Data Augmentation, Knowledge Distillation | Unverified
Pre-training Distillation for Large Language Models: A Design Space Exploration | Oct 21, 2024 | Knowledge Distillation | Unverified
Model Mimic Attack: Knowledge Distillation for Provably Transferable Adversarial Examples | Oct 21, 2024 | Knowledge Distillation | Unverified
GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning | Oct 20, 2024 | Image Retrieval, Image-text Retrieval | Code Available
LLaVA-Ultra: Large Chinese Language and Vision Assistant for Ultrasound | Oct 19, 2024 | Instruction Following, Knowledge Distillation | Unverified