- Retrieval-Oriented Knowledge for Click-Through Rate Prediction (Apr 28, 2024). Tags: Click-Through Rate Prediction, Contrastive Learning
- A Novel Spike Transformer Network for Depth Estimation from Event Cameras via Cross-modality Knowledge Distillation (Apr 26, 2024) [code available]. Tags: Depth Estimation, Knowledge Distillation
- Correlation-Decoupled Knowledge Distillation for Multimodal Sentiment Analysis with Incomplete Modalities (Apr 25, 2024). Tags: Disentanglement, Knowledge Distillation
- Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation (Apr 25, 2024). Tags: Decoder, Depth Estimation
- BeSound: Bluetooth-Based Position Estimation Enhancing with Cross-Modality Distillation (Apr 24, 2024). Tags: Knowledge Distillation, Position
- Compressed Meta-Optical Encoder for Image Classification (Apr 23, 2024). Tags: Image Classification
- Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation (Apr 23, 2024). Tags: Knowledge Distillation, Machine Translation
- Distributed Learning for Wi-Fi AP Load Prediction (Apr 22, 2024). Tags: Federated Learning, Knowledge Distillation
- Towards Multi-Morphology Controllers with Diversity and Knowledge Distillation (Apr 22, 2024). Tags: Diversity, Knowledge Distillation
- DynaMMo: Dynamic Model Merging for Efficient Class Incremental Learning for Medical Images (Apr 22, 2024) [code available]. Tags: Class Incremental Learning
- From LLM to NMT: Advancing Low-Resource Machine Translation with Claude (Apr 22, 2024) [code available]. Tags: Knowledge Distillation, Language Modeling
- CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective (Apr 22, 2024). Tags: Contrastive Learning, Image Classification
- FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning (Apr 22, 2024) [code available]. Tags: Data-free Knowledge Distillation, Federated Learning
- EncodeNet: A Framework for Boosting DNN Accuracy with Entropy-driven Generalized Converting Autoencoder (Apr 21, 2024). Tags: Image Classification
- MergeNet: Knowledge Migration across Heterogeneous Models, Tasks, and Modalities (Apr 20, 2024). Tags: Knowledge Distillation, Transfer Learning
- Dynamic Temperature Knowledge Distillation (Apr 19, 2024). Tags: Knowledge Distillation
- Parameter Efficient Diverse Paraphrase Generation Using Sequence-Level Knowledge Distillation (Apr 19, 2024) [code available]. Tags: Diversity, Knowledge Distillation
- EdgeFusion: On-Device Text-to-Image Generation (Apr 18, 2024). Tags: Image Generation, Knowledge Distillation
- Data-free Knowledge Distillation for Fine-grained Visual Categorization (Apr 18, 2024). Tags: Data-free Knowledge Distillation, Fine-Grained Visual Categorization
- KDk: A Defense Mechanism Against Label Inference Attacks in Vertical Federated Learning (Apr 18, 2024) [code available]. Tags: Federated Learning, Knowledge Distillation
- LAPTOP-Diff: Layer Pruning and Normalized Distillation for Compressing Diffusion Models (Apr 17, 2024). Tags: Knowledge Distillation
- GhostNetV3: Exploring the Training Strategies for Compact Models (Apr 17, 2024). Tags: Image Classification, Knowledge Distillation
- A Progressive Framework of Vision-language Knowledge Distillation and Alignment for Multilingual Scene (Apr 17, 2024). Tags: Image Classification
- Comprehensive Survey of Model Compression and Speed up for Vision Transformers (Apr 16, 2024). Tags: Computational Efficiency, Edge Computing
- MK-SGN: A Spiking Graph Convolutional Network with Multimodal Fusion and Knowledge Distillation for Skeleton-based Action Recognition (Apr 16, 2024). Tags: Action Recognition, Knowledge Distillation
- Camera clustering for scalable stream-based active distillation (Apr 16, 2024). Tags: Clustering, Knowledge Distillation
- Digging into contrastive learning for robust depth estimation with diffusion models (Apr 15, 2024) [code available]. Tags: Contrastive Learning, Denoising
- ReffAKD: Resource-efficient Autoencoder-based Knowledge Distillation (Apr 15, 2024) [code available]. Tags: Knowledge Distillation
- AI-KD: Towards Alignment Invariant Face Image Quality Assessment Using Knowledge Distillation (Apr 15, 2024) [code available]. Tags: Face Alignment, Face Image Quality
- MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution (Apr 15, 2024) [code available]. Tags: Image Super-Resolution, Knowledge Distillation
- Weight Copy and Low-Rank Adaptation for Few-Shot Distillation of Vision Transformers (Apr 14, 2024). Tags: Knowledge Distillation
- Navigating the Landscape of Large Language Models: A Comprehensive Review and Analysis of Paradigms and Fine-Tuning Strategies (Apr 13, 2024) [code available]. Tags: Few-Shot Learning, Knowledge Distillation
- Edge-Efficient Deep Learning Models for Automatic Modulation Classification: A Performance Analysis (Apr 11, 2024) [code available]. Tags: Knowledge Distillation, Model Optimization
- Adversarial Robustness of Distilled and Pruned Deep Learning-based Wireless Classifiers (Apr 11, 2024). Tags: Adversarial Robustness, Knowledge Distillation
- Boosting Self-Supervision for Single-View Scene Completion via Knowledge Distillation (Apr 11, 2024). Tags: Depth Estimation, Depth Prediction
- Rethinking Transformer-Based Blind-Spot Network for Self-Supervised Image Denoising (Apr 11, 2024). Tags: Computational Efficiency, Denoising
- Remembering Transformer for Continual Learning (Apr 11, 2024) [code available]. Tags: Continual Learning, Knowledge Distillation
- A predictive machine learning force field framework for liquid electrolyte development (Apr 10, 2024). Tags: Knowledge Distillation
- Optimization Methods for Personalizing Large Language Models through Retrieval Augmentation (Apr 9, 2024). Tags: Knowledge Distillation, Language Modeling
- Improving Facial Landmark Detection Accuracy and Efficiency with Knowledge Distillation (Apr 9, 2024) [code available]. Tags: Emotion Recognition, Facial Landmark Detection
- Robust feature knowledge distillation for enhanced performance of lightweight crack segmentation models (Apr 9, 2024). Tags: Crack Segmentation, Knowledge Distillation
- CLIP-Embed-KD: Computationally Efficient Knowledge Distillation Using Embeddings as Teachers (Apr 9, 2024). Tags: Knowledge Distillation, Zero-shot Generalization
- GHOST: Grounded Human Motion Generation with Open Vocabulary Scene-and-Text Contexts (Apr 8, 2024) [code available]. Tags: Descriptive, Image Segmentation
- Bootstrapping Chest CT Image Understanding by Distilling Knowledge from X-ray Expert Models (Apr 7, 2024). Tags: Contrastive Learning, Diagnostic
- MonoTAKD: Teaching Assistant Knowledge Distillation for Monocular 3D Object Detection (Apr 7, 2024). Tags: 3D Object Detection, Autonomous Driving
- Diffusion Time-step Curriculum for One Image to 3D Generation (Apr 6, 2024) [code available]. Tags: 3D Generation, Image to 3D
- What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models (Apr 6, 2024) [code available]. Tags: Knowledge Distillation, Language Modeling
- Do We Really Need a Complex Agent System? Distill Embodied Agent into a Single Model (Apr 6, 2024). Tags: Knowledge Distillation
- Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations (Apr 4, 2024). Tags: Counterfactual, Knowledge Distillation
- On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models (Apr 4, 2024) [code available]. Tags: Contrastive Learning, Knowledge Distillation