Wake Vision: A Tailored Dataset and Benchmark Suite for TinyML Computer Vision Applications. May 1, 2024. Tags: Human Detection, Knowledge Distillation.
Error Exponent in Agnostic PAC Learning. May 1, 2024. Tags: Binary Classification, Knowledge Distillation. Unverified.
Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism. Apr 30, 2024. Tags: Data Augmentation, Diversity. Unverified.
Knowledge Distillation vs. Pretraining from Scratch under a Fixed (Computation) Budget. Apr 30, 2024. Tags: Knowledge Distillation, Language Modeling. Code available.
Control Policy Correction Framework for Reinforcement Learning-based Energy Arbitrage Strategies. Apr 29, 2024. Tags: Knowledge Distillation, Reinforcement Learning. Unverified.
Revealing the Two Sides of Data Augmentation: An Asymmetric Distillation-based Win-Win Solution for Open-Set Recognition. Apr 28, 2024. Tags: Data Augmentation, Knowledge Distillation. Unverified.
Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation. Apr 28, 2024. Tags: Action Recognition, General Knowledge. Unverified.
A Novel Spike Transformer Network for Depth Estimation from Event Cameras via Cross-modality Knowledge Distillation. Apr 26, 2024. Tags: Depth Estimation, Knowledge Distillation. Unverified.
Correlation-Decoupled Knowledge Distillation for Multimodal Sentiment Analysis with Incomplete Modalities. Apr 25, 2024. Tags: Disentanglement, Knowledge Distillation. Unverified.
Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation. Apr 25, 2024. Tags: Decoder, Depth Estimation. Unverified.
BeSound: Bluetooth-Based Position Estimation Enhancing with Cross-Modality Distillation. Apr 24, 2024. Tags: Knowledge Distillation, Position. Unverified.
Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation. Apr 23, 2024. Tags: Knowledge Distillation, Machine Translation. Unverified.
Compressed Meta-Optical Encoder for Image Classification. Apr 23, 2024. Tags: Classification, Image Classification. Unverified.
From LLM to NMT: Advancing Low-Resource Machine Translation with Claude. Apr 22, 2024. Tags: Knowledge Distillation, Language Modeling. Unverified.
FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning. Apr 22, 2024. Tags: Data-free Knowledge Distillation, Federated Learning. Unverified.
DynaMMo: Dynamic Model Merging for Efficient Class Incremental Learning for Medical Images. Apr 22, 2024. Tags: Class Incremental Learning. Unverified.
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective. Apr 22, 2024. Tags: Contrastive Learning, Image Classification. Code available.
Distributed Learning for Wi-Fi AP Load Prediction. Apr 22, 2024. Tags: Federated Learning, Knowledge Distillation. Code available.
Towards Multi-Morphology Controllers with Diversity and Knowledge Distillation. Apr 22, 2024. Tags: Diversity, Knowledge Distillation. Unverified.
EncodeNet: A Framework for Boosting DNN Accuracy with Entropy-driven Generalized Converting Autoencoder. Apr 21, 2024. Tags: Image Classification. Code available.
MergeNet: Knowledge Migration across Heterogeneous Models, Tasks, and Modalities. Apr 20, 2024. Tags: Knowledge Distillation, Transfer Learning. Unverified.
Parameter Efficient Diverse Paraphrase Generation Using Sequence-Level Knowledge Distillation. Apr 19, 2024. Tags: Diversity, Knowledge Distillation. Unverified.
Data-free Knowledge Distillation for Fine-grained Visual Categorization. Apr 18, 2024. Tags: Data-free Knowledge Distillation, Fine-Grained Visual Categorization. Unverified.
EdgeFusion: On-Device Text-to-Image Generation. Apr 18, 2024. Tags: Image Generation, Knowledge Distillation. Code available.
KDk: A Defense Mechanism Against Label Inference Attacks in Vertical Federated Learning. Apr 18, 2024. Tags: Federated Learning, Knowledge Distillation. Unverified.
GhostNetV3: Exploring the Training Strategies for Compact Models. Apr 17, 2024. Tags: Image Classification, Knowledge Distillation. Unverified.
LAPTOP-Diff: Layer Pruning and Normalized Distillation for Compressing Diffusion Models. Apr 17, 2024. Tags: Knowledge Distillation. Unverified.
A Progressive Framework of Vision-language Knowledge Distillation and Alignment for Multilingual Scene. Apr 17, 2024. Tags: Image Classification. Unverified.
MK-SGN: A Spiking Graph Convolutional Network with Multimodal Fusion and Knowledge Distillation for Skeleton-based Action Recognition. Apr 16, 2024. Tags: Action Recognition, Knowledge Distillation. Unverified.
Comprehensive Survey of Model Compression and Speed up for Vision Transformers. Apr 16, 2024. Tags: Computational Efficiency, Edge Computing. Unverified.
AI-KD: Towards Alignment Invariant Face Image Quality Assessment Using Knowledge Distillation. Apr 15, 2024. Tags: Face Alignment, Face Image Quality. Unverified.
MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution. Apr 15, 2024. Tags: Image Super-Resolution, Knowledge Distillation. Code available.
ReffAKD: Resource-efficient Autoencoder-based Knowledge Distillation. Apr 15, 2024. Tags: Knowledge Distillation. Unverified.
Weight Copy and Low-Rank Adaptation for Few-Shot Distillation of Vision Transformers. Apr 14, 2024. Tags: Knowledge Distillation. Code available.
Navigating the Landscape of Large Language Models: A Comprehensive Review and Analysis of Paradigms and Fine-Tuning Strategies. Apr 13, 2024. Tags: Few-Shot Learning, Knowledge Distillation. Code available.
Boosting Self-Supervision for Single-View Scene Completion via Knowledge Distillation. Apr 11, 2024. Tags: Depth Estimation, Depth Prediction. Code available.
Remembering Transformer for Continual Learning. Apr 11, 2024. Tags: Continual Learning, Knowledge Distillation. Unverified.
Edge-Efficient Deep Learning Models for Automatic Modulation Classification: A Performance Analysis. Apr 11, 2024. Tags: Knowledge Distillation, Model Optimization. Unverified.
Adversarial Robustness of Distilled and Pruned Deep Learning-based Wireless Classifiers. Apr 11, 2024. Tags: Adversarial Robustness, Knowledge Distillation. Unverified.
A predictive machine learning force field framework for liquid electrolyte development. Apr 10, 2024. Tags: Knowledge Distillation. Unverified.
Improving Facial Landmark Detection Accuracy and Efficiency with Knowledge Distillation. Apr 9, 2024. Tags: Emotion Recognition, Facial Landmark Detection. Unverified.
Robust feature knowledge distillation for enhanced performance of lightweight crack segmentation models. Apr 9, 2024. Tags: Crack Segmentation, Knowledge Distillation. Unverified.
GHOST: Grounded Human Motion Generation with Open Vocabulary Scene-and-Text Contexts. Apr 8, 2024. Tags: Descriptive, Image Segmentation. Unverified.
Bootstrapping Chest CT Image Understanding by Distilling Knowledge from X-ray Expert Models. Apr 7, 2024. Tags: Contrastive Learning, Diagnostic. Unverified.
What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models. Apr 6, 2024. Tags: Knowledge Distillation, Language Modeling. Unverified.
Do We Really Need a Complex Agent System? Distill Embodied Agent into a Single Model. Apr 6, 2024. Tags: Knowledge Distillation. Unverified.
Goldfish: An Efficient Federated Unlearning Framework. Apr 4, 2024. Tags: Knowledge Distillation, Machine Unlearning. Unverified.
On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models. Apr 4, 2024. Tags: Contrastive Learning, Knowledge Distillation. Code available.
Okay, Let's Do This! Modeling Event Coreference with Generated Rationales and Knowledge Distillation. Apr 4, 2024. Tags: Clustering, Coreference Resolution. Code available.
Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations. Apr 4, 2024. Tags: Counterfactual, Knowledge Distillation. Code available.