Leveraging Expert Models for Training Deep Neural Networks in Scarce Data Domains: Application to Offline Handwritten Signature Verification Aug 2, 2023 Knowledge Distillation
— Unverified 0Spatio-Temporal Branching for Motion Prediction using Motion Increments Aug 2, 2023 Human motion prediction Knowledge Distillation
Code Code Available 0Towards Better Query Classification with Multi-Expert Knowledge Condensation in JD Ads Search Aug 2, 2023 Knowledge Distillation
— Unverified 0NormKD: Normalized Logits for Knowledge Distillation Aug 1, 2023 image-classification Image Classification
Code Code Available 1Ada-DQA: Adaptive Diverse Quality-aware Feature Acquisition for Video Quality Assessment Aug 1, 2023 Diversity Knowledge Distillation
— Unverified 0Online Prototype Learning for Online Continual Learning Aug 1, 2023 Continual Learning Knowledge Distillation
Code Code Available 1Can Self-Supervised Representation Learning Methods Withstand Distribution Shifts and Corruptions? Jul 31, 2023 Contrastive Learning Knowledge Distillation
Code Code Available 0Federated Learning for Data and Model Heterogeneity in Medical Imaging Jul 31, 2023 Federated Learning Knowledge Distillation
— Unverified 0BearingPGA-Net: A Lightweight and Deployable Bearing Fault Diagnosis Network via Decoupled Knowledge Distillation and FPGA Acceleration Jul 31, 2023 CPU Fault Diagnosis
Code Code Available 1Sampling to Distill: Knowledge Transfer from Open-World Data Jul 31, 2023 Data-free Knowledge Distillation Knowledge Distillation
— Unverified 0Subspace Distillation for Continual Learning Jul 31, 2023 Continual Learning Knowledge Distillation
Code Code Available 0UPFL: Unsupervised Personalized Federated Learning towards New Clients Jul 29, 2023 Federated Learning Knowledge Distillation
Code Code Available 0Effective Whole-body Pose Estimation with Two-stages Distillation Jul 29, 2023 2D Human Pose Estimation Knowledge Distillation
Code Code Available 4f-Divergence Minimization for Sequence-Level Knowledge Distillation Jul 27, 2023 Knowledge Distillation
Code Code Available 1Incrementally-Computable Neural Networks: Efficient Inference for Dynamic Inputs Jul 27, 2023 Document Classification Knowledge Distillation
— Unverified 0Fitting Auditory Filterbanks with Multiresolution Neural Networks Jul 25, 2023 Inductive Bias Knowledge Distillation
Code Code Available 1Mitigating Cross-client GANs-based Attack in Federated Learning Jul 25, 2023 Data-free Knowledge Distillation Federated Learning
— Unverified 0MetricGAN-OKD: Multi-Metric Optimization of MetricGAN via Online Knowledge Distillation for Speech Enhancement Jul 24, 2023 Knowledge Distillation Speech Enhancement
Code Code Available 1A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation Jul 24, 2023 Knowledge Distillation Semantic Segmentation
— Unverified 0CLIP-KD: An Empirical Study of CLIP Model Distillation Jul 24, 2023 Contrastive Learning Cross-Modal Retrieval
Code Code Available 1HeteFedRec: Federated Recommender Systems with Model Heterogeneity Jul 24, 2023 Knowledge Distillation model
— Unverified 0Model Compression Methods for YOLOv5: A Review Jul 21, 2023 Knowledge Distillation model
— Unverified 0DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport Jul 21, 2023 Denoising Knowledge Distillation
Code Code Available 1Distribution Shift Matters for Knowledge Distillation with Webly Collected Images Jul 21, 2023 Contrastive Learning Data-free Knowledge Distillation
— Unverified 0Quantized Feature Distillation for Network Quantization Jul 20, 2023 image-classification Image Classification
— Unverified 0Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering Jul 20, 2023 Clustering Data Augmentation
Code Code Available 0Reverse Knowledge Distillation: Training a Large Model using a Small One for Retinal Image Matching on Limited Data Jul 20, 2023 Image Registration Keypoint Detection
Code Code Available 1LightPath: Lightweight and Scalable Path Representation Learning Jul 19, 2023 Knowledge Distillation Relational Reasoning
Code Code Available 0Teach model to answer questions after comprehending the document Jul 18, 2023 Knowledge Distillation Machine Reading Comprehension
— Unverified 0FedDefender: Client-Side Attack-Tolerant Federated Learning Jul 18, 2023 Federated Learning Knowledge Distillation
Code Code Available 1A Survey on Open-Vocabulary Detection and Segmentation: Past, Present, and Future Jul 18, 2023 Knowledge Distillation object-detection
Code Code Available 2Knowledge Distillation for Object Detection: from generic to remote sensing datasets Jul 18, 2023 Knowledge Distillation Model Compression
— Unverified 0Class-relation Knowledge Distillation for Novel Class Discovery Jul 18, 2023 Knowledge Distillation Novel Class Discovery
Code Code Available 1Domain Knowledge Distillation from Large Language Model: An Empirical Study in the Autonomous Driving Domain Jul 17, 2023 Autonomous Driving Knowledge Distillation
— Unverified 0DARTS: Double Attention Reference-based Transformer for Super-resolution Jul 17, 2023 Image Super-Resolution Knowledge Distillation
Code Code Available 1Cumulative Spatial Knowledge Distillation for Vision Transformers Jul 17, 2023 Inductive Bias Knowledge Distillation
Code Code Available 1DOT: A Distillation-Oriented Trainer Jul 17, 2023 Knowledge Distillation
Code Code Available 2Improving End-to-End Speech Translation by Imitation-Based Knowledge Distillation with Synthetic Transcripts Jul 17, 2023 automatic-speech-translation Imitation Learning
Code Code Available 0Cross-Lingual NER for Financial Transaction Data in Low-Resource Languages Jul 16, 2023 Cross-Lingual NER Knowledge Distillation
— Unverified 0A Survey of Techniques for Optimizing Transformer Inference Jul 16, 2023 Knowledge Distillation Neural Architecture Search
— Unverified 0MinT: Boosting Generalization in Mathematical Reasoning via Multi-View Fine-Tuning Jul 16, 2023 Knowledge Distillation Mathematical Reasoning
— Unverified 0Intuitive Access to Smartphone Settings Using Relevance Model Trained by Contrastive Learning Jul 15, 2023 Contrastive Learning Knowledge Distillation
— Unverified 0SoccerKDNet: A Knowledge Distillation Framework for Action Recognition in Soccer Videos Jul 15, 2023 Action Recognition Knowledge Distillation
— Unverified 0Learning to Retrieve In-Context Examples for Large Language Models Jul 14, 2023 In-Context Learning Knowledge Distillation
Code Code Available 1DreamTeacher: Pretraining Image Backbones with Deep Generative Models Jul 14, 2023 Knowledge Distillation Representation Learning
— Unverified 0Multimodal Distillation for Egocentric Action Recognition Jul 14, 2023 Action Recognition Knowledge Distillation
Code Code Available 1A metric learning approach for endoscopic kidney stone identification Jul 13, 2023 Few-Shot Learning Knowledge Distillation
— Unverified 0Frameless Graph Knowledge Distillation Jul 13, 2023 Graph Representation Learning Knowledge Distillation
Code Code Available 0Regression-Oriented Knowledge Distillation for Lightweight Ship Orientation Angle Prediction with Optical Remote Sensing Images Jul 13, 2023 Knowledge Distillation Prediction
Code Code Available 0The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework Jul 11, 2023 Knowledge Distillation Pseudo Label
— Unverified 0