- Leveraging Expert Models for Training Deep Neural Networks in Scarce Data Domains: Application to Offline Handwritten Signature Verification (Aug 2, 2023). Knowledge Distillation.
- A vision transformer-based framework for knowledge transfer from multi-modal to mono-modal lymphoma subtyping models (Aug 2, 2023). Knowledge Distillation, Transfer Learning.
- Three Factors to Improve Out-of-Distribution Detection (Aug 2, 2023). Contrastive Learning, Knowledge Distillation.
- Spatio-Temporal Branching for Motion Prediction using Motion Increments (Aug 2, 2023). Human Motion Prediction, Knowledge Distillation.
- Towards Better Query Classification with Multi-Expert Knowledge Condensation in JD Ads Search (Aug 2, 2023). Knowledge Distillation. [Code]
- Ada-DQA: Adaptive Diverse Quality-aware Feature Acquisition for Video Quality Assessment (Aug 1, 2023). Diversity, Knowledge Distillation.
- Subspace Distillation for Continual Learning (Jul 31, 2023). Continual Learning, Knowledge Distillation.
- Sampling to Distill: Knowledge Transfer from Open-World Data (Jul 31, 2023). Data-free Knowledge Distillation, Knowledge Distillation. [Code]
- Federated Learning for Data and Model Heterogeneity in Medical Imaging (Jul 31, 2023). Federated Learning, Knowledge Distillation.
- Can Self-Supervised Representation Learning Methods Withstand Distribution Shifts and Corruptions? (Jul 31, 2023). Contrastive Learning, Knowledge Distillation.
- UPFL: Unsupervised Personalized Federated Learning towards New Clients (Jul 29, 2023). Federated Learning, Knowledge Distillation. [Code]
- Incrementally-Computable Neural Networks: Efficient Inference for Dynamic Inputs (Jul 27, 2023). Document Classification, Knowledge Distillation. [Code]
- Mitigating Cross-client GANs-based Attack in Federated Learning (Jul 25, 2023). Data-free Knowledge Distillation, Federated Learning.
- A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation (Jul 24, 2023). Knowledge Distillation, Semantic Segmentation.
- HeteFedRec: Federated Recommender Systems with Model Heterogeneity (Jul 24, 2023). Knowledge Distillation, Model.
- Distribution Shift Matters for Knowledge Distillation with Webly Collected Images (Jul 21, 2023). Contrastive Learning, Data-free Knowledge Distillation.
- Model Compression Methods for YOLOv5: A Review (Jul 21, 2023). Knowledge Distillation, Model.
- Quantized Feature Distillation for Network Quantization (Jul 20, 2023). Image Classification.
- Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering (Jul 20, 2023). Clustering, Data Augmentation.
- LightPath: Lightweight and Scalable Path Representation Learning (Jul 19, 2023). Knowledge Distillation, Relational Reasoning. [Code]
- Teach model to answer questions after comprehending the document (Jul 18, 2023). Knowledge Distillation, Machine Reading Comprehension. [Code]
- Knowledge Distillation for Object Detection: from generic to remote sensing datasets (Jul 18, 2023). Knowledge Distillation, Model Compression.
- Improving End-to-End Speech Translation by Imitation-Based Knowledge Distillation with Synthetic Transcripts (Jul 17, 2023). Automatic Speech Translation, Imitation Learning.
- Domain Knowledge Distillation from Large Language Model: An Empirical Study in the Autonomous Driving Domain (Jul 17, 2023). Autonomous Driving, Knowledge Distillation. [Code]
- Cross-Lingual NER for Financial Transaction Data in Low-Resource Languages (Jul 16, 2023). Cross-Lingual NER, Knowledge Distillation.
- MinT: Boosting Generalization in Mathematical Reasoning via Multi-View Fine-Tuning (Jul 16, 2023). Knowledge Distillation, Mathematical Reasoning.
- A Survey of Techniques for Optimizing Transformer Inference (Jul 16, 2023). Knowledge Distillation, Neural Architecture Search.
- Intuitive Access to Smartphone Settings Using Relevance Model Trained by Contrastive Learning (Jul 15, 2023). Contrastive Learning, Knowledge Distillation.
- SoccerKDNet: A Knowledge Distillation Framework for Action Recognition in Soccer Videos (Jul 15, 2023). Action Recognition, Knowledge Distillation.
- DreamTeacher: Pretraining Image Backbones with Deep Generative Models (Jul 14, 2023). Knowledge Distillation, Representation Learning.
- Regression-Oriented Knowledge Distillation for Lightweight Ship Orientation Angle Prediction with Optical Remote Sensing Images (Jul 13, 2023). Knowledge Distillation, Prediction.
- Frameless Graph Knowledge Distillation (Jul 13, 2023). Graph Representation Learning, Knowledge Distillation. [Code]
- A metric learning approach for endoscopic kidney stone identification (Jul 13, 2023). Few-Shot Learning, Knowledge Distillation. [Code]
- The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework (Jul 11, 2023). Knowledge Distillation, Pseudo Label.
- Customizing Synthetic Data for Data-Free Student Learning (Jul 10, 2023). Data-free Knowledge Distillation, Knowledge Distillation.
- Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data (Jul 7, 2023). Knowledge Distillation, Model Compression. [Code]
- On-Device Constrained Self-Supervised Speech Representation Learning for Keyword Spotting via Knowledge Distillation (Jul 6, 2023). Keyword Spotting, Knowledge Distillation. [Code]
- Contextual Affinity Distillation for Image Anomaly Detection (Jul 6, 2023). Anomaly Detection, Knowledge Distillation.
- Distilling Missing Modality Knowledge from Ultrasound for Endometriosis Diagnosis with Magnetic Resonance Images (Jul 5, 2023). Knowledge Distillation.
- KDSTM: Neural Semi-supervised Topic Modeling with Knowledge Distillation (Jul 4, 2023). Classification, Knowledge Distillation.
- Review helps learn better: Temporal Supervised Knowledge Distillation (Jul 3, 2023). Image Classification.
- Shared Growth of Graph Neural Networks via Prompted Free-direction Knowledge Distillation (Jul 2, 2023). Knowledge Distillation, Prompt Learning.
- Long-Tailed Continual Learning For Visual Food Recognition (Jul 1, 2023). Continual Learning, Data Augmentation.
- Streaming egocentric action anticipation: An evaluation scheme and approach (Jun 29, 2023). Action Anticipation, Knowledge Distillation.
- Understanding the Overfitting of the Episodic Meta-training (Jun 29, 2023). Knowledge Distillation.
- A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning (Jun 28, 2023). Knowledge Distillation.
- Exploring Dual Model Knowledge Distillation for Anomaly Detection (Jun 27, 2023). Anomaly Detection, Feature Selection.
- Shoggoth: Towards Efficient Edge-Cloud Collaborative Real-Time Video Inference via Adaptive Online Learning (Jun 27, 2023). Knowledge Distillation.
- Reducing the gap between streaming and non-streaming Transducer-based ASR by adaptive two-stage knowledge distillation (Jun 27, 2023). Knowledge Distillation, Speech Recognition.
- Accelerating Molecular Graph Neural Networks via Knowledge Distillation (Jun 26, 2023). Data Augmentation, Knowledge Distillation.