Extreme compression of sentence-transformer ranker models: faster inference, longer battery life, and less storage on edge devices (Jun 29, 2022). Tags: Dimensionality Reduction, Knowledge Distillation.
Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing? (Jun 29, 2022). Tags: Image Classification. Code: unverified.
Knowledge Distillation of Transformer-based Language Models Revisited (Jun 29, 2022). Tags: GPU, Knowledge Distillation. Code: available.
QTI Submission to DCASE 2021: residual normalization for device-imbalanced acoustic scene classification with efficient design (Jun 28, 2022). Tags: Acoustic Scene Classification, Knowledge Distillation. Code: unverified.
Cooperative Retriever and Ranker in Deep Recommenders (Jun 28, 2022). Tags: Knowledge Distillation, Recommendation Systems. Code: unverified.
Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search (Jun 27, 2022). Tags: Bayesian Optimization, Knowledge Distillation. Code: available.
Representative Teacher Keys for Knowledge Distillation Model Compression Based on Attention Mechanism for Image Classification (Jun 26, 2022). Tags: GPU, Image Classification. Code: unverified.
Feature Representation Learning for Robust Retinal Disease Detection from Optical Coherence Tomography Images (Jun 24, 2022). Tags: Decoder, Knowledge Distillation. Code: unverified.
Mixed Sample Augmentation for Online Distillation (Jun 24, 2022). Tags: Data Augmentation, Knowledge Distillation. Code: available.
Knowledge Distillation via Weighted Ensemble of Teaching Assistants (Jun 23, 2022). Tags: Ensemble Learning, Knowledge Distillation. Code: unverified.
Conformer with dual-mode chunked attention for joint online and offline ASR (Jun 22, 2022). Tags: Knowledge Distillation. Code: unverified.
Knowledge Distillation for Oriented Object Detection on Aerial Images (Jun 20, 2022). Tags: Knowledge Distillation, Model Compression. Code: unverified.
MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare (Jun 17, 2022). Tags: Federated Learning, Knowledge Distillation. Code: unverified.
Revisiting Self-Distillation (Jun 17, 2022). Tags: Knowledge Distillation, Model Compression. Code: available.
Multi scale Feature Extraction and Fusion for Online Knowledge Distillation (Jun 16, 2022). Tags: Knowledge Distillation, Transfer Learning. Code: unverified.
FreeKD: Free-direction Knowledge Distillation for Graph Neural Networks (Jun 14, 2022). Tags: Knowledge Distillation, Reinforcement Learning. Code: unverified.
FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models (Jun 14, 2022). Tags: Cross-Lingual Transfer, Diagnostic. Code: unverified.
Toward Student-Oriented Teacher Network Training For Knowledge Distillation (Jun 14, 2022). Tags: Data Augmentation, Knowledge Distillation. Code: unverified.
Robust Distillation for Worst-class Performance (Jun 13, 2022). Tags: Knowledge Distillation. Code: unverified.
Better Teacher Better Student: Dynamic Prior Knowledge for Knowledge Distillation (Jun 13, 2022). Tags: Image Classification. Code: unverified.
Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process (Jun 13, 2022). Tags: Federated Learning, Knowledge Distillation. Code: available.
The Modality Focusing Hypothesis: Towards Understanding Crossmodal Knowledge Distillation (Jun 13, 2022). Tags: Knowledge Distillation, Transfer Learning. Code: unverified.
Reducing Capacity Gap in Knowledge Distillation with Review Mechanism for Crowd Counting (Jun 11, 2022). Tags: Computational Efficiency, Crowd Counting. Code: available.
Knowledge Distillation Decision Tree for Unravelling Black-box Machine Learning Models (Jun 9, 2022). Tags: Knowledge Distillation. Code: available.
SDQ: Stochastic Differentiable Quantization with Mixed Precision (Jun 9, 2022). Tags: Knowledge Distillation, Neural Architecture Search. Code: unverified.
Narrowing the Coordinate-frame Gap in Behavior Prediction Models: Distillation for Efficient and Accurate Scene-centric Motion Forecasting (Jun 8, 2022). Tags: Autonomous Driving, Knowledge Distillation. Code: unverified.
Reconsidering Learning Objectives in Unbiased Recommendation with Unobserved Confounders (Jun 7, 2022). Tags: Generalization Bounds, Knowledge Distillation. Code: unverified.
cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation (Jun 7, 2022). Tags: Knowledge Distillation, Question Answering. Code: unverified.
Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding (Jun 7, 2022). Tags: Graph Embedding, Knowledge Distillation. Code: available.
Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images (Jun 7, 2022). Tags: Knowledge Distillation, Self-Knowledge Distillation. Code: unverified.
Evaluation-oriented Knowledge Distillation for Deep Face Recognition (Jun 6, 2022). Tags: Face Recognition, Knowledge Distillation. Code: unverified.
Lip-Listening: Mixing Senses to Understand Lips using Cross Modality Knowledge Distillation for Word-Based Models (Jun 5, 2022). Tags: Knowledge Distillation, Lipreading. Code: unverified.
Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation (Jun 5, 2022). Tags: 3D Semantic Segmentation, Knowledge Distillation. Code: unverified.
Vanilla Feature Distillation for Improving the Accuracy-Robustness Trade-Off in Adversarial Training (Jun 5, 2022). Tags: Knowledge Distillation. Code: available.
Guided Deep Metric Learning (Jun 4, 2022). Tags: Few-Shot Learning, Knowledge Distillation. Code: unverified.
Extreme Compression for Pre-trained Transformers Made Simple and Efficient (Jun 4, 2022). Tags: Knowledge Distillation, Quantization. Code: unverified.
ZeroQuant: Efficient and Affordable Post-Training Quantization for Large-Scale Transformers (Jun 4, 2022). Tags: Knowledge Distillation, Quantization. Code: unverified.
3D-Augmented Contrastive Knowledge Distillation for Image-based Object Pose Estimation (Jun 2, 2022). Tags: Contrastive Learning, Knowledge Distillation. Code: available.
Detecting Optimism in Tweets using Knowledge Distillation and Linguistic Analysis of Optimism (Jun 1, 2022). Tags: Hate Speech Detection, Knowledge Distillation. Code: unverified.
ORC: Network Group-based Knowledge Distillation using Online Role Change (Jun 1, 2022). Tags: Knowledge Distillation. Code: unverified.
Generalized Supervised Contrastive Learning (Jun 1, 2022). Tags: Contrastive Learning, Knowledge Distillation. Code: available.
Searching for COMETINHO: The Little Metric That Could (Jun 1, 2022). Tags: Computational Efficiency, Knowledge Distillation. Code: unverified.
VFed-SSD: Towards Practical Vertical Federated Advertising (May 31, 2022). Tags: Federated Learning, Knowledge Distillation. Code: unverified.
What Knowledge Gets Distilled in Knowledge Distillation? (May 31, 2022). Tags: Knowledge Distillation. Code: unverified.
itKD: Interchange Transfer-based Knowledge Distillation for 3D Object Detection (May 31, 2022). Tags: 3D Object Detection, Cloud Detection. Code: unverified.
Knowledge Distillation for 6D Pose Estimation by Aligning Distributions of Local Predictions (May 30, 2022). Tags: 6D Pose Estimation, 6D Pose Estimation using RGB. Code: available.
RLx2: Training a Sparse Deep Reinforcement Learning Model from Scratch (May 30, 2022). Tags: Continuous Control, Deep Reinforcement Learning. Code: unverified.
Spectral Maps for Learning on Subgraphs (May 30, 2022). Tags: Graph Learning, Knowledge Distillation. Code: available.
Towards Efficient 3D Object Detection with Knowledge Distillation (May 30, 2022). Tags: 3D Object Detection, Knowledge Distillation. Code: unverified.
A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks (May 29, 2022). Tags: Data Augmentation, Image Classification. Code: available.