- HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation (Mar 18, 2024) [code available]: Knowledge Distillation, NER
- On the Transferability of Visual Features in Generalized Zero-Shot Learning (Nov 22, 2022) [code available]: Generalized Zero-Shot Learning, Knowledge Distillation
- Hybrid Attention Model Using Feature Decomposition and Knowledge Distillation for Glucose Forecasting (Nov 16, 2024) [code available]: Knowledge Distillation
- Hybrid Data-Free Knowledge Distillation (Dec 18, 2024) [code available]: Data-free Knowledge Distillation, Generative Adversarial Network
- Applying Knowledge Distillation to Improve Weed Mapping With Drones (Oct 8, 2023) [code available]: Knowledge Distillation, Management
- Chemical Transformer Compression for Accelerating Both Training and Inference of Molecular Modeling (May 16, 2022) [code available]: Knowledge Distillation, Model Compression
- Distribution Aligned Semantics Adaption for Lifelong Person Re-Identification (May 30, 2024) [code available]: Knowledge Distillation, Person Re-Identification
- Facilitating NSFW Text Detection in Open-Domain Dialogue Systems via Knowledge Distillation (Sep 18, 2023) [code available]: Chatbot, Knowledge Distillation
- Facilitating Pornographic Text Detection for Open-Domain Dialogue Systems via Knowledge Distillation of Large Language Models (Mar 20, 2024) [code available]: Chatbot, Knowledge Distillation
- Distributed Soft Actor-Critic with Multivariate Reward Representation and Knowledge Distillation (Nov 29, 2019) [code available]: Knowledge Distillation, Reinforcement Learning
- TinyBERT: Distilling BERT for Natural Language Understanding (Sep 23, 2019) [code available]: Knowledge Distillation, Language Modelling
- HTR-JAND: Handwritten Text Recognition with Joint Attention Network and Knowledge Distillation (Dec 24, 2024) [code available]: Computational Efficiency, Handwritten Text Recognition
- Human Guided Exploitation of Interpretable Attention Patterns in Summarization and Topic Segmentation (Dec 10, 2021) [code available]: Extractive Summarization, Knowledge Distillation
- Image Recognition with Online Lightweight Vision Transformer: A Survey (May 6, 2025) [code available]: Knowledge Distillation, Survey
- Invariant Debiasing Learning for Recommendation via Biased Imputation (Dec 28, 2024) [code available]: Imputation, Knowledge Distillation
- How Knowledge Distillation Mitigates the Synthetic Gap in Fair Face Recognition (Aug 30, 2024) [code available]: Face Recognition, Fairness
- HiTSR: A Hierarchical Transformer for Reference-based Super-Resolution (Aug 30, 2024) [code available]: Image Super-Resolution, Knowledge Distillation
- Holistic White-light Polyp Classification via Alignment-free Dense Distillation of Auxiliary Optical Chromoendoscopy (May 25, 2025) [code available]: Diagnostic, Knowledge Distillation
- Highlight Every Step: Knowledge Distillation via Collaborative Teaching (Jul 23, 2019) [code available]: Knowledge Distillation
- HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification (Jul 10, 2024) [code available]: Computational Efficiency, Image Classification
- Distill n' Explain: Explaining Graph Neural Networks Using Simple Surrogates (Mar 17, 2023) [code available]: Knowledge Distillation
- Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation (Jun 12, 2024) [code available]: Automatic Speech Recognition (ASR)
- Distilling Virtual Examples for Long-tailed Recognition (Mar 28, 2021) [code available]: Knowledge Distillation, Long-tail Learning
- FAKD: Feature Augmented Knowledge Distillation for Semantic Segmentation (Aug 30, 2022) [code available]: Knowledge Distillation, Segmentation
- Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data (Jul 7, 2023) [code available]: Knowledge Distillation, Model Compression
- GSB: Group Superposition Binarization for Vision Transformer with Limited Training Samples (May 13, 2023) [code available]: Binarization, Knowledge Distillation
- A Dual-Contrastive Framework for Low-Resource Cross-Lingual Named Entity Recognition (Apr 2, 2022) [code available]: Contrastive Learning, Cross-Lingual NER
- Distilling the Undistillable: Learning from a Nasty Teacher (Oct 21, 2022) [code available]: Knowledge Distillation
- Group Multi-View Transformer for 3D Shape Analysis with Spatial Encoding (Dec 27, 2023) [code available]: 3D Classification, 3D Shape Recognition
- GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning (Oct 20, 2024) [code available]: Image Retrieval, Image-text Retrieval
- Distilling the Knowledge of Romanian BERTs Using Multiple Teachers (Dec 23, 2021) [code available]: Dialect Identification, GPU
- Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation (Aug 28, 2021) [code available]: Knowledge Distillation, Retrieval
- CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing (May 24, 2022) [code available]: Data-free Knowledge Distillation, Knowledge Distillation
- An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition (Nov 16, 2021) [code available]: Cross-Lingual NER, Knowledge Distillation
- Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing (May 31, 2021) [code available]: Knowledge Distillation, Unsupervised Pre-training
- Graph Knowledge Distillation to Mixture of Experts (Jun 17, 2024) [code available]: Knowledge Distillation, Mixture-of-Experts
- Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion (Jul 23, 2022) [code available]: Data-free Knowledge Distillation, Fairness
- Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy (Aug 29, 2022) [code available]: Data-free Knowledge Distillation, Knowledge Distillation
- Distilling Stereo Networks for Performant and Efficient Leaner Networks (Mar 24, 2025) [code available]: General Knowledge, Knowledge Distillation
- Graph-based Knowledge Distillation by Multi-head Attention Network (Jul 4, 2019) [code available]: Inductive Bias, Knowledge Distillation
- Cooperative Classification and Rationalization for Graph Generalization (Mar 10, 2024) [code available]: Classification, Graph Classification
- Gradient Knowledge Distillation for Pre-trained Language Models (Nov 2, 2022) [code available]: Knowledge Distillation
- Graph Entropy Minimization for Semi-supervised Node Classification (May 31, 2023) [code available]: Classification, Knowledge Distillation
- Spending Your Winning Lottery Better After Drawing It (Jan 8, 2021) [code available]: Knowledge Distillation
- GOTHAM: Graph Class Incremental Learning Framework under Weak Supervision (Apr 7, 2025) [code available]: Attribute, Class-Incremental Learning
- Goldfish: An Efficient Federated Unlearning Framework (Apr 4, 2024) [code available]: Knowledge Distillation, Machine Unlearning
- Answering Diverse Questions via Text Attached with Key Audio-Visual Clues (Mar 11, 2024) [code available]: Audio-Visual Question Answering (AVQA)
- GNN's Uncertainty Quantification using Self-Distillation (Jun 24, 2025) [code available]: Knowledge Distillation, Uncertainty Quantification
- Distilling Object Detectors With Global Knowledge (Oct 17, 2022) [code available]: Knowledge Distillation, Object
- Goal-Conditioned Q-Learning as Knowledge Distillation (Aug 28, 2022) [code available]: Knowledge Distillation, Q-Learning