DearKD: Data-Efficient Early Knowledge Distillation for Vision Transformers Apr 27, 2022 Knowledge Distillation
GripRank: Bridging the Gap between Retrieval and Generation via the Generative Knowledge Improved Passage Ranking May 29, 2023 Answer Generation Dialogue Generation
Dealing with training and test segmentation mismatch: FBK@IWSLT2021 Jun 23, 2021 Action Detection Activity Detection
Industry Scale Semi-Supervised Learning for Natural Language Understanding Mar 29, 2021 Intent Classification
Bi-CryptoNets: Leveraging Different-Level Privacy for Encrypted Inference Feb 2, 2024 Knowledge Distillation Privacy Preserving
InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation Jun 25, 2024 Knowledge Distillation
AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange Jun 10, 2021 Classification Graph Classification
Graph Representation Learning via Multi-task Knowledge Distillation Nov 11, 2019 Graph Representation Learning Knowledge Distillation
Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation Apr 13, 2021 Knowledge Distillation Triplet
DDK: Distilling Domain Knowledge for Efficient Large Language Models Jul 23, 2024 Knowledge Distillation
DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images May 14, 2025 Diagnostic Knowledge Distillation
Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer Oct 9, 2020 Decoder Image Classification
ALP-KD: Attention-Based Layer Projection for Knowledge Distillation Dec 27, 2020 Knowledge Distillation
Initial Classifier Weights Replay for Memoryless Class Incremental Learning Aug 31, 2020 Class Incremental Learning
Adaptive Label Smoothing with Self-Knowledge in Natural Language Generation Oct 22, 2022 Knowledge Distillation Text Generation
DC-CCL: Device-Cloud Collaborative Controlled Learning for Large Vision Models Mar 18, 2023 Knowledge Distillation
Graph-Based Cross-Domain Knowledge Distillation for Cross-Dataset Text-to-Image Person Retrieval Jan 25, 2025 Domain Adaptation Knowledge Distillation
Injecting Spatial Information for Monaural Speech Enhancement via Knowledge Distillation Dec 2, 2022 Knowledge Distillation Speech Enhancement
Inplace knowledge distillation with teacher assistant for improved training of flexible deep neural networks May 18, 2021 Image Classification
In-situ animal behavior classification using knowledge distillation and fixed-point quantization Sep 9, 2022 Classification Knowledge Distillation
Graph-Adaptive Pruning for Efficient Inference of Convolutional Neural Networks Nov 21, 2018 Knowledge Distillation Model Compression
Granite Embedding Models Feb 27, 2025 Information Retrieval Knowledge Distillation
Beyond the Tip of Efficiency: Uncovering the Submerged Threats of Jailbreak Attacks in Small Language Models Feb 27, 2025 Knowledge Distillation Model Compression
Gradient Reweighting: Towards Imbalanced Class-Incremental Learning Feb 28, 2024 Class Incremental Learning
Gradient-Guided Knowledge Distillation for Object Detectors Mar 7, 2023 Knowledge Distillation Object
In Teacher We Trust: Learning Compressed Models for Pedestrian Detection Dec 1, 2016 Knowledge Distillation Pedestrian Detection
Data Techniques For Online End-to-end Speech Recognition Jan 24, 2020 Data Augmentation Domain Adaptation
Integrating Arithmetic Learning Improves Mathematical Reasoning in Smaller Models Feb 18, 2025 Data Augmentation GSM8K
Gradient Adversarial Training of Neural Networks Jun 21, 2018 BIG-bench Machine Learning Binary Classification
Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding Apr 15, 2021 Intent Classification
GOVERN: Gradient Orientation Vote Ensemble for Multi-Teacher Reinforced Distillation May 6, 2024 Knowledge Distillation Question Answering
Mining Data Impressions from Deep Models as Substitute for the Unavailable Training Data Jan 15, 2021 Adversarial Robustness Continual Learning
Beyond Task Vectors: Selective Task Arithmetic Based on Importance Metrics Nov 25, 2024 Knowledge Distillation Multi-Task Learning
Adaptive Label Smoothing with Self-Knowledge Sep 29, 2021 Knowledge Distillation Machine Translation
Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition Nov 28, 2022 Automatic Speech Recognition (ASR)
Intermediate Distillation: Data-Efficient Distillation from Black-Box LLMs for Information Retrieval Jun 18, 2024 Information Retrieval Knowledge Distillation
Interpretable discovery of new semiconductors with machine learning Jan 12, 2021 BIG-bench Machine Learning Knowledge Distillation
A Closer Look at Deep Learning Heuristics: Learning rate restarts, Warmup and Distillation Oct 29, 2018 Dimensionality Reduction Knowledge Distillation
Interpretable Foreground Object Search As Knowledge Distillation Jul 20, 2020 Knowledge Distillation Object
Interpretable Traces, Unexpected Outcomes: Investigating the Disconnect in Trace-Based Knowledge Distillation May 20, 2025 Information Retrieval Knowledge Distillation
Data-Free Knowledge Transfer: A Survey Dec 31, 2021 Data-free Knowledge Distillation Domain Adaptation
GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation Mar 28, 2024 Data-free Knowledge Distillation Knowledge Distillation
Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis Apr 10, 2021 Data-free Knowledge Distillation Knowledge Distillation
Interruption-Aware Cooperative Perception for V2X Communication-Aided Autonomous Driving Apr 24, 2023 Autonomous Driving Autonomous Vehicles
Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images Oct 20, 2023 Data Augmentation Data-free Knowledge Distillation
Local-Global Knowledge Distillation in Heterogeneous Federated Learning with Non-IID Data Jun 30, 2021 Federated Learning Knowledge Distillation
Global Intervention and Distillation for Federated Out-of-Distribution Generalization Apr 1, 2025 Attribute Data Augmentation
Beyond Classification: Knowledge Distillation using Multi-Object Impressions Oct 27, 2021 Classification Knowledge Distillation
All You Need in Knowledge Distillation Is a Tailored Coordinate System Dec 12, 2024 Few-Shot Learning
Adaptive Knowledge Distillation for Classification of Hand Images using Explainable Vision Transformers Aug 20, 2024 Knowledge Distillation