Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning (Jan 19, 2024) [GPU, Knowledge Distillation]
Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition (Aug 1, 2020) [Diversity, Face Recognition]
Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation (Sep 30, 2024) [Data Augmentation, Knowledge Distillation]
Enhancing Review Comprehension with Domain-Specific Commonsense (Apr 6, 2020) [Aspect Extraction, Knowledge Distillation]
Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits (Feb 3, 2023) [Knowledge Distillation]
Expediting Contrastive Language-Image Pretraining via Self-distilled Encoders (Dec 19, 2023) [Knowledge Distillation]
Experimentation in Content Moderation using RWKV (Sep 5, 2024) [CPU, Knowledge Distillation]
Experimenting with Knowledge Distillation techniques for performing Brain Tumor Segmentation (May 24, 2021) [Brain Tumor Segmentation, Knowledge Distillation]
Explainability-Driven Leaf Disease Classification Using Adversarial Training and Knowledge Distillation (Dec 30, 2023) [Adversarial Attack, Classification]
Explainable Knowledge Distillation for On-device Chest X-Ray Classification (May 10, 2023) [Explainable Artificial Intelligence (XAI)]
Explainable LLM-driven Multi-dimensional Distillation for E-Commerce Relevance Learning (Nov 20, 2024) [Knowledge Distillation, Large Language Model]
Explaining Knowledge Distillation by Quantifying the Knowledge (Mar 7, 2020) [Knowledge Distillation]
ConaCLIP: Exploring Distillation of Fully-Connected Knowledge Interaction Graph for Lightweight Text-Image Retrieval (May 28, 2023) [Image Retrieval, Knowledge Distillation]
Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation (Dec 6, 2019) [Data Augmentation, Knowledge Distillation]
A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks (May 29, 2022) [Data Augmentation, Image Classification]
Explicit Connection Distillation (Jan 1, 2021) [Image Classification]
A Transformer-in-Transformer Network Utilizing Knowledge Distillation for Image Recognition (Feb 24, 2025) [Image Classification]
Explicit Knowledge Transfer for Weakly-Supervised Code Generation (Nov 30, 2022) [Code Generation, Few-Shot Learning]
FlyKD: Graph Knowledge Distillation on the Fly with Curriculum Learning (Mar 16, 2024) [Knowledge Distillation]
Exploiting Unlabelled Photos for Stronger Fine-Grained SBIR (Mar 24, 2023) [Image Retrieval, Knowledge Distillation]
Enhancing Modality-Agnostic Representations via Meta-Learning for Brain Tumor Segmentation (Feb 8, 2023) [Brain Tumor Segmentation, Image Generation]
Enhancing Mapless Trajectory Prediction through Knowledge Distillation (Jun 25, 2023) [Autonomous Driving, Knowledge Distillation]
Exploring compressibility of transformer based text-to-music (TTM) models (Jun 24, 2024) [Decoder, FAD]
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch (May 21, 2024) [Knowledge Distillation]
Compression of end-to-end non-autoregressive image-to-speech system for low-resourced devices (Nov 30, 2023) [Knowledge Distillation]
Compression of Deep Learning Models for Text: A Survey (Aug 12, 2020) [Deep Learning, Information Retrieval]
Generalized Supervised Contrastive Learning (Jun 1, 2022) [Contrastive Learning, Knowledge Distillation]
Exploring Extreme Quantization in Spiking Language Models (May 4, 2024) [Knowledge Distillation, Language Modeling]
Compression of Acoustic Event Detection Models With Quantized Distillation (Jul 1, 2019) [Event Detection, Knowledge Distillation]
Continual Learning for Class- and Domain-Incremental Semantic Segmentation (Sep 16, 2022) [Class-Incremental Learning]
FLAR: A Unified Prototype Framework for Few-Sample Lifelong Active Recognition (Jan 1, 2021) [Knowledge Distillation, Lifelong Learning]
For the Misgendered Chinese in Gender Bias Research: Multi-Task Learning with Knowledge Distillation for Pinyin Name-Gender Prediction (May 10, 2024) [Gender Prediction, Knowledge Distillation]
Compressing Visual-linguistic Model via Knowledge Distillation (Apr 5, 2021) [Image Captioning, Knowledge Distillation]
Fully Synthetic Data Improves Neural Machine Translation with Knowledge Distillation (Dec 31, 2020) [Knowledge Distillation, Machine Translation]
Enhancing Generalization in Chain of Thought Reasoning for Smaller Models (Jan 16, 2025) [Knowledge Distillation, Memorization]
A Theoretical Analysis of Soft-Label vs Hard-Label Training in Neural Networks (Dec 12, 2024) [Binary Classification, Knowledge Distillation]
Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models (Jun 21, 2025) [Dimensionality Reduction, Keyword Spotting]
Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation (Feb 23, 2021) [Knowledge Distillation]
Exploring Self- and Cross-Triplet Correlations for Human-Object Interaction Detection (Jan 11, 2024) [Human-Object Interaction Detection, Knowledge Distillation]
Unsupervised Continual Learning Via Pseudo Labels (Apr 14, 2021) [Clustering, Continual Learning]
Continual Learning with Diffusion-based Generative Replay for Industrial Streaming Data (Jun 22, 2024) [Continual Learning, Knowledge Distillation]
A Note on Knowledge Distillation Loss Function for Object Classification (Sep 14, 2021) [Knowledge Distillation, Model Compression]
Continual Learning with Dirichlet Generative-based Rehearsal (Sep 13, 2023) [Continual Learning, Incremental Learning]
Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT (Jul 1, 2020) [Document Classification, General Classification]
Extending Label Smoothing Regularization with Self-Knowledge Distillation (Sep 11, 2020) [Knowledge Distillation, Self-Knowledge Distillation]
Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation (Jan 22, 2025) [Knowledge Distillation]
Extracting knowledge from features with multilevel abstraction (Dec 4, 2021) [Data Augmentation, Knowledge Distillation]
Compressing VAE-Based Out-of-Distribution Detectors for Embedded Deployment (Sep 2, 2024) [CPU, GPU]
Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation (Apr 24, 2021) [Knowledge Distillation]
Enhancing CTC-Based Visual Speech Recognition (Sep 11, 2024) [Automatic Speech Recognition (ASR)]