FEED: Feature-level Ensemble for Knowledge Distillation Sep 24, 2019 Knowledge Distillation
Few-shot 3D LiDAR Semantic Segmentation for Autonomous Driving Feb 17, 2023 Autonomous Driving Few-Shot Learning
Few-shot Face Image Translation via GAN Prior Distillation Jan 28, 2023 Knowledge Distillation Translation
Few-shot learning of neural networks from scratch by pseudo example optimization Feb 8, 2018 Few-Shot Learning Knowledge Distillation
Few-Shot Object Detection by Knowledge Distillation Using Bag-of-Visual-Words Representations Jul 25, 2022 Few-Shot Object Detection Knowledge Distillation
Optimizing Vision Transformers with Data-Free Knowledge Transfer Aug 12, 2024 Knowledge Distillation object-detection
Optimizing YOLOv5s Object Detection through Knowledge Distillation algorithm Oct 16, 2024 Knowledge Distillation Object
Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models Nov 5, 2021 Knowledge Distillation Machine Translation
Orderly Dual-Teacher Knowledge Distillation for Lightweight Human Pose Estimation Apr 21, 2021 Binarization Knowledge Distillation
ORQA: A Benchmark and Foundation Model for Holistic Operating Room Modeling May 19, 2025 Graph Generation Knowledge Distillation
Overcoming Language Priors for Visual Question Answering Based on Knowledge Distillation Jan 10, 2025 Knowledge Distillation Question Answering
P4: Towards private, personalized, and Peer-to-Peer learning May 27, 2024 Knowledge Distillation
Pacemaker: Intermediate Teacher Knowledge Distillation For On-The-Fly Convolutional Neural Network Mar 9, 2020 Knowledge Distillation Model Compression
PAIR: Leveraging Passage-Centric Similarity Relation for Improving Dense Passage Retrieval Aug 13, 2021 Knowledge Distillation Natural Questions
Pan-infection Foundation Framework Enables Multiple Pathogen Prediction Dec 31, 2024 Diagnostic Knowledge Distillation
PANLP at MEDIQA 2019: Pre-trained Language Models, Transfer Learning and Knowledge Distillation Aug 1, 2019 Knowledge Distillation Re-Ranking
Papago’s Submission for the WMT21 Quality Estimation Shared Task Nov 1, 2021 Knowledge Distillation Multi-Task Learning
Paralinguistic Privacy Protection at the Edge Nov 4, 2020 CPU Knowledge Distillation
Parameter-Efficient and Student-Friendly Knowledge Distillation May 28, 2022 Knowledge Distillation Transfer Learning
Parameter-Efficient Conformers via Sharing Sparsely-Gated Experts for End-to-End Speech Recognition Sep 17, 2022 Knowledge Distillation Mixture-of-Experts
Parameter Efficient Diverse Paraphrase Generation Using Sequence-Level Knowledge Distillation Apr 19, 2024 Diversity Knowledge Distillation
Partial Knowledge Distillation for Alleviating the Inherent Inter-Class Discrepancy in Federated Learning Nov 23, 2024 Federated Learning Knowledge Distillation
Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better Sep 26, 2021 Knowledge Distillation
PC-LoRA: Low-Rank Adaptation for Progressive Model Compression with Knowledge Distillation Jun 13, 2024 Knowledge Distillation Model Compression
PDALN: Progressive Domain Adaptation over a Pre-trained Model for Low-Resource Cross-Domain Named Entity Recognition Nov 1, 2021 Cross-Domain Named Entity Recognition Data Augmentation
Peak-Controlled Logits Poisoning Attack in Federated Distillation Jul 25, 2024 Knowledge Distillation Transfer Learning
Pea-KD: Parameter-efficient and Accurate Knowledge Distillation on BERT Sep 30, 2020 Knowledge Distillation Model Compression
Pea-KD: Parameter-efficient and accurate Knowledge Distillation Sep 28, 2020 Knowledge Distillation Model Compression
Peak-First CTC: Reducing the Peak Latency of CTC Models by Applying Peak-First Regularization Nov 7, 2022 Knowledge Distillation
Peer Collaborative Learning for Polyphonic Sound Event Detection Oct 7, 2021 Event Detection Knowledge Distillation
Learning to Maximize Speech Quality Directly Using MOS Prediction for Neural Text-to-Speech Nov 2, 2020 Knowledge Distillation Speech Synthesis
Performance-Aware Mutual Knowledge Distillation for Improving Neural Architecture Search Jan 1, 2022 Knowledge Distillation Neural Architecture Search
Performance-Efficiency Trade-Offs in Adapting Language Models to Text Classification Tasks Oct 21, 2022 Knowledge Distillation text-classification
Performance-Guided LLM Knowledge Distillation for Efficient Text Classification at Scale Nov 7, 2024 Active Learning Benchmarking
Periocular Embedding Learning with Consistent Knowledge Distillation from Face Dec 12, 2020 Knowledge Distillation Prediction
Personalised Federated Learning: A Combinational Approach Aug 22, 2021 Federated Learning Knowledge Distillation
Personalized Decentralized Federated Learning with Knowledge Distillation Feb 23, 2023 Federated Learning Knowledge Distillation
PGX: A Multi-level GNN Explanation Framework Based on Separate Knowledge Distillation Processes Aug 5, 2022 Knowledge Distillation Representation Learning
PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation Oct 2, 2024 Knowledge Distillation
PicoSAM2: Low-Latency Segmentation In-Sensor for Edge Vision Applications Jun 23, 2025 Knowledge Distillation Privacy Preserving
PILE: Pairwise Iterative Logits Ensemble for Multi-Teacher Labeled Distillation Nov 11, 2022 Knowledge Distillation
PIRB: A Comprehensive Benchmark of Polish Dense and Hybrid Text Retrieval Methods Feb 20, 2024 Information Retrieval Knowledge Distillation
PISCO: Pretty Simple Compression for Retrieval-Augmented Generation Jan 27, 2025 GPU Knowledge Distillation
Pixel Invisibility: Detecting Objects Invisible in Color Images Jun 15, 2020 Knowledge Distillation object-detection
P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection Jul 14, 2020 Anomaly Detection Decoder
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient Jul 5, 2022 Knowledge Distillation object-detection
PLaD: Preference-based Large Language Model Distillation with Pseudo-Preference Pairs Jun 5, 2024 Knowledge Distillation Language Modeling
PlaStIL: Plastic and Stable Memory-Free Class-Incremental Learning Sep 14, 2022 Class-Incremental Learning
Plug-and-Play Interpretable Responsible Text-to-Image Generation via Dual-Space Multi-facet Concept Control Mar 24, 2025 Image Generation Knowledge Distillation
Point Adversarial Self Mining: A Simple Method for Facial Expression Recognition Aug 26, 2020 Adversarial Attack Data Augmentation