- Population-Based Evolutionary Gaming for Unsupervised Person Re-identification (Jun 8, 2023) [Diversity, Knowledge Distillation]
- Regularized Evolutionary Population-Based Training (Feb 11, 2020) [Diversity, image-classification]
- Pose-Guided Feature Learning with Knowledge Distillation for Occluded Person Re-Identification (Jul 31, 2021) [Knowledge Distillation, Occluded Person Re-Identification]
- Pose Uncertainty Aware Movement Synchrony Estimation via Spatial-Temporal Graph Transformer (Aug 1, 2022) [Activity Recognition, Contrastive Learning]
- Positive-Unlabeled Data Purification in the Wild for Object Detection (Jun 19, 2021) [Knowledge Distillation, object-detection]
- Poster: Self-Supervised Quantization-Aware Knowledge Distillation (Sep 22, 2023) [Knowledge Distillation, Quantization]
- PPC-GPT: Federated Task-Specific Compression of Large Language Models via Pruning and Chain-of-Thought Distillation (Feb 21, 2025) [Knowledge Distillation, Privacy Preserving]
- PP-StructureV2: A Stronger Document Analysis System (Oct 11, 2022) [Key Information Extraction, Knowledge Distillation]
- PQDAST: Depth-Aware Arbitrary Style Transfer for Games via Perceptual Quality-Guided Distillation (Feb 24, 2025) [Knowledge Distillation, Style Transfer]
- PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation (Jun 25, 2021) [Keyword Spotting, Knowledge Distillation]
- Practical Insights into Knowledge Distillation for Pre-Trained Models (Feb 22, 2024) [Federated Learning, Knowledge Distillation]
- Practical Knowledge Distillation: Using DNNs to Beat DNNs (Feb 23, 2023) [Denoising, Knowledge Distillation]
- PRAL: A Tailored Pre-Training Model for Task-Oriented Dialog Generation (Aug 1, 2021) [Knowledge Distillation, Language Modeling]
- Predicting Multi-Codebook Vector Quantization Indexes for Knowledge Distillation (Oct 31, 2022) [Automatic Speech Recognition (ASR)]
- Prepending or Cross-Attention for Speech-to-Text? An Empirical Comparison (Jan 4, 2025) [Decoder, Knowledge Distillation]
- Preserving Node Distinctness in Graph Autoencoders via Similarity Distillation (Jun 25, 2024) [Decoder, Knowledge Distillation]
- Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation (Sep 10, 2022) [Federated Learning, image-classification]
- Pre-trained Language Model and Knowledge Distillation for Lightweight Sequential Recommendation (Sep 23, 2024) [Knowledge Distillation, Language Modeling]
- Pre-trained Model Guided Mixture Knowledge Distillation for Adversarial Federated Learning (Jan 25, 2025) [Adversarial Robustness, Federated Learning]
- Pre-trained Model Representations and their Robustness against Noise for Speech Emotion Analysis (Mar 3, 2023) [Emotion Recognition, Knowledge Distillation]
- Pre-Trained Vision-Language Models as Partial Annotators (May 23, 2024) [Contrastive Learning, image-classification]
- Pre-training Distillation for Large Language Models: A Design Space Exploration (Oct 21, 2024) [Knowledge Distillation]
- Pre-Training Graph Contrastive Masked Autoencoders are Strong Distillers for EEG (Nov 28, 2024) [EEG, Knowledge Distillation]
- Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data (Aug 11, 2021) [Knowledge Distillation, Model Compression]
- Preventing Distillation-based Attacks on Neural Network IP (Apr 1, 2022) [Knowledge Distillation]
- Preview-based Category Contrastive Learning for Knowledge Distillation (Oct 18, 2024) [Contrastive Learning, Knowledge Distillation]
- Prime-Aware Adaptive Distillation (Aug 4, 2020) [Knowledge Distillation, Metric Learning]
- Prior knowledge distillation based on financial time series (Jun 16, 2020) [Knowledge Distillation, Time Series]
- Prior Knowledge Distillation Network for Face Super-Resolution (Sep 22, 2024) [Knowledge Distillation, Super-Resolution]
- Prior Knowledge Guided Network for Video Anomaly Detection (Sep 4, 2023) [Anomaly Detection, Knowledge Distillation]
- Privacy Distillation: Reducing Re-identification Risk of Multimodal Diffusion Models (Jun 2, 2023) [Knowledge Distillation]
- Privacy-Preserving Federated Learning with Consistency via Knowledge Distillation Using Conditional Generator (Sep 11, 2024) [Diversity, Federated Learning]
- Privacy-preserving Fine-tuning of Large Language Models through Flatness (Mar 7, 2024) [Knowledge Distillation, Privacy Preserving]
- Private Deep Learning with Teacher Ensembles (Jun 5, 2019) [Deep Learning, Ensemble Learning]
- Private Model Compression via Knowledge Distillation (Nov 13, 2018) [Knowledge Distillation, model]
- Privileged Knowledge Distillation for Online Action Detection (Nov 18, 2020) [Action Detection, Knowledge Distillation]
- Proactive Detection and Calibration of Seasonal Advertisements with Multimodal Large Language Models (Oct 16, 2024) [Knowledge Distillation]
- Proactive Guidance of Multi-Turn Conversation in Industrial Search (May 30, 2025) [Knowledge Distillation, reinforcement-learning]
- Proactive Sequence Generator via Knowledge Acquisition (Sep 25, 2019) [de-en, Knowledge Distillation]
- Probabilistic Integration of Object Level Annotations in Chest X-ray Classification (Oct 13, 2022) [Knowledge Distillation, Variational Inference]
- Probabilistic Knowledge Distillation of Face Ensembles (Jan 1, 2023) [Face Image Quality, Face Recognition]
- Probabilistic Self-supervised Learning via Scoring Rules Minimization (Sep 5, 2023) [Knowledge Distillation, Out-of-Distribution Detection]
- PROD: Progressive Distillation for Dense Retrieval (Sep 27, 2022) [Knowledge Distillation, Natural Questions]
- ProFe: Communication-Efficient Decentralized Federated Learning via Distillation and Prototypes (Dec 15, 2024) [Federated Learning, Knowledge Distillation]
- Progressive Class-level Distillation (May 30, 2025) [Benchmarking, Knowledge Distillation]
- Progressive Collaborative and Semantic Knowledge Fusion for Generative Recommendation (Feb 10, 2025) [Knowledge Distillation]
- Progressive Cross-modal Knowledge Distillation for Human Action Recognition (Aug 17, 2022) [Action Recognition, Knowledge Distillation]
- Progressive distillation induces an implicit curriculum (Oct 7, 2024) [Knowledge Distillation]
- Progressive Label Distillation: Learning Input-Efficient Deep Neural Networks (Jan 26, 2019) [Knowledge Distillation, speech-recognition]
- ProKD: An Unsupervised Prototypical Knowledge Distillation Network for Zero-Resource Cross-Lingual Named Entity Recognition (Jan 21, 2023) [Contrastive Learning, Cross-Lingual NER]