Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher (Oct 16, 2021) · Image Classification
Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation (Apr 25, 2024) · Decoder, Depth Estimation
PromptDet: A Lightweight 3D Object Detection Framework with LiDAR Prompts (Dec 17, 2024) · 3D Object Detection, Depth Estimation
Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt (May 16, 2022) · Data-free Knowledge Distillation, Knowledge Distillation
Propagate & Distill: Towards Effective Graph Learners Using Propagation-Embracing MLPs (Nov 29, 2023) · Graph Neural Network, Knowledge Distillation
Prototypical Contrastive Predictive Coding (Sep 29, 2021) · Contrastive Learning, Knowledge Distillation
ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition (Oct 31, 2020) · Face Recognition, Knowledge Distillation
Pseudo Knowledge Distillation: Towards Learning Optimal Instance-specific Label Smoothing Regularization (Sep 29, 2021) · Image Classification
Pseudo-label Correction for Instance-dependent Noise Using Teacher-student Framework (Nov 24, 2023) · Knowledge Distillation, Pseudo Label
Pseudo-Label Training and Model Inertia in Neural Machine Translation (May 19, 2023) · Knowledge Distillation, Machine Translation
Pseudo Supervised Monocular Depth Estimation with Teacher-Student Network (Oct 22, 2021) · Depth Estimation, Knowledge Distillation
PTMs-TSCIL Pre-Trained Models Based Class-Incremental Learning (Mar 10, 2025) · Class-Incremental Learning
PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation (Feb 26, 2021) · Clustering, Knowledge Distillation
In-Distribution Consistency Regularization Improves the Generalization of Quantization-Aware Training (Feb 21, 2024) · Knowledge Distillation, Quantization
Puzzle: Distillation-Based NAS for Inference-Optimized LLMs (Nov 28, 2024) · GPU, Knowledge Distillation
QABISAR: Query-Article Bipartite Interactions for Statutory Article Retrieval (Dec 1, 2024) · Articles, Knowledge Distillation
QA-HFL: Quality-Aware Hierarchical Federated Learning for Resource-Constrained Mobile Devices with Heterogeneous Image Quality (Jun 4, 2025) · Federated Learning, Knowledge Distillation
QCRD: Quality-guided Contrastive Rationale Distillation for Large Language Models (May 14, 2024) · Contrastive Learning, Denoising
QKD: Quantization-aware Knowledge Distillation (Nov 28, 2019) · Knowledge Distillation, Quantization
QTI Submission to DCASE 2021: residual normalization for device-imbalanced acoustic scene classification with efficient design (Jun 28, 2022) · Acoustic Scene Classification, Knowledge Distillation
QuaLA-MiniLM: a Quantized Length Adaptive MiniLM (Oct 31, 2022) · Computational Efficiency, Knowledge Distillation
Quantifying Knowledge Distillation Using Partial Information Decomposition (Nov 12, 2024) · Knowledge Distillation, Transfer Learning
Quantifying the Knowledge in a DNN to Explain Knowledge Distillation for Classification (Aug 18, 2022) · 3D Point Cloud Classification, Classification
Quantized Feature Distillation for Network Quantization (Jul 20, 2023) · Image Classification
Query-Based Knowledge Sharing for Open-Vocabulary Multi-Label Classification (Jan 2, 2024) · Knowledge Distillation, Multi-Label Classification
Query Distillation: BERT-based Distillation for Ensemble Ranking (Dec 1, 2020) · Knowledge Distillation
Query Optimization for Parametric Knowledge Refinement in Retrieval-Augmented Large Language Models (Nov 12, 2024) · Knowledge Distillation, Question Answering
Quick Dense Retrievers Consume KALE: Post Training Kullback Leibler Alignment of Embeddings for Asymmetrical dual encoders (Mar 31, 2023) · Knowledge Distillation, Language Modeling
QUILL: Query Intent with Large Language Models using Retrieval Augmentation and Multi-stage Distillation (Oct 27, 2022) · Feature Engineering, Knowledge Distillation
QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning (Jul 29, 2021) · Federated Learning, Knowledge Distillation
Radio2Text: Streaming Speech Recognition Using mmWave Radio Signals (Aug 16, 2023) · Automatic Speech Recognition (ASR)
RadOcc: Learning Cross-Modality Occupancy Knowledge through Rendering Assisted Distillation (Dec 19, 2023) · Knowledge Distillation, Prediction
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation (Sep 21, 2021) · Knowledge Distillation
Random Conditioning for Diffusion Model Compression with Distillation (Jan 1, 2025) · Denoising, Knowledge Distillation
Random Conditioning with Distillation for Data-Efficient Diffusion Model Compression (Apr 2, 2025) · Denoising, Knowledge Distillation
Random Copolymer inverse design system orienting on Accurate discovering of Antimicrobial peptide-mimetic copolymers (Nov 30, 2022) · Activity Prediction, Knowledge Distillation
RangeAugment: Efficient Online Augmentation with Range Learning (Dec 20, 2022) · Knowledge Distillation, Object Detection
RankByGene: Gene-Guided Histopathology Representation Learning Through Cross-Modal Ranking Consistency (Nov 22, 2024) · Knowledge Distillation, Representation Learning
RankDistil: Knowledge Distillation for Ranking (Apr 13, 2021) · Document Ranking, Knowledge Distillation
RankDVQA-mini: Knowledge Distillation-Driven Deep Video Quality Assessment (Dec 14, 2023) · Knowledge Distillation, Model Compression
Ranking-aware Continual Learning for LiDAR Place Recognition (May 12, 2025) · Autonomous Driving, Continual Learning
MotherNets: Rapid Deep Ensemble Learning (Sep 12, 2018) · Clustering, Clustering Ensemble
Rationalization Models for Text-to-SQL (Feb 10, 2025) · Knowledge Distillation, Language Modeling
RAVIR: A Dataset and Methodology for the Semantic Segmentation and Quantitative Analysis of Retinal Arteries and Veins in Infrared Reflectance Imaging (Mar 28, 2022) · Domain Adaptation, Knowledge Distillation
RAWtoBit: A Fully End-to-end Camera ISP Network (Aug 16, 2022) · Image Compression, Knowledge Distillation
RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis (Oct 29, 2023) · Image Classification, Knowledge Distillation
Lightweight Embedded FPGA Deployment of Learned Image Compression with Knowledge Distillation and Hybrid Quantization (Mar 5, 2025) · Image Compression, Knowledge Distillation
RdimKD: Generic Distillation Paradigm by Dimensionality Reduction (Dec 14, 2023) · Dimensionality Reduction, Knowledge Distillation
Re2G: Retrieve, Rerank, Generate (Jan 16, 2022) · Fact Checking, GPU
Real-time Monocular Depth Estimation with Sparse Supervision on Mobile (May 25, 2021) · Autonomous Vehicles, Depth Estimation