- Exploring Multi-Modal Contextual Knowledge for Open-Vocabulary Object Detection (Aug 30, 2023) [Knowledge Distillation, Language Modeling]
- Exploring Self- and Cross-Triplet Correlations for Human-Object Interaction Detection (Jan 11, 2024) [Human-Object Interaction Detection, Knowledge Distillation]
- A Note on Knowledge Distillation Loss Function for Object Classification (Sep 14, 2021) [Knowledge Distillation, Model Compression]
- Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT (Jul 1, 2020) [Document Classification, General Classification]
- Extending Label Smoothing Regularization with Self-Knowledge Distillation (Sep 11, 2020) [Knowledge Distillation, Self-Knowledge Distillation]
- Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation (Jan 22, 2025) [Knowledge Distillation]
- Extracting knowledge from features with multilevel abstraction (Dec 4, 2021) [Data Augmentation, Knowledge Distillation]
- Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation (Apr 24, 2021) [Knowledge Distillation]
- Extracurricular Learning: Knowledge Transfer Beyond Empirical Distribution (Jun 30, 2020) [Image Classification, Knowledge Distillation]
- Extreme Compression for Pre-trained Transformers Made Simple and Efficient (Jun 4, 2022) [Knowledge Distillation, Quantization]
- Extreme compression of sentence-transformer ranker models: faster inference, longer battery life, and less storage on edge devices (Jun 29, 2022) [Dimensionality Reduction, Knowledge Distillation]
- Extremely Small BERT Models from Mixed-Vocabulary Training (Sep 25, 2019) [Knowledge Distillation, Language Modelling]
- Face to Cartoon Incremental Super-Resolution using Knowledge Distillation (Jan 27, 2024) [Hallucination, Incremental Learning]
- Factorized Distillation: Training Holistic Person Re-identification Model by Distilling an Ensemble of Partial ReID Models (Nov 20, 2018) [Knowledge Distillation, Person Re-Identification]
- Factorized RVQ-GAN For Disentangled Speech Tokenization (Jun 18, 2025) [Disentanglement, Knowledge Distillation]
- Factual Dialogue Summarization via Learning from Large Language Models (Jun 20, 2024) [Contrastive Learning, Data Augmentation]
- Selective Cross-Task Distillation (Apr 25, 2022) [Knowledge Distillation]
- Failure-Resilient Distributed Inference with Model Compression over Heterogeneous Edge Devices (Jun 20, 2024) [Knowledge Distillation, Model Compression]
- Fair Feature Distillation for Visual Recognition (May 27, 2021) [Fairness, Knowledge Distillation]
- Fair Feature Importance Scores for Interpreting Tree-Based Methods and Surrogates (Oct 6, 2023) [Fairness, Feature Importance]
- Fairly Predicting Graft Failure in Liver Transplant for Organ Assigning (Feb 18, 2023) [Fairness, Knowledge Distillation]
- Fairness Continual Learning Approach to Semantic Scene Understanding in Open-World Environments (May 25, 2023) [Continual Learning, Continual Semantic Segmentation]
- Fair Text to Medical Image Diffusion Model with Subgroup Distribution Aligned Tuning (Jun 21, 2024) [Knowledge Distillation]
- Faithful Knowledge Distillation (Jun 7, 2023) [Adversarial Robustness, Knowledge Distillation]
- Fake It Till Make It: Federated Learning with Consensus-Oriented Generation (Dec 10, 2023) [Federated Learning, Knowledge Distillation]
- Fall Detection using Knowledge Distillation Based Long short-term memory for Offline Embedded and Low Power Devices (Aug 24, 2023) [Knowledge Distillation, Time Series]
- False Negative Distillation and Contrastive Learning for Personalized Outfit Recommendation (Oct 13, 2021) [Contrastive Learning, Data Augmentation]
- FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection (Nov 11, 2022) [Action Unit Detection, Face Alignment]
- Fast and Efficient Once-For-All Networks for Diverse Hardware Deployment (Sep 29, 2021) [All, GPU]
- Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation (Sep 5, 2023) [Image Compression, Knowledge Distillation]
- Fast DistilBERT on CPUs (Oct 27, 2022) [Knowledge Distillation, Model Compression]
- Fast End-to-end Coreference Resolution for Korean (Nov 1, 2020) [Coreference Resolution]
- FasterAI: A Lightweight Library for Creating Sparse Neural Networks (Jul 3, 2022) [Knowledge Distillation]
- Faster Inference of Integer SWIN Transformer by Removing the GELU Activation (Feb 2, 2024) [GPU, Image Classification]
- Fast Real-time Personalized Speech Enhancement: End-to-End Enhancement Network (E3Net) and Knowledge Distillation (Apr 2, 2022) [Automatic Speech Recognition (ASR)]
- Fast Sampling Through The Reuse Of Attention Maps In Diffusion Models (Dec 13, 2023) [Image Generation, Knowledge Distillation]
- FastSR-NeRF: Improving NeRF Efficiency on Consumer Devices with A Simple Super-Resolution Pipeline (Dec 15, 2023) [GPU, Knowledge Distillation]
- Fast Streaming Transducer ASR Prototyping via Knowledge Distillation with Whisper (Sep 20, 2024) [Automatic Speech Recognition (ASR)]
- Fast Video Salient Object Detection via Spatiotemporal Knowledge Distillation (Oct 20, 2020) [Knowledge Distillation, Object]
- Feature Adversarial Distillation for Point Cloud Classification (Jun 25, 2023) [Classification, FAD]
- Feature Affinity Assisted Knowledge Distillation and Quantization of Deep Neural Networks on Label-Free Data (Feb 10, 2023) [Knowledge Distillation, Quantization]
- Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models (Apr 18, 2025) [Image Classification]
- Feature Alignment-Based Knowledge Distillation for Efficient Compression of Large Language Models (Dec 27, 2024) [Knowledge Distillation, Model Compression]
- Feature-Align Network with Knowledge Distillation for Efficient Denoising (Mar 2, 2021) [Decoder, Denoising]
- Feature-domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution (Nov 29, 2022) [Image Super-Resolution, Knowledge Distillation]
- Feature-based One-For-All: A Universal Framework for Heterogeneous Knowledge Distillation (Jan 15, 2025) [All, Knowledge Distillation]
- Feature Correlation-guided Knowledge Transfer for Federated Self-supervised Learning (Nov 14, 2022) [Feature Correlation, Federated Learning]
- Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning (Jul 14, 2025) [Federated Learning, Knowledge Distillation]
- Feature Fusion and Knowledge-Distilled Multi-Modal Multi-Target Detection (May 31, 2025) [Domain Adaptation, Knowledge Distillation]
- Feature Interaction Fusion Self-Distillation Network For CTR Prediction (Nov 12, 2024) [Click-Through Rate Prediction, Knowledge Distillation]