- On-Policy Distillation of Language Models: Learning from Self-Generated Mistakes (Jun 23, 2023) [Arithmetic Reasoning, Knowledge Distillation]
- GhostNetV3: Exploring the Training Strategies for Compact Models (Apr 17, 2024) [Image Classification, Knowledge Distillation]
- GHOST: Grounded Human Motion Generation with Open Vocabulary Scene-and-Text Contexts (Apr 8, 2024) [Descriptive, Image Segmentation]
- Data-Free Federated Class Incremental Learning with Diffusion-Based Generative Memory (May 22, 2024) [Class Incremental Learning]
- GeoMask3D: Geometrically Informed Mask Selection for Self-Supervised Point Cloud Learning in 3D (May 20, 2024) [Knowledge Distillation, Self-Supervised Learning]
- GenURL: A General Framework for Unsupervised Representation Learning (Oct 27, 2021) [Contrastive Learning, Dimensionality Reduction]
- Data-free Distillation with Degradation-prompt Diffusion for Multi-weather Image Restoration (Sep 5, 2024) [Image Restoration, Knowledge Distillation]
- Alleviating LLM-based Generative Retrieval Hallucination in Alipay Search (Mar 27, 2025) [Hallucination, Knowledge Distillation]
- Data-Free Distillation of Language Model by Text-to-Text Transfer (Nov 3, 2023) [Data-free Knowledge Distillation, Diversity]
- Generative Negative Text Replay for Continual Vision-Language Pretraining (Oct 31, 2022) [Continual Learning, Image Classification]
- Dense Depth Distillation with Out-of-Distribution Simulated Images (Aug 26, 2022) [Data-free Knowledge Distillation, Depth Estimation]
- Generative Dataset Distillation Based on Self-knowledge Distillation (Jan 8, 2025) [Dataset Distillation, Knowledge Distillation]
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks (May 8, 2022) [Generative Adversarial Network, Graph Classification]
- Alleviating Catastrophic Forgetting of Incremental Object Detection via Within-Class and Between-Class Knowledge Distillation (Jan 1, 2023) [Knowledge Distillation, Object Detection]
- Adaptive Knowledge Distillation between Text and Speech Pre-trained Models (Mar 7, 2023) [Knowledge Distillation, Spoken Language Understanding]
- A Classifier-Free Incremental Learning Framework for Scalable Medical Image Segmentation (May 25, 2024) [Contrastive Learning, Image Segmentation]
- Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging (May 15, 2025) [Continual Learning, Diagnostic]
- Generative Adversarial Simulator (Nov 23, 2020) [Data-free Knowledge Distillation, Knowledge Distillation]
- Generation-Distillation for Efficient Natural Language Understanding in Low-Data Settings (Jan 25, 2020) [General Classification, Knowledge Distillation]
- Generation and Consolidation of Recollections for Efficient Deep Lifelong Learning (Jan 1, 2018) [Knowledge Distillation, Lifelong Learning]
- Generating Synthetic Fair Syntax-agnostic Data by Learning and Distilling Fair Representation (Aug 20, 2024) [Fairness, Knowledge Distillation]
- Generating Long Financial Report using Conditional Variational Autoencoders with Knowledge Distillation (Oct 23, 2020) [Decoder, Knowledge Distillation]
- Generate, Annotate, and Learn: Generative Models Advance Self-Training and Knowledge Distillation (Sep 29, 2021) [Few-Shot Learning, Knowledge Distillation]
- General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference (Apr 29, 2020) [Knowledge Distillation, Quantization]
- Data-Efficient Ranking Distillation for Image Retrieval (Jul 10, 2020) [Image Retrieval, Knowledge Distillation]
- Generalized Uncertainty of Deep Neural Networks: Taxonomy and Applications (Feb 2, 2023) [Knowledge Distillation, Model Compression]
- Data-efficient Event Camera Pre-training via Disentangled Masked Modeling (Mar 1, 2024) [Knowledge Distillation, Self-Supervised Learning]
- Data Efficient Acoustic Scene Classification using Teacher-Informed Confusing Class Instruction (Sep 18, 2024) [Acoustic Scene Classification, Data Augmentation]
- Better Knowledge Enhancement for Privacy-Preserving Cross-Project Defect Prediction (Dec 23, 2024) [Federated Learning, Knowledge Distillation]
- Generalized Continual Zero-Shot Learning (Nov 17, 2020) [Continual Learning, Knowledge Distillation]
- Generalization in birdsong classification: impact of transfer learning methods and dataset characteristics (Sep 21, 2024) [Knowledge Distillation, Sound Classification]
- Data-Driven Compression of Convolutional Neural Networks (Nov 28, 2019) [Knowledge Distillation, Model Compression]
- Alignment Knowledge Distillation for Online Streaming Attention-based Speech Recognition (Feb 28, 2021) [Automatic Speech Recognition (ASR)]
- Adaptive Instance Distillation for Object Detection in Autonomous Driving (Jan 26, 2022) [Autonomous Driving, Knowledge Distillation]
- GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model (Jun 12, 2024) [Knowledge Distillation, Self-Supervised Learning]
- GenDistiller: Distilling Pre-trained Language Models based on Generative Models (Oct 20, 2023) [Knowledge Distillation, Language Modeling]
- G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation (Aug 17, 2021) [Knowledge Distillation, Object Detection]
- GazeGen: Gaze-Driven User Interaction for Visual Content Generation (Nov 7, 2024) [Gaze Estimation, Knowledge Distillation]
- Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher (Oct 5, 2024) [Knowledge Distillation]
- GAN-Knowledge Distillation for one-stage Object Detection (Jun 20, 2019) [Knowledge Distillation, Object Detection]
- GAML-BERT: Improving BERT Early Exiting by Gradient Aligned Mutual Learning (Nov 1, 2021) [Knowledge Distillation]
- DASECount: Domain-Agnostic Sample-Efficient Wireless Indoor Crowd Counting via Few-shot Learning (Nov 18, 2022) [Crowd Counting, Few-Shot Learning]
- BeSound: Bluetooth-Based Position Estimation Enhancing with Cross-Modality Distillation (Apr 24, 2024) [Knowledge Distillation, Position]
- Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models (Oct 7, 2020) [Knowledge Distillation]
- GAI-Enabled Explainable Personalized Federated Semi-Supervised Learning (Oct 11, 2024) [Federated Learning, Knowledge Distillation]
- BERT Learns to Teach: Knowledge Distillation with Meta Learning (Aug 17, 2021) [Knowledge Distillation, Meta-Learning]
- Fuzzy Knowledge Distillation from High-Order TSK to Low-Order TSK (Feb 16, 2023) [Benchmarking, Knowledge Distillation]
- Future-Guided Incremental Transformer for Simultaneous Translation (Dec 23, 2020) [Knowledge Distillation, Translation]
- DAKD: Data Augmentation and Knowledge Distillation using Diffusion Models for SAR Oil Spill Segmentation (Dec 11, 2024) [Data Augmentation, Knowledge Distillation]
- Fusing Bidirectional Chains of Thought and Reward Mechanisms: A Method for Enhancing Question-Answering Capabilities of Large Language Models for Chinese Intangible Cultural Heritage (May 13, 2025) [Knowledge Distillation, Large Language Model]