Follow Your Path: a Progressive Method for Knowledge Distillation · Jul 20, 2021 · Knowledge Distillation
For the Misgendered Chinese in Gender Bias Research: Multi-Task Learning with Knowledge Distillation for Pinyin Name-Gender Prediction · May 10, 2024 · Gender Prediction, Knowledge Distillation
Forward-Backward Knowledge Distillation for Continual Clustering · May 29, 2024 · Clustering, Continual Learning
Foundational Model for Electron Micrograph Analysis: Instruction-Tuning Small-Scale Language-and-Vision Assistant for Enterprise Adoption · Aug 23, 2024 · Instruction Following, Knowledge Distillation
FPGA Resource-aware Structured Pruning for Real-Time Neural Networks · Aug 9, 2023 · Classification, image-classification
FreeKD: Free-direction Knowledge Distillation for Graph Neural Networks · Jun 14, 2022 · Knowledge Distillation, reinforcement-learning
FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models · Jun 14, 2022 · Cross-Lingual Transfer, Diagnostic
FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning · May 28, 2021 · DeepFake Detection, Domain Adaptation
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks · May 9, 2024 · Knowledge Distillation, Model Compression
From Data to Modeling: Fully Open-vocabulary Scene Graph Generation · May 26, 2025 · Graph Generation, Knowledge Distillation
From Easy to Hard: Learning Curricular Shape-aware Features for Robust Panoptic Scene Graph Generation · Jul 12, 2024 · Graph Generation, Knowledge Distillation
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels · Mar 23, 2023 · Knowledge Distillation, Self-Knowledge Distillation
From Large to Super-Tiny: End-to-End Optimization for Cost-Efficient LLMs · Apr 18, 2025 · Knowledge Distillation, Model Compression
From LLM to NMT: Advancing Low-Resource Machine Translation with Claude · Apr 22, 2024 · Knowledge Distillation, Language Modeling
From Multimodal to Unimodal Attention in Transformers using Knowledge Distillation · Oct 15, 2021 · Knowledge Distillation, Multimodal Deep Learning
From Two-Stream to One-Stream: Efficient RGB-T Tracking via Mutual Prompt Learning and Knowledge Distillation · Mar 25, 2024 · Knowledge Distillation, Object Tracking
From Wide to Deep: Dimension Lifting Network for Parameter-efficient Knowledge Graph Embedding · Mar 22, 2023 · Graph Embedding, Knowledge Distillation
FSAR: Federated Skeleton-based Action Recognition with Adaptive Topology Structure and Knowledge Distillation · Jun 19, 2023 · Action Recognition, Federated Learning
Fully Fine-tuned CLIP Models are Efficient Few-Shot Learners · Jul 4, 2024 · Domain Generalization, Few-Shot Learning
Fusing Bidirectional Chains of Thought and Reward Mechanisms: A Method for Enhancing Question-Answering Capabilities of Large Language Models for Chinese Intangible Cultural Heritage · May 13, 2025 · Knowledge Distillation, Large Language Model
Future-Guided Incremental Transformer for Simultaneous Translation · Dec 23, 2020 · Knowledge Distillation, Translation
Fuzzy Knowledge Distillation from High-Order TSK to Low-Order TSK · Feb 16, 2023 · Benchmarking, Knowledge Distillation
GAI-Enabled Explainable Personalized Federated Semi-Supervised Learning · Oct 11, 2024 · Federated Learning, Knowledge Distillation
Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models · Oct 7, 2020 · All, Knowledge Distillation
GAML-BERT: Improving BERT Early Exiting by Gradient Aligned Mutual Learning · Nov 1, 2021 · Knowledge Distillation
GAN-Knowledge Distillation for one-stage Object Detection · Jun 20, 2019 · Knowledge Distillation, Object
Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher · Oct 5, 2024 · Knowledge Distillation
GazeGen: Gaze-Driven User Interaction for Visual Content Generation · Nov 7, 2024 · Gaze Estimation, Knowledge Distillation
G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation · Aug 17, 2021 · Knowledge Distillation, object-detection
GenDistiller: Distilling Pre-trained Language Models based on Generative Models · Oct 20, 2023 · Knowledge Distillation, Language Modeling
GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model · Jun 12, 2024 · Knowledge Distillation, Self-Supervised Learning
Generalization in birdsong classification: impact of transfer learning methods and dataset characteristics · Sep 21, 2024 · Knowledge Distillation, Sound Classification
Generalized Continual Zero-Shot Learning · Nov 17, 2020 · Continual Learning, Knowledge Distillation
Generalized Uncertainty of Deep Neural Networks: Taxonomy and Applications · Feb 2, 2023 · Knowledge Distillation, Model Compression
General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference · Apr 29, 2020 · Knowledge Distillation, Quantization
Generate, Annotate, and Learn: Generative Models Advance Self-Training and Knowledge Distillation · Sep 29, 2021 · Few-Shot Learning, Knowledge Distillation
Generating Long Financial Report using Conditional Variational Autoencoders with Knowledge Distillation · Oct 23, 2020 · Decoder, Knowledge Distillation
Generating Synthetic Fair Syntax-agnostic Data by Learning and Distilling Fair Representation · Aug 20, 2024 · Fairness, Knowledge Distillation
Generation and Consolidation of Recollections for Efficient Deep Lifelong Learning · Jan 1, 2018 · Knowledge Distillation, Lifelong Learning
Generation-Distillation for Efficient Natural Language Understanding in Low-Data Settings · Jan 25, 2020 · General Classification, Knowledge Distillation
Generative Adversarial Simulator · Nov 23, 2020 · Data-free Knowledge Distillation, Knowledge Distillation
Generative Dataset Distillation Based on Self-knowledge Distillation · Jan 8, 2025 · Dataset Distillation, Knowledge Distillation
Generative Negative Text Replay for Continual Vision-Language Pretraining · Oct 31, 2022 · Continual Learning, image-classification
GenURL: A General Framework for Unsupervised Representation Learning · Oct 27, 2021 · Contrastive Learning, Dimensionality Reduction
GeoMask3D: Geometrically Informed Mask Selection for Self-Supervised Point Cloud Learning in 3D · May 20, 2024 · Knowledge Distillation, Self-Supervised Learning
GHOST: Grounded Human Motion Generation with Open Vocabulary Scene-and-Text Contexts · Apr 8, 2024 · Descriptive, Image Segmentation
GhostNetV3: Exploring the Training Strategies for Compact Models · Apr 17, 2024 · Image Classification, Knowledge Distillation
On-Policy Distillation of Language Models: Learning from Self-Generated Mistakes · Jun 23, 2023 · Arithmetic Reasoning, Knowledge Distillation
Global Intervention and Distillation for Federated Out-of-Distribution Generalization · Apr 1, 2025 · Attribute, Data Augmentation
Local-Global Knowledge Distillation in Heterogeneous Federated Learning with Non-IID Data · Jun 30, 2021 · Federated Learning, Knowledge Distillation