- Edge Bias in Federated Learning and its Solution by Buffered Knowledge Distillation (Oct 20, 2020). Tags: Federated Learning, Knowledge Distillation
- Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models (Oct 7, 2020). Tags: Knowledge Distillation
- ActivityCLIP: Enhancing Group Activity Recognition by Mining Complementary Information from Text to Supplement Image Modality (Jul 29, 2024). Tags: Activity Recognition, Group Activity Recognition
- GAN-Knowledge Distillation for one-stage Object Detection (Jun 20, 2019). Tags: Knowledge Distillation, Object
- Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher (Oct 5, 2024). Tags: Knowledge Distillation
- GazeGen: Gaze-Driven User Interaction for Visual Content Generation (Nov 7, 2024). Tags: Gaze Estimation, Knowledge Distillation
- End-to-End Automatic Speech Recognition with Deep Mutual Learning (Feb 16, 2021). Tags: Automatic Speech Recognition (ASR)
- Endpoints Weight Fusion for Class Incremental Semantic Segmentation (Jan 1, 2023). Tags: Class Incremental Learning
- EncodeNet: A Framework for Boosting DNN Accuracy with Entropy-driven Generalized Converting Autoencoder (Apr 21, 2024). Tags: Image Classification
- Enabling Weak Client Participation via On-device Knowledge Distillation in Heterogenous Federated Learning (Mar 14, 2025). Tags: Federated Learning, Knowledge Distillation
- Compositional Data Augmentation for Abstractive Conversation Summarization (Nov 16, 2021). Tags: Conversation Summarization, Data Augmentation
- Asynchronous Convergence in Multi-Task Learning via Knowledge Distillation from Converged Tasks (Jul 1, 2022). Tags: Knowledge Distillation, Multi-Task Learning
- Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation (Mar 12, 2022). Tags: Image Captioning, Knowledge Distillation
- Generalized Continual Zero-Shot Learning (Nov 17, 2020). Tags: Continual Learning, Knowledge Distillation
- Data Efficient Acoustic Scene Classification using Teacher-Informed Confusing Class Instruction (Sep 18, 2024). Tags: Acoustic Scene Classification, Data Augmentation
- Generalized Uncertainty of Deep Neural Networks: Taxonomy and Applications (Feb 2, 2023). Tags: Knowledge Distillation, Model Compression
- Data-efficient Event Camera Pre-training via Disentangled Masked Modeling (Mar 1, 2024). Tags: Knowledge Distillation, Self-Supervised Learning
- Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again (Oct 10, 2022). Tags: Knowledge Distillation
- General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference (Apr 29, 2020). Tags: Knowledge Distillation, Quantization
- Generate, Annotate, and Learn: Generative Models Advance Self-Training and Knowledge Distillation (Sep 29, 2021). Tags: Few-Shot Learning, Knowledge Distillation
- Generating Long Financial Report using Conditional Variational Autoencoders with Knowledge Distillation (Oct 23, 2020). Tags: Decoder, Knowledge Distillation
- Complex Emotion Recognition System using basic emotions via Facial Expression, EEG, and ECG Signals: a review (Sep 9, 2024). Tags: Electroencephalogram (EEG)
- Generation and Consolidation of Recollections for Efficient Deep Lifelong Learning (Jan 1, 2018). Tags: Knowledge Distillation, Lifelong Learning
- Generation-Distillation for Efficient Natural Language Understanding in Low-Data Settings (Jan 25, 2020). Tags: General Classification, Knowledge Distillation
- Generative Adversarial Simulator (Nov 23, 2020). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification (Oct 25, 2020). Tags: 3D Point Cloud Classification, General Classification
- AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource languages (Feb 25, 2025). Tags: Knowledge Distillation, Language Modeling
- I2D2: Inductive Knowledge Distillation with NeuroLogic and Self-Imitation (Dec 19, 2022). Tags: Imitation Learning, Knowledge Distillation
- I^2KD-SLU: An Intra-Inter Knowledge Distillation Framework for Zero-Shot Cross-Lingual Spoken Language Understanding (Oct 4, 2023). Tags: Intent Detection, Knowledge Distillation
- Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval (Mar 27, 2023). Tags: Knowledge Distillation, Retrieval
- Empirical Evaluation of Knowledge Distillation from Transformers to Subquadratic Language Models (Apr 19, 2025). Tags: Knowledge Distillation, State Space Models
- Complete-to-Partial 4D Distillation for Self-Supervised Point Cloud Sequence Representation Learning (Dec 10, 2022). Tags: Knowledge Distillation, Representation Learning
- Knowledge distillation for optimization of quantized deep neural networks (Sep 4, 2019). Tags: Knowledge Distillation
- Emo Pillars: Knowledge Distillation to Support Fine-Grained Context-Aware and Context-Less Emotion Classification (Apr 23, 2025). Tags: Emotion Classification, GPU
- A Framework for Double-Blind Federated Adaptation of Foundation Models (Feb 3, 2025). Tags: Federated Learning, Image Classification
- Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation (Jul 6, 2021). Tags: Domain Generalization, Image Classification
- EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval (Jan 27, 2023). Tags: Information Retrieval, Knowledge Distillation
- Completely Heterogeneous Federated Learning (Oct 28, 2022). Tags: Data-free Knowledge Distillation, Federated Learning
- GhostNetV3: Exploring the Training Strategies for Compact Models (Apr 17, 2024). Tags: Image Classification, Knowledge Distillation
- Embedding Compression for Teacher-to-Student Knowledge Transfer (Feb 9, 2024). Tags: Knowledge Distillation, Transfer Learning
- On-Policy Distillation of Language Models: Learning from Self-Generated Mistakes (Jun 23, 2023). Tags: Arithmetic Reasoning, Knowledge Distillation
- Asymmetric Image Retrieval with Cross Model Compatible Ensembles (Mar 30, 2023). Tags: Diversity, Face Recognition
- ABKD: Graph Neural Network Compression with Attention-Based Knowledge Distillation (Oct 24, 2023). Tags: Drug Discovery, Fake News Detection
- Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021). Tags: Dynamic Neural Networks, Knowledge Distillation
- ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024). Tags: Knowledge Distillation, LiDAR Semantic Segmentation
- Comparison of Soft and Hard Target RNN-T Distillation for Large-scale ASR (Oct 11, 2022). Tags: Automatic Speech Recognition (ASR)
- Global Intervention and Distillation for Federated Out-of-Distribution Generalization (Apr 1, 2025). Tags: Attribute, Data Augmentation
- ADPS: Asymmetric Distillation Post-Segmentation for Image Anomaly Detection (Oct 19, 2022). Tags: Anomaly Detection, Anomaly Localization
- VizECGNet: Visual ECG Image Network for Cardiovascular Diseases Classification with Multi-Modal Training and Knowledge Distillation (Aug 6, 2024). Tags: ECG Classification, Knowledge Distillation