Bidirectional Distillation: A Mixed-Play Framework for Multi-Agent Generalizable Behaviors (May 16, 2025). Tags: Knowledge Distillation, Multi-agent Reinforcement Learning
Ground Reaction Force Estimation via Time-aware Knowledge Distillation (Jun 12, 2025). Tags: Knowledge Distillation
3D-Augmented Contrastive Knowledge Distillation for Image-based Object Pose Estimation (Jun 2, 2022). Tags: Contrastive Learning, Knowledge Distillation
3D Denoisers are Good 2D Teachers: Molecular Pretraining via Denoising and Cross-Modal Distillation (Sep 8, 2023). Tags: Denoising, Knowledge Distillation
3D Face Alignment Through Fusion of Head Pose Information and Features (Aug 25, 2023). Tags: 3D Face Alignment, Face Alignment
3D Point Cloud Pre-training with Knowledge Distillation from 2D Images (Dec 17, 2022). Tags: Concept Alignment, Knowledge Distillation
A baseline revisited: Pushing the limits of multi-segment models for context-aware translation (Oct 19, 2022). Tags: Knowledge Distillation, Translation
A Bayesian Optimization Framework for Neural Network Compression (Oct 1, 2019). Tags: Bayesian Optimization, Knowledge Distillation
ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression (May 26, 2023). Tags: Knowledge Distillation
ABKD: Graph Neural Network Compression with Attention-Based Knowledge Distillation (Oct 24, 2023). Tags: Drug Discovery, Fake News Detection
ACAM-KD: Adaptive and Cooperative Attention Masking for Knowledge Distillation (Mar 8, 2025). Tags: Autonomous Driving, Feature Selection
Accelerating Diffusion Models with One-to-Many Knowledge Distillation (Oct 5, 2024). Tags: Image Generation, Knowledge Distillation
Accelerating Large Scale Knowledge Distillation via Dynamic Importance Sampling (Dec 3, 2018). Tags: Knowledge Distillation, Machine Translation
Accelerating Molecular Graph Neural Networks via Knowledge Distillation (Jun 26, 2023). Tags: Data Augmentation, Knowledge Distillation
Accelerating Transformer Decoding via a Hybrid of Self-attention and Recurrent Neural Network (Sep 5, 2019). Tags: Decoder, Knowledge Distillation
Accurate and Structured Pruning for Efficient Automatic Speech Recognition (May 31, 2023). Tags: Automatic Speech Recognition (ASR)
Accurate Knowledge Distillation with n-best Reranking (May 20, 2023). Tags: Knowledge Distillation, Reranking
A Classifier-Free Incremental Learning Framework for Scalable Medical Image Segmentation (May 25, 2024). Tags: Contrastive Learning, Image Segmentation
A Closer Look at Deep Learning Heuristics: Learning rate restarts, Warmup and Distillation (Oct 29, 2018). Tags: Dimensionality Reduction, Knowledge Distillation
A Closer Look at Knowledge Distillation with Features, Logits, and Gradients (Mar 18, 2022). Tags: Incremental Learning, Knowledge Distillation
A Closer Look at Rehearsal-Free Continual Learning (Mar 31, 2022). Tags: Continual Learning, Knowledge Distillation
A Closer Look at Wav2Vec2 Embeddings for On-Device Single-Channel Speech Enhancement (Mar 3, 2024). Tags: Automatic Speech Recognition, Keyword Spotting
A Cohesive Distillation Architecture for Neural Language Models (Jan 12, 2023). Tags: Knowledge Distillation, Language Modeling
A Comparative Analysis of Task-Agnostic Distillation Methods for Compressing Transformer Language Models (Oct 13, 2023). Tags: Knowledge Distillation
Supervised domain adaptation for building extraction from off-nadir aerial images (Nov 7, 2023). Tags: Domain Adaptation, Earth Observation
A Comprehensive Overhaul of Distilling Unconditional GANs (Sep 29, 2021). Tags: Knowledge Distillation
A Comprehensive Review of Knowledge Distillation in Computer Vision (Apr 1, 2024). Tags: Deep Learning, Knowledge Distillation
A Comprehensive Study of Class Incremental Learning Algorithms for Visual Tasks (Nov 3, 2020). Tags: Class Incremental Learning
A Comprehensive Survey of Compression Algorithms for Language Models (Jan 27, 2024). Tags: Knowledge Distillation, Quantization
A Comprehensive Survey on Knowledge Distillation of Diffusion Models (Apr 9, 2023). Tags: Knowledge Distillation, Survey
A Continual and Incremental Learning Approach for TinyML On-device Training Using Dataset Distillation and Model Size Adaption (Sep 11, 2024). Tags: Anomaly Detection, Computational Efficiency
A Contrastive Teacher-Student Framework for Novelty Detection under Style Shifts (Jan 28, 2025). Tags: Autonomous Driving, Knowledge Distillation
Acquiring Knowledge from Pre-trained Model to Neural Machine Translation (Dec 4, 2019). Tags: General Knowledge, Knowledge Distillation
A Cross-Domain Approach for Continuous Impression Recognition from Dyadic Audio-Visual-Physio Signals (Mar 25, 2022). Tags: Knowledge Distillation, Spoken Dialogue Systems
Action Spotting and Precise Event Detection in Sports: Datasets, Methods, and Challenges (May 6, 2025). Tags: Action Localization, Action Spotting
Activation Map Adaptation for Effective Knowledge Distillation (Oct 26, 2020). Tags: Knowledge Distillation, Model Compression
Active Class Incremental Learning for Imbalanced Datasets (Aug 25, 2020). Tags: Class Incremental Learning
Active Data Curation Effectively Distills Large-Scale Multimodal Models (Nov 27, 2024). Tags: Decoder, Image Captioning
Active Exploration of Multimodal Complementarity for Few-Shot Action Recognition (Jan 1, 2023). Tags: Action Recognition, Few-Shot Action Recognition
Active Large Language Model-based Knowledge Distillation for Session-based Recommendation (Dec 15, 2024). Tags: Active Learning, Knowledge Distillation
Active Learning for Lane Detection: A Knowledge Distillation Approach (Jan 1, 2021). Tags: 2D Object Detection, Active Learning
ActivityCLIP: Enhancing Group Activity Recognition by Mining Complementary Information from Text to Supplement Image Modality (Jul 29, 2024). Tags: Activity Recognition, Group Activity Recognition
Ada-DQA: Adaptive Diverse Quality-aware Feature Acquisition for Video Quality Assessment (Aug 1, 2023). Tags: Diversity, Knowledge Distillation
AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting (May 11, 2024). Tags: Knowledge Distillation, Model Compression
Adam: Dense Retrieval Distillation with Adaptive Dark Examples (Dec 20, 2022). Tags: Knowledge Distillation, Retrieval
Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains (Jun 25, 2021). Tags: Knowledge Distillation
Adapter-based Selective Knowledge Distillation for Federated Multi-domain Meeting Summarization (Aug 7, 2023). Tags: Federated Learning, Knowledge Distillation
AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation (Dec 26, 2023). Tags: Knowledge Distillation, Retrieval
Adapting Models to Signal Degradation using Distillation (Apr 1, 2016). Tags: Domain Adaptation, Knowledge Distillation
Adapting OC20-trained EquiformerV2 Models for High-Entropy Materials (Mar 14, 2024). Tags: Knowledge Distillation