Correlation-Decoupled Knowledge Distillation for Multimodal Sentiment Analysis with Incomplete Modalities (Apr 25, 2024) - Disentanglement, Knowledge Distillation
CORSD: Class-Oriented Relational Self Distillation (Apr 28, 2023) - Knowledge Distillation, Model Compression
Cosine Similarity Knowledge Distillation for Individual Class Information Transfer (Nov 24, 2023) - Knowledge Distillation, Model Compression
Cost-effective Deployment of BERT Models in Serverless Environment (Mar 19, 2021) - Knowledge Distillation, Semantic Textual Similarity
CoT2Align: Cross-Chain of Thought Distillation via Optimal Transport Alignment for Language Models with Different Tokenizers (Feb 24, 2025) - Knowledge Distillation
CoT-Drive: Efficient Motion Forecasting for Autonomous Driving with LLMs and Chain-of-Thought Prompting (Mar 10, 2025) - Autonomous Driving, Knowledge Distillation
Co-training and Co-distillation for Quality Improvement and Compression of Language Models (Nov 6, 2023) - Data Augmentation, Knowledge Distillation
Coupled End-to-End Transfer Learning With Generalized Fisher Information (Jun 1, 2018) - Decoder, Domain Adaptation
CoupleFace: Relation Matters for Face Recognition Distillation (Apr 12, 2022) - Face Recognition, Knowledge Distillation
CourseGPT-zh: an Educational Large Language Model Based on Knowledge Distillation Incorporating Prompt Optimization (May 8, 2024) - Diversity, Knowledge Distillation
CovidCare: Transferring Knowledge from Existing EMR to Emerging Epidemic for Interpretable Prognosis (Jul 17, 2020) - Diagnostic, Knowledge Distillation
Creating a Good Teacher for Knowledge Distillation in Acoustic Scene Classification (Mar 14, 2025) - Acoustic Scene Classification, Knowledge Distillation
Creating Lightweight Object Detectors with Model Compression for Deployment on Edge Devices (May 6, 2019) - Knowledge Distillation, Model Compression
CREFT: Sequential Multi-Agent LLM for Character Relation Extraction (May 30, 2025) - Knowledge Distillation, Language Modeling
CRKD: Enhanced Camera-Radar Object Detection with Cross-modality Knowledge Distillation (Mar 28, 2024) - 3D Object Detection, Autonomous Driving
Cross Architecture Distillation for Face Recognition (Jun 26, 2023) - Face Recognition, Knowledge Distillation
Cross-Architecture Knowledge Distillation (Jul 12, 2022) - Knowledge Distillation
Cross-Class Feature Augmentation for Class Incremental Learning (Apr 4, 2023) - Class Incremental Learning
Cross domain knowledge compression in realtime optical flow prediction on ultrasound sequences (Feb 4, 2022) - Knowledge Distillation, Optical Flow Estimation
Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation (May 19, 2024) - Knowledge Distillation, Pose Estimation
Cross-Level Multi-Instance Distillation for Self-Supervised Fine-Grained Visual Categorization (Jan 16, 2024) - Fine-Grained Visual Categorization, Knowledge Distillation
Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages (May 25, 2023) - Knowledge Distillation, Machine Translation
Cross-lingual Knowledge Distillation via Flow-based Voice Conversion for Robust Polyglot Text-To-Speech (Sep 15, 2023) - Knowledge Distillation, Speech Synthesis
Cross-lingual Machine Reading Comprehension with Language Branch Knowledge Distillation (Oct 27, 2020) - Knowledge Distillation, Machine Reading Comprehension
Cross-Lingual NER for Financial Transaction Data in Low-Resource Languages (Jul 16, 2023) - Cross-Lingual NER, Knowledge Distillation
Cross-modal Contrastive Distillation for Instructional Activity Anticipation (Jan 18, 2022) - Knowledge Distillation
Cross Modal Distillation for Flood Extent Mapping (Feb 16, 2023) - Knowledge Distillation
Cross-modal knowledge distillation for action recognition (Oct 10, 2019) - Action Recognition, Knowledge Distillation
Crossmodal Knowledge Distillation with WordNet-Relaxed Text Embeddings for Robust Image Classification (Mar 31, 2025) - Image Classification
Cross-Resolution Face Recognition via Prior-Aided Face Hallucination and Residual Knowledge Distillation (May 26, 2019) - Face Hallucination, Face Recognition
Canine EEG Helps Human: Cross-Species and Cross-Modality Epileptic Seizure Detection via Multi-Space Alignment (Dec 18, 2024) - Brain Computer Interface, Diagnostic
Cross-Task Knowledge Distillation in Multi-Task Recommendation (Feb 20, 2022) - Knowledge Distillation, Multi-Task Learning
Crowd Counting with Online Knowledge Learning (Mar 18, 2023) - Crowd Counting, Edge Computing
CTC Blank Triggered Dynamic Layer-Skipping for Efficient CTC-based Speech Recognition (Jan 4, 2024) - Knowledge Distillation, Speech Recognition
CULL-MT: Compression Using Language and Layer pruning for Machine Translation (Nov 10, 2024) - Knowledge Distillation, Machine Translation
CustomKD: Customizing Large Vision Foundation for Edge Model Improvement via Knowledge Distillation (Mar 23, 2025) - Domain Adaptation, Knowledge Distillation
D^3ETR: Decoder Distillation for Detection Transformer (Nov 17, 2022) - Decoder, Knowledge Distillation
D3T-GAN: Data-Dependent Domain Transfer GANs for Few-shot Image Generation (May 12, 2022) - Image Generation, Knowledge Distillation
DA-CIL: Towards Domain Adaptive Class-Incremental 3D Object Detection (Dec 5, 2022) - 3D Object Detection, Class Incremental Learning
DaFKD: Domain-Aware Federated Knowledge Distillation (Jan 1, 2023) - Knowledge Distillation
DAKD: Data Augmentation and Knowledge Distillation using Diffusion Models for SAR Oil Spill Segmentation (Dec 11, 2024) - Data Augmentation, Knowledge Distillation
DASECount: Domain-Agnostic Sample-Efficient Wireless Indoor Crowd Counting via Few-shot Learning (Nov 18, 2022) - Crowd Counting, Few-Shot Learning
Data-Driven Compression of Convolutional Neural Networks (Nov 28, 2019) - Knowledge Distillation, Model Compression
Data Efficient Acoustic Scene Classification using Teacher-Informed Confusing Class Instruction (Sep 18, 2024) - Acoustic Scene Classification, Data Augmentation
Data-efficient Event Camera Pre-training via Disentangled Masked Modeling (Mar 1, 2024) - Knowledge Distillation, Self-Supervised Learning
Data-Efficient Ranking Distillation for Image Retrieval (Jul 10, 2020) - Image Retrieval, Knowledge Distillation
Data-Free Adversarial Knowledge Distillation for Graph Neural Networks (May 8, 2022) - Generative Adversarial Network, Graph Classification
Dense Depth Distillation with Out-of-Distribution Simulated Images (Aug 26, 2022) - Data-free Knowledge Distillation, Depth Estimation
Data-Free Distillation of Language Model by Text-to-Text Transfer (Nov 3, 2023) - Data-free Knowledge Distillation, Diversity