C3R: Channel Conditioned Cell Representations for unified evaluation in microscopy imaging (May 24, 2025) [Knowledge Distillation]
CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation (Apr 30, 2025) [Data-free Knowledge Distillation, Knowledge Distillation]
CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence (Oct 17, 2024) [Binary Classification, Knowledge Distillation]
CAMeMBERT: Cascading Assistant-Mediated Multilingual BERT (Dec 22, 2022) [Knowledge Distillation]
Can a student Large Language Model perform as well as it's teacher? (Oct 3, 2023) [Knowledge Distillation, Language Modeling]
Can Current Explainability Help Provide References in Clinical Notes to Support Humans Annotate Medical Codes? (Oct 28, 2022) [Knowledge Distillation, Medical Code Prediction]
Can LLMs Revolutionize the Design of Explainable and Efficient TinyML Models? (Apr 13, 2025) [Computational Efficiency, Efficient Neural Network]
Can Low-Rank Knowledge Distillation in LLMs be Useful for Microelectronic Reasoning? (Jun 19, 2024) [Knowledge Distillation]
Can Model Compression Improve NLP Fairness (Jan 21, 2022) [Fairness, Knowledge Distillation]
Can Small Language Models be Good Reasoners for Sequential Recommendation? (Mar 7, 2024) [Knowledge Distillation, Recommendation Systems]
Can Small Language Models Help Large Language Models Reason Better?: LM-Guided Chain-of-Thought (Apr 4, 2024) [Extractive Question-Answering, Knowledge Distillation]
Can Students Beyond The Teacher? Distilling Knowledge from Teacher's Bias (Dec 13, 2024) [Knowledge Distillation, Model Compression]
Can Students Outperform Teachers in Knowledge Distillation based Model Compression? (Jan 1, 2021) [Knowledge Distillation, Model Compression]
Can We Use Probing to Better Understand Fine-tuning and Knowledge Distillation of the BERT NLU? (Jan 27, 2023) [Knowledge Distillation, Natural Language Understanding]
CAP-GAN: Towards Adversarial Robustness with Cycle-consistent Attentional Purification (Feb 15, 2021) [Adversarial Attack, Adversarial Robustness]
CapsuleRRT: Relationships-Aware Regression Tracking via Capsules (Jun 19, 2021) [Image Classification]
Capturing Rich Behavior Representations: A Dynamic Action Semantic-Aware Graph Transformer for Video Captioning (Feb 19, 2025) [Knowledge Distillation, Object]
Cascaded channel pruning using hierarchical self-distillation (Aug 16, 2020) [Knowledge Distillation, Model Compression]
CASIA's System for IWSLT 2020 Open Domain Translation (Jul 1, 2020) [Knowledge Distillation, Machine Translation]
CAST: Contrastive Adaptation and Distillation for Semi-Supervised Instance Segmentation (May 28, 2025) [Domain Adaptation, Instance Segmentation]
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation (Jun 19, 2023) [Knowledge Distillation, Relation]
Causality Enhanced Origin-Destination Flow Prediction in Data-Scarce Cities (Mar 9, 2025) [Graph Attention, Knowledge Distillation]
Causal Self-supervised Pretrained Frontend with Predictive Code for Speech Separation (Apr 3, 2025) [Decoder, Knowledge Distillation]
Causes of Catastrophic Forgetting in Class-Incremental Semantic Segmentation (Sep 16, 2022) [Class Incremental Learning]
CBNN: 3-Party Secure Framework for Customized Binary Neural Networks Inference (Dec 21, 2024) [Binarization, Knowledge Distillation]
CCFace: Classification Consistency for Low-Resolution Face Recognition (Aug 18, 2023) [Classification, Classification Consistency]
CCS: Continuous Learning for Customized Incremental Wireless Sensing Services (Dec 6, 2024) [Action Recognition, Knowledge Distillation]
CDKT-FL: Cross-Device Knowledge Transfer using Proxy Dataset in Federated Learning (Apr 4, 2022) [Federated Learning, Knowledge Distillation]
CEKD: Cross Ensemble Knowledge Distillation for Augmented Fine-grained Data (Mar 13, 2022) [Data Augmentation, Knowledge Distillation]
Centerness-based Instance-aware Knowledge Distillation with Task-wise Mutual Lifting for Object Detection on Drone Imagery (Nov 5, 2024) [Knowledge Distillation, Object Detection]
CES-KD: Curriculum-based Expert Selection for Guided Knowledge Distillation (Sep 15, 2022) [Knowledge Distillation]
Order of Compression: A Systematic and Optimal Sequence to Combinationally Compress CNN (Mar 26, 2024) [Knowledge Distillation, Model Compression]
Channel Fingerprint Construction for Massive MIMO: A Deep Conditional Generative Approach (May 12, 2025) [Denoising, Knowledge Distillation]
Channel Planting for Deep Neural Networks using Knowledge Distillation (Nov 4, 2020) [Knowledge Distillation, Network Pruning]
Channel Self-Supervision for Online Knowledge Distillation (Mar 22, 2022) [Diversity, Knowledge Distillation]
CILDA: Contrastive Data Augmentation using Intermediate Layer Knowledge Distillation (Apr 15, 2022) [Contrastive Learning, Data Augmentation]
Improving Acoustic Scene Classification with City Features (Mar 21, 2025) [Acoustic Scene Classification, Classification]
CK4Gen: A Knowledge Distillation Framework for Generating High-Utility Synthetic Survival Datasets in Healthcare (Oct 22, 2024) [Data Augmentation, Knowledge Distillation]
Claim Matching Beyond English to Scale Global Fact-Checking (Jun 1, 2021) [Fact Checking, Knowledge Distillation]
Class-aware Information for Logit-based Knowledge Distillation (Nov 27, 2022) [Knowledge Distillation]
CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks (Dec 5, 2021) [Classification, Continual Learning]
Classification of Diabetic Retinopathy Using Unlabeled Data and Knowledge Distillation (Sep 1, 2020) [Classification, General Classification]
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Evolvability (Dec 1, 2020) [Classification, Fairness]
Class-Incremental Continual Learning into the eXtended DER-verse (Jan 3, 2022) [Continual Learning, Knowledge Distillation]
Class-Incremental Few-Shot Event Detection (Apr 2, 2024) [Event Detection, Few-Shot Learning]
Class-Incremental Few-Shot Object Detection (May 17, 2021) [Clustering, Few-Shot Object Detection]
Class-Incremental Learning for Action Recognition in Videos (Mar 25, 2022) [Action Recognition, Action Recognition In Videos]
Class-Incremental Learning of Plant and Disease Detection: Growing Branches with Knowledge Distillation (Apr 13, 2023) [Class Incremental Learning]
Class Incremental Learning with Self-Supervised Pre-Training and Prototype Learning (Aug 4, 2023) [Class Incremental Learning]
Class Incremental Online Streaming Learning (Oct 20, 2021) [Class Incremental Learning]