- Gemma 2: Improving Open Language Models at a Practical Size (Jul 31, 2024). Tags: Knowledge Distillation
- A Lightweight Low-Light Image Enhancement Network via Channel Prior and Gamma Correction (Feb 28, 2024). Tags: Image Enhancement, Knowledge Distillation
- Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation (May 19, 2024). Tags: Knowledge Distillation, Pose Estimation
- Cross domain knowledge compression in realtime optical flow prediction on ultrasound sequences (Feb 4, 2022). Tags: Knowledge Distillation, Optical Flow Estimation
- Adaptive Affinity-Based Generalization For MRI Imaging Segmentation Across Resource-Limited Settings (Apr 3, 2024). Tags: Data Integration, Knowledge Distillation
- Cross-Class Feature Augmentation for Class Incremental Learning (Apr 4, 2023). Tags: Class Incremental Learning
- Cross-Architecture Knowledge Distillation (Jul 12, 2022). Tags: Knowledge Distillation
- Cross Architecture Distillation for Face Recognition (Jun 26, 2023). Tags: Face Recognition, Knowledge Distillation
- BabyHGRN: Exploring RNNs for Sample-Efficient Training of Language Models (Dec 20, 2024). Tags: Knowledge Distillation, Language Modeling
- A Lightweight Domain Adversarial Neural Network Based on Knowledge Distillation for EEG-based Cross-subject Emotion Recognition (May 12, 2023). Tags: Electroencephalogram (EEG)
- Exploring Extreme Quantization in Spiking Language Models (May 4, 2024). Tags: Knowledge Distillation, Language Modeling
- CRKD: Enhanced Camera-Radar Object Detection with Cross-modality Knowledge Distillation (Mar 28, 2024). Tags: 3D Object Detection, Autonomous Driving
- CREFT: Sequential Multi-Agent LLM for Character Relation Extraction (May 30, 2025). Tags: Knowledge Distillation, Language Modeling
- AWF: Adaptive Weight Fusion for Enhanced Class Incremental Semantic Segmentation (Sep 13, 2024). Tags: Class-Incremental Semantic Segmentation, Knowledge Distillation
- Creating Lightweight Object Detectors with Model Compression for Deployment on Edge Devices (May 6, 2019). Tags: Knowledge Distillation, Model Compression
- A Light-weight Deep Learning Model for Remote Sensing Image Classification (Feb 25, 2023). Tags: Image Classification
- Creating a Good Teacher for Knowledge Distillation in Acoustic Scene Classification (Mar 14, 2025). Tags: Acoustic Scene Classification, Knowledge Distillation
- CovidCare: Transferring Knowledge from Existing EMR to Emerging Epidemic for Interpretable Prognosis (Jul 17, 2020). Tags: Diagnostic, Knowledge Distillation
- Aware of the History: Trajectory Forecasting with the Local Behavior Data (Jul 20, 2022). Tags: Knowledge Distillation, Prediction
- CourseGPT-zh: an Educational Large Language Model Based on Knowledge Distillation Incorporating Prompt Optimization (May 8, 2024). Tags: Diversity, Knowledge Distillation
- CoupleFace: Relation Matters for Face Recognition Distillation (Apr 12, 2022). Tags: Face Recognition, Knowledge Distillation
- A Knowledge Distillation framework for Multi-Organ Segmentation of Medaka Fish in Tomographic Image (Feb 24, 2023). Tags: Computed Tomography (CT), Image Segmentation
- Adapting While Learning: Grounding LLMs for Scientific Problems with Intelligent Tool Usage Adaptation (Nov 1, 2024). Tags: Epidemiology, Knowledge Distillation
- Coupled End-to-End Transfer Learning With Generalized Fisher Information (Jun 1, 2018). Tags: Decoder, Domain Adaptation
- Co-training and Co-distillation for Quality Improvement and Compression of Language Models (Nov 6, 2023). Tags: Data Augmentation, Knowledge Distillation
- CoT-Drive: Efficient Motion Forecasting for Autonomous Driving with LLMs and Chain-of-Thought Prompting (Mar 10, 2025). Tags: Autonomous Driving, Knowledge Distillation
- A vision transformer-based framework for knowledge transfer from multi-modal to mono-modal lymphoma subtyping models (Aug 2, 2023). Tags: Knowledge Distillation, Transfer Learning
- 1st Place Solution to the EPIC-Kitchens Action Anticipation Challenge 2022 (Jul 10, 2022). Tags: Action Anticipation, Knowledge Distillation
- CoT2Align: Cross-Chain of Thought Distillation via Optimal Transport Alignment for Language Models with Different Tokenizers (Feb 24, 2025). Tags: Knowledge Distillation
- Cost-effective Deployment of BERT Models in Serverless Environment (Jun 1, 2021). Tags: Knowledge Distillation, Semantic Textual Similarity
- AUTOSUMM: Automatic Model Creation for Text Summarization (Nov 1, 2021). Tags: Abstractive Text Summarization, Deep Learning
- Cosine Similarity Knowledge Distillation for Individual Class Information Transfer (Nov 24, 2023). Tags: Knowledge Distillation, Model Compression
- Adapting OC20-trained EquiformerV2 Models for High-Entropy Materials (Mar 14, 2024). Tags: Knowledge Distillation
- Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch (May 21, 2024). Tags: Knowledge Distillation
- Exploring Dual Model Knowledge Distillation for Anomaly Detection (Jun 27, 2023). Tags: Anomaly Detection, Feature Selection
- CORSD: Class-Oriented Relational Self Distillation (Apr 28, 2023). Tags: Knowledge Distillation, Model Compression
- Correlation-Decoupled Knowledge Distillation for Multimodal Sentiment Analysis with Incomplete Modalities (Apr 25, 2024). Tags: Disentanglement, Knowledge Distillation
- A Knowledge Distillation-Based Backdoor Attack in Federated Learning (Aug 12, 2022). Tags: Backdoor Attack, Federated Learning
- Automatic Mixed-Precision Quantization Search of BERT (Dec 30, 2021). Tags: Knowledge Distillation, Model Compression
- Corrected with the Latest Version: Make Robust Asynchronous Federated Learning Possible (Apr 5, 2025). Tags: Federated Learning, Knowledge Distillation
- Exploiting Unlabelled Photos for Stronger Fine-Grained SBIR (Mar 24, 2023). Tags: Image Retrieval, Knowledge Distillation
- Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models (Sep 19, 2024). Tags: Knowledge Distillation
- CoroNetGAN: Controlled Pruning of GANs via Hypernetworks (Mar 13, 2024). Tags: Knowledge Distillation
- ChromaDistill: Colorizing Monochrome Radiance Fields with Knowledge Distillation (Sep 14, 2023). Tags: 3DGS, Colorization
- Automatic Block-wise Pruning with Auxiliary Gating Structures for Deep Convolutional Neural Networks (May 7, 2022). Tags: Knowledge Distillation, Model Compression
- Adapting Models to Signal Degradation using Distillation (Apr 1, 2016). Tags: Domain Adaptation, Knowledge Distillation
- Coordinating Cross-modal Distillation for Molecular Property Prediction (Nov 30, 2022). Tags: Graph Regression, Graph Representation Learning
- Accelerating Molecular Graph Neural Networks via Knowledge Distillation (Jun 26, 2023). Tags: Data Augmentation, Knowledge Distillation
- Exploiting Knowledge Distillation for Few-Shot Image Generation (Sep 29, 2021). Tags: Diversity, Image Generation