- Compressing Recurrent Neural Networks for FPGA-accelerated Implementation in Fluorescence Lifetime Imaging (Oct 1, 2024). Tags: Computational Efficiency, Knowledge Distillation.
- Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation (Oct 1, 2024). Tags: Knowledge Distillation, Machine Translation.
- Advancing Medical Radiograph Representation Learning: A Hybrid Pre-training Paradigm with Multilevel Semantic Granularity (Oct 1, 2024). Tags: Decoder, Knowledge Distillation.
- Local-to-Global Self-Supervised Representation Learning for Diabetic Retinopathy Grading (Oct 1, 2024). Tags: Diabetic Retinopathy Grading, Image Classification.
- Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation (Sep 30, 2024). Tags: Data Augmentation, Knowledge Distillation.
- HYDRA-FL: Hybrid Knowledge Distillation for Robust and Accurate Federated Learning (Sep 30, 2024). Tags: Federated Learning, Knowledge Distillation.
- Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies (Sep 30, 2024). Tags: 2D Human Pose Estimation, Image Classification.
- Linear Projections of Teacher Embeddings for Few-Class Distillation (Sep 30, 2024). Tags: Binary Classification, Knowledge Distillation.
- InfantCryNet: A Data-driven Framework for Intelligent Analysis of Infant Cries (Sep 29, 2024). Tags: Knowledge Distillation, Model Compression.
- Tailored Federated Learning: Leveraging Direction Regulation & Knowledge Distillation (Sep 29, 2024). Tags: Federated Learning, Knowledge Distillation.
- Mind the Gap: Promoting Missing Modality Brain Tumor Segmentation with Alignment (Sep 28, 2024). Tags: Brain Tumor Segmentation, Knowledge Distillation.
- MiniVLN: Efficient Vision-and-Language Navigation by Progressive Knowledge Distillation (Sep 27, 2024). Tags: Knowledge Distillation, Vision and Language Navigation.
- Towards Diverse Device Heterogeneous Federated Learning via Task Arithmetic Knowledge Integration (Sep 27, 2024). Tags: Federated Learning, Knowledge Distillation.
- Harmonizing knowledge Transfer in Neural Network with Unified Distillation (Sep 27, 2024). Tags: Knowledge Distillation, Transfer Learning. Code available.
- Multi-modal Cross-domain Self-supervised Pre-training for fMRI and EEG Fusion (Sep 27, 2024). Tags: Data Augmentation, EEG.
- Semi-Supervised Bone Marrow Lesion Detection from Knee MRI Segmentation Using Mask Inpainting Models (Sep 27, 2024). Tags: Anomaly Detection, Knowledge Distillation.
- Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation (Sep 27, 2024). Tags: Knowledge Distillation, Transfer Learning.
- Kendall's τ Coefficient for Logits Distillation (Sep 26, 2024). Tags: Knowledge Distillation.
- Shape-intensity knowledge distillation for robust medical image segmentation (Sep 26, 2024). Tags: Image Segmentation, Knowledge Distillation.
- Weak-to-Strong Backdoor Attack for Large Language Models (Sep 26, 2024). Tags: Backdoor Attack, Knowledge Distillation. Code available.
- SelectiveKD: A semi-supervised framework for cancer detection in DBT through Knowledge Distillation and Pseudo-labeling (Sep 25, 2024). Tags: Cancer Classification, Knowledge Distillation.
- MT2KD: Towards A General-Purpose Encoder for Speech, Speaker, and Audio Events (Sep 25, 2024). Tags: Audio Tagging, Automatic Speech Recognition.
- Adverse Weather Optical Flow: Cumulative Homogeneous-Heterogeneous Adaptation (Sep 25, 2024). Tags: Domain Adaptation, Knowledge Distillation.
- Twin Network Augmentation: A Novel Training Strategy for Improved Spiking Neural Networks and Efficient Weight Quantization (Sep 24, 2024). Tags: Knowledge Distillation, Quantization.
- Privacy Evaluation Benchmarks for NLP Models (Sep 24, 2024). Tags: Knowledge Distillation.
- TS-HTFA: Advancing Time Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models (Sep 23, 2024). Tags: Contrastive Learning, Cross-Modal Alignment. Code available.
- Pre-trained Language Model and Knowledge Distillation for Lightweight Sequential Recommendation (Sep 23, 2024). Tags: Knowledge Distillation, Language Modeling.
- DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models (Sep 23, 2024). Tags: Knowledge Distillation, Transfer Learning.
- DilateQuant: Accurate and Efficient Diffusion Quantization via Weight Dilation (Sep 22, 2024). Tags: Image Generation, Knowledge Distillation. Code available.
- Prior Knowledge Distillation Network for Face Super-Resolution (Sep 22, 2024). Tags: Knowledge Distillation, Super-Resolution.
- EchoAtt: Attend, Copy, then Adjust for More Efficient Large Language Models (Sep 22, 2024). Tags: Knowledge Distillation.
- On Importance of Pruning and Distillation for Efficient Low Resource NLP (Sep 21, 2024). Tags: Document Classification, GPU.
- Generalization in birdsong classification: impact of transfer learning methods and dataset characteristics (Sep 21, 2024). Tags: Knowledge Distillation, Sound Classification.
- Fast Streaming Transducer ASR Prototyping via Knowledge Distillation with Whisper (Sep 20, 2024). Tags: Automatic Speech Recognition (ASR).
- Simple Unsupervised Knowledge Distillation With Space Similarity (Sep 20, 2024). Tags: Knowledge Distillation, Self-Supervised Learning.
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks (Sep 19, 2024). Tags: Knowledge Distillation.
- LLMR: Knowledge Distillation with a Large Language Model-Induced Reward (Sep 19, 2024). Tags: Dialogue Generation, Knowledge Distillation. Code available.
- Bayesian-Optimized One-Step Diffusion Model with Knowledge Distillation for Real-Time 3D Human Motion Prediction (Sep 19, 2024). Tags: Bayesian Optimization, Human Motion Prediction.
- Small Language Models are Equation Reasoners (Sep 19, 2024). Tags: Arithmetic Reasoning, Knowledge Distillation.
- Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models (Sep 19, 2024). Tags: Knowledge Distillation.
- Enhancing TinyBERT for Financial Sentiment Analysis Using GPT-Augmented FinBERT Distillation (Sep 19, 2024). Tags: Data Augmentation, Edge Computing.
- Enhancing SLM via ChatGPT and Dataset Augmentation (Sep 19, 2024). Tags: Knowledge Distillation, Natural Language Inference. Code available.
- Enhancing Knowledge Distillation of Large Language Models through Efficient Multi-Modal Distribution Alignment (Sep 19, 2024). Tags: Knowledge Distillation, Model Compression.
- Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights (Sep 19, 2024). Tags: Decision Making, Knowledge Distillation. Code available.
- Improving Cone-Beam CT Image Quality with Knowledge Distillation-Enhanced Diffusion Model in Imbalanced Data Settings (Sep 19, 2024). Tags: Computed Tomography (CT), Image Generation.
- StableMamba: Distillation-free Scaling of Large SSMs for Images and Videos (Sep 18, 2024). Tags: Action Recognition, Image Classification.
- Data Efficient Acoustic Scene Classification using Teacher-Informed Confusing Class Instruction (Sep 18, 2024). Tags: Acoustic Scene Classification, Data Augmentation.
- RUIE: Retrieval-based Unified Information Extraction using Large Language Model (Sep 18, 2024). Tags: Contrastive Learning, In-Context Learning.
- EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis (Sep 18, 2024). Tags: Knowledge Distillation, Medical Image Analysis. Code available.
- Applications of Knowledge Distillation in Remote Sensing: A Survey (Sep 18, 2024). Tags: Computational Efficiency, Instance Segmentation.