Data Techniques For Online End-to-end Speech Recognition (Jan 24, 2020): Data Augmentation, Domain Adaptation
Beyond Task Vectors: Selective Task Arithmetic Based on Importance Metrics (Nov 25, 2024): Knowledge Distillation, Multi-Task Learning
Mining Data Impressions from Deep Models as Substitute for the Unavailable Training Data (Jan 15, 2021): Adversarial Robustness, Continual Learning
A Closer Look at Deep Learning Heuristics: Learning rate restarts, Warmup and Distillation (Oct 29, 2018): Dimensionality Reduction, Knowledge Distillation
Enhancing Modality-Agnostic Representations via Meta-Learning for Brain Tumor Segmentation (Feb 8, 2023): Brain Tumor Segmentation, Image Generation
Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation (Sep 30, 2024): Data Augmentation, Knowledge Distillation
Data-Free Knowledge Transfer: A Survey (Dec 31, 2021): Data-Free Knowledge Distillation, Domain Adaptation
Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis (Apr 10, 2021): Data-Free Knowledge Distillation, Knowledge Distillation
Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images (Oct 20, 2023): Data Augmentation, Data-Free Knowledge Distillation
Adaptive Knowledge Distillation for Classification of Hand Images using Explainable Vision Transformers (Aug 20, 2024): Knowledge Distillation
A Classifier-Free Incremental Learning Framework for Scalable Medical Image Segmentation (May 25, 2024): Contrastive Learning, Image Segmentation
All You Need in Knowledge Distillation Is a Tailored Coordinate System (Dec 12, 2024): Few-Shot Learning
Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging (May 15, 2025): Continual Learning, Diagnostic
Beyond Classification: Knowledge Distillation using Multi-Object Impressions (Oct 27, 2021): Classification, Knowledge Distillation
Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models (Jun 21, 2025): Dimensionality Reduction, Keyword Spotting
Enhancing Generalization in Chain of Thought Reasoning for Smaller Models (Jan 16, 2025): Knowledge Distillation, Memorization
Alleviating LLM-based Generative Retrieval Hallucination in Alipay Search (Mar 27, 2025): Hallucination, Knowledge Distillation
Adaptive Knowledge Distillation between Text and Speech Pre-trained Models (Mar 7, 2023): Knowledge Distillation, Spoken Language Understanding
Data-Free Federated Class Incremental Learning with Diffusion-Based Generative Memory (May 22, 2024): Class Incremental Learning
Data-free Distillation with Degradation-prompt Diffusion for Multi-weather Image Restoration (Sep 5, 2024): Image Restoration, Knowledge Distillation
Enhancing CTC-Based Visual Speech Recognition (Sep 11, 2024): Automatic Speech Recognition (ASR)
Data-Free Distillation of Language Model by Text-to-Text Transfer (Nov 3, 2023): Data-Free Knowledge Distillation, Diversity
Dense Depth Distillation with Out-of-Distribution Simulated Images (Aug 26, 2022): Data-Free Knowledge Distillation, Depth Estimation
Data-Free Adversarial Knowledge Distillation for Graph Neural Networks (May 8, 2022): Generative Adversarial Network, Graph Classification
Alleviating Catastrophic Forgetting of Incremental Object Detection via Within-Class and Between-Class Knowledge Distillation (Jan 1, 2023): Knowledge Distillation, Object Detection
Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation (Feb 23, 2021): Knowledge Distillation
Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning (Jan 19, 2024): GPU, Knowledge Distillation
Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation (Apr 28, 2024): Action Recognition, General Knowledge
Accurate Knowledge Distillation with n-best Reranking (May 20, 2023): Knowledge Distillation, Reranking
Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression (Mar 11, 2024): Backdoor Attack, Image Compression
Data-Efficient Ranking Distillation for Image Retrieval (Jul 10, 2020): Image Retrieval, Knowledge Distillation
Adaptive Instance Distillation for Object Detection in Autonomous Driving (Jan 26, 2022): Autonomous Driving, Knowledge Distillation
Data-efficient Event Camera Pre-training via Disentangled Masked Modeling (Mar 1, 2024): Knowledge Distillation, Self-Supervised Learning
Data Efficient Acoustic Scene Classification using Teacher-Informed Confusing Class Instruction (Sep 18, 2024): Acoustic Scene Classification, Data Augmentation
Better Knowledge Enhancement for Privacy-Preserving Cross-Project Defect Prediction (Dec 23, 2024): Federated Learning, Knowledge Distillation
Data-Driven Compression of Convolutional Neural Networks (Nov 28, 2019): Knowledge Distillation, Model Compression
Alignment Knowledge Distillation for Online Streaming Attention-based Speech Recognition (Feb 28, 2021): Automatic Speech Recognition (ASR)
Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation (Nov 1, 2022): Knowledge Distillation, Multi-Label Text Classification
DASECount: Domain-Agnostic Sample-Efficient Wireless Indoor Crowd Counting via Few-shot Learning (Nov 18, 2022): Crowd Counting, Few-Shot Learning
BeSound: Bluetooth-Based Position Estimation Enhancing with Cross-Modality Distillation (Apr 24, 2024): Knowledge Distillation, Position
Adaptive Group Robust Ensemble Knowledge Distillation (Nov 22, 2024): Knowledge Distillation
BERT Learns to Teach: Knowledge Distillation with Meta Learning (Aug 17, 2021): Knowledge Distillation, Meta-Learning
DAKD: Data Augmentation and Knowledge Distillation using Diffusion Models for SAR Oil Spill Segmentation (Dec 11, 2024): Data Augmentation, Knowledge Distillation
DaFKD: Domain-Aware Federated Knowledge Distillation (Jan 1, 2023): Knowledge Distillation
BERM: Training the Balanced and Extractable Representation for Matching to Improve Generalization Ability of Dense Retrieval (May 18, 2023): Information Retrieval, Knowledge Distillation
Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization (Jun 29, 2024): Knowledge Distillation
Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation (Dec 8, 2024): Image Quality Assessment, Knowledge Distillation
Energy-efficient Knowledge Distillation for Spiking Neural Networks (Jun 14, 2021): Knowledge Distillation, Model Compression
StyleRF-VolVis: Style Transfer of Neural Radiance Fields for Expressive Volume Visualization (Jul 31, 2024): Knowledge Distillation, NeRF
Enhanced Multimodal Representation Learning with Cross-modal KD (Jun 13, 2023): Contrastive Learning, Emotion Classification