- Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence (Mar 9, 2025). Tags: Decision Making, Knowledge Distillation
- Integrated Multi-Level Knowledge Distillation for Enhanced Speaker Verification (Sep 14, 2024). Tags: Knowledge Distillation, Speaker Verification
- HoverFast: an accurate, high-throughput, clinically deployable nuclear segmentation tool for brightfield digital pathology images (May 22, 2024). Tags: GPU, Knowledge Distillation
- Integrating Arithmetic Learning Improves Mathematical Reasoning in Smaller Models (Feb 18, 2025). Tags: Data Augmentation, GSM8K
- EfficientViT-SAM: Accelerated Segment Anything Model Without Accuracy Loss (Feb 7, 2024). Tags: Decoder, GPU
- How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation? (May 27, 2021). Tags: Diversity, Knowledge Distillation
- Deep Neural Network Models Compression (Mar 4, 2021). Tags: Knowledge Distillation, Quantization
- How many Observations are Enough? Knowledge Distillation for Trajectory Forecasting (Mar 9, 2022). Tags: Knowledge Distillation, Trajectory Forecasting
- Compact Speaker Embedding: lrx-vector (Aug 11, 2020). Tags: Knowledge Distillation, Speaker Recognition
- How to Backdoor the Knowledge Distillation (Apr 30, 2025). Tags: Knowledge Distillation
- Efficient Video Segmentation Models with Per-frame Inference (Feb 24, 2022). Tags: Image Matting, Instance Segmentation
- How to Prune Your Language Model: Recovering Accuracy on the "Sparsity May Cry" Benchmark (Dec 21, 2023). Tags: Knowledge Distillation, Language Modeling
- How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding (Nov 1, 2021). Tags: Adversarial Robustness, All
- Efficient Verified Machine Unlearning For Distillation (Mar 28, 2025). Tags: Knowledge Distillation, Machine Unlearning
- Discovery of novel antimicrobial peptides with notable antibacterial potency by a LLM-based foundation model (Jul 17, 2024). Tags: Knowledge Distillation, scientific discovery
- Amortized Noisy Channel Neural Machine Translation (Dec 16, 2021). Tags: Imitation Learning, Knowledge Distillation
- Integrating ChatGPT into Secure Hospital Networks: A Case Study on Improving Radiology Report Analysis (Feb 14, 2024). Tags: Contrastive Learning, Knowledge Distillation
- HRPose: Real-Time High-Resolution 6D Pose Estimation Network Using Knowledge Distillation (Apr 20, 2022). Tags: 6D Pose Estimation, 6D Pose Estimation using RGB
- Efficient Transformer Knowledge Distillation: A Performance Review (Nov 22, 2023). Tags: Knowledge Distillation, Model Compression
- Human-Centered Prior-Guided and Task-Dependent Multi-Task Representation Learning for Action Recognition Pre-Training (Apr 27, 2022). Tags: Action Recognition, Contrastive Learning
- Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning (Sep 17, 2020). Tags: Edge-computing, Knowledge Distillation
- Compacting Deep Neural Networks for Internet of Things: Methods and Applications (Mar 20, 2021). Tags: Diversity, Knowledge Distillation
- Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models (Jul 14, 2022). Tags: Automatic Speech Recognition, Automatic Speech Recognition (ASR)
- Human in the Latent Loop (HILL): Interactively Guiding Model Training Through Human Intuition (May 9, 2025). Tags: Knowledge Distillation
- Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation (Aug 26, 2021). Tags: Density Estimation, Knowledge Distillation
- Compact CNN Structure Learning by Knowledge Distillation (Apr 19, 2021). Tags: Knowledge Distillation, Model Compression
- HW-TSC's Participation in the WMT 2020 News Translation Shared Task (Nov 1, 2020). Tags: Knowledge Distillation, Translation
- HW-TSC's Participation in the WMT 2021 Large-Scale Multilingual Translation Task (Nov 1, 2021). Tags: Knowledge Distillation, Translation
- A Survey on Transformer Compression (Feb 5, 2024). Tags: Knowledge Distillation, Mamba
- Compact CNN Models for On-device Ocular-based User Recognition in Mobile Devices (Oct 11, 2021). Tags: Knowledge Distillation, Network Pruning
- Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation (Oct 1, 2024). Tags: Knowledge Distillation, Machine Translation
- Hybrid Paradigm-based Brain-Computer Interface for Robotic Arm Control (Dec 14, 2022). Tags: Brain Computer Interface, EEG
- HYDRA-FL: Hybrid Knowledge Distillation for Robust and Accurate Federated Learning (Sep 30, 2024). Tags: Federated Learning, Knowledge Distillation
- A Survey on Symbolic Knowledge Distillation of Large Language Models (Jul 12, 2024). Tags: Knowledge Distillation, Survey
- A Flexible Multi-Task Model for BERT Serving (Nov 16, 2021). Tags: Knowledge Distillation, model
- In Teacher We Trust: Learning Compressed Models for Pedestrian Detection (Dec 1, 2016). Tags: Knowledge Distillation, Pedestrian Detection
- Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding (Apr 15, 2021). Tags: intent-classification, Intent Classification
- Efficient speech detection in environmental audio using acoustic recognition and knowledge distillation (Dec 14, 2023). Tags: Knowledge Distillation, Model Selection
- I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation (Mar 27, 2024). Tags: Knowledge Distillation, Segmentation
- I2D2: Inductive Knowledge Distillation with NeuroLogic and Self-Imitation (Dec 19, 2022). Tags: Imitation Learning, Knowledge Distillation
- I^2KD-SLU: An Intra-Inter Knowledge Distillation Framework for Zero-Shot Cross-Lingual Spoken Language Understanding (Oct 4, 2023). Tags: Intent Detection, Knowledge Distillation
- A Survey on Recent Teacher-student Learning Studies (Apr 10, 2023). Tags: Knowledge Distillation, Survey
- IAG: Induction-Augmented Generation Framework for Answering Reasoning Questions (Nov 30, 2023). Tags: Knowledge Distillation, RAG
- ICD-Face: Intra-class Compactness Distillation for Face Recognition (Jan 1, 2023). Tags: Face Recognition, Knowledge Distillation
- Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation (Dec 17, 2024). Tags: Edge-computing, Knowledge Distillation
- Batch Selection and Communication for Active Learning with Edge Labeling (Nov 14, 2023). Tags: Active Learning, Knowledge Distillation
- Cross-resolution Face Recognition via Identity-Preserving Network and Knowledge Distillation (Mar 15, 2023). Tags: Face Recognition, Knowledge Distillation
- If At First You Don't Succeed: Test Time Re-ranking for Zero-shot, Cross-domain Retrieval (Mar 30, 2023). Tags: Image Retrieval, Knowledge Distillation
- Active Large Language Model-based Knowledge Distillation for Session-based Recommendation (Dec 15, 2024). Tags: Active Learning, Knowledge Distillation
- Efficient Point Cloud Classification via Offline Distillation Framework and Negative-Weight Self-Distillation Technique (Sep 3, 2024). Tags: Data Augmentation, Knowledge Distillation