A "Network Pruning Network" Approach to Deep Model Compression Jan 15, 2020 Knowledge Distillation Model Compression
A New Method for Capturing Compositional Knowledge in Linguistic Space Dec 20, 2024 Image Retrieval Knowledge Distillation
An Extra RMSNorm is All You Need for Fine Tuning to 1.58 Bits May 12, 2025 All Knowledge Distillation
An Interpretable Neuron Embedding for Static Knowledge Distillation Nov 14, 2022 Knowledge Distillation
A Novel Algorithm for Personalized Federated Learning: Knowledge Distillation with Weighted Combination Loss Apr 6, 2025 Federated Learning Knowledge Distillation
A Novel Approach To Implementing Knowledge Distillation In Tsetlin Machines Apr 2, 2025 Knowledge Distillation Text Classification
A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation Feb 21, 2022 Knowledge Distillation Model Compression
A novel channel pruning method for deep neural network compression May 29, 2018 Channel Selection Combinatorial Optimization
A Novel Garment Transfer Method Supervised by Distilled Knowledge of Virtual Try-on Model Jan 23, 2024 Disentanglement Knowledge Distillation
A Novel Lightweight Transformer with Edge-Aware Fusion for Remote Sensing Image Captioning Jun 11, 2025 Decoder Image Captioning
A Novel Local-Global Feature Fusion Framework for Body-weight Exercise Recognition with Pressure Mapping Sensors Sep 14, 2023 Knowledge Distillation Object Detection
A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition Sep 3, 2022 Action Recognition Knowledge Distillation
A Novel Spike Transformer Network for Depth Estimation from Event Cameras via Cross-modality Knowledge Distillation Apr 26, 2024 Depth Estimation Knowledge Distillation
An Overview of Neural Network Compression Jun 5, 2020 Knowledge Distillation Model Compression
AntMan: Sparse Low-Rank Compression to Accelerate RNN inference Oct 2, 2019 Knowledge Distillation Low-rank Compression
An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition May 1, 2022 Cross-Lingual NER Knowledge Distillation
APALU: A Trainable, Adaptive Activation Function for Deep Learning Networks Feb 13, 2024 Anomaly Detection Deep Learning
A Peek Into the Reasoning of Neural Networks: Interpreting with Structural Visual Concepts May 1, 2021 Explainable Artificial Intelligence Knowledge Distillation
A Plasticity-Aware Method for Continual Self-Supervised Learning in Remote Sensing Mar 31, 2025 Continual Self-Supervised Learning Knowledge Distillation
Application of Knowledge Distillation to Multi-task Speech Representation Learning Oct 29, 2022 Keyword Spotting Knowledge Distillation
Application of Vision-Language Model to Pedestrians Behavior and Scene Understanding in Autonomous Driving Jan 12, 2025 Autonomous Driving Decision Making
Applications of Knowledge Distillation in Remote Sensing: A Survey Sep 18, 2024 Computational Efficiency Instance Segmentation
Applied Federated Model Personalisation in the Industrial Domain: A Comparative Study Sep 10, 2024 Active Learning Federated Learning
Apprenticeship-Inspired Elegance: Synergistic Knowledge Distillation Empowers Spiking Neural Networks for Efficient Single-Eye Emotion Recognition Jun 20, 2024 Emotion Recognition Knowledge Distillation
Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy Nov 15, 2017 Image Classification
Learning Voice Representation Using Knowledge Distillation for Automatic Voice Casting (originally published in French as "Apprentissage automatique de représentation de voix à l'aide d'une distillation de la connaissance pour le casting vocal") Jun 1, 2020 Knowledge Distillation
A Practical Survey on Faster and Lighter Transformers Mar 26, 2021 Knowledge Distillation Survey
A Progressive Framework of Vision-language Knowledge Distillation and Alignment for Multilingual Scene Apr 17, 2024 Image Classification
ARDIR: Improving Robustness using Knowledge Distillation of Internal Representation Nov 1, 2022 Knowledge Distillation
A Recipe for Efficient SBIR Models: Combining Relative Triplet Loss with Batch Normalization and Knowledge Distillation May 30, 2023 Data Augmentation Image Retrieval
A Review on Discriminative Self-supervised Learning Methods in Computer Vision May 8, 2024 Clustering Knowledge Distillation
Artificial Behavior Intelligence: Technology, Challenges, and Future Directions May 6, 2025 Autonomous Driving Emotion Recognition
A scalable convolutional neural network for task-specified scenarios via knowledge distillation Sep 19, 2016 Knowledge Distillation
A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models Nov 30, 2020 Knowledge Distillation Model Compression
A Short Study on Compressing Decoder-Based Language Models Oct 16, 2021 Decoder Knowledge Distillation
A Simple and Generic Framework for Feature Distillation via Channel-wise Transformation Mar 23, 2023 Image Classification
A Simple but Effective BERT Model for Dialog State Tracking on Resource-Limited Systems Oct 28, 2019 Dialogue State Tracking
SS-IL: Separated Softmax for Incremental Learning Mar 31, 2020 Class Incremental Learning
A Simple Linear Patch Revives Layer-Pruned Large Language Models May 30, 2025 Knowledge Distillation Question Answering
A Simple Recipe for Competitive Low-compute Self supervised Vision Models Jan 23, 2023 Knowledge Distillation
Asterisk*: Keep it Simple Nov 8, 2024 Classification Knowledge Distillation
A Study of Non-autoregressive Model for Sequence Generation Apr 22, 2020 Automatic Speech Recognition (ASR)
A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models May 26, 2023 Knowledge Distillation
A Study on the Efficiency and Generalization of Light Hybrid Retrievers Oct 4, 2022 Adversarial Attack Contrastive Learning
A Survey of Methods for Low-Power Deep Learning and Computer Vision Mar 24, 2020 Knowledge Distillation Quantization
A Survey of Model Compression and Acceleration for Deep Neural Networks Oct 23, 2017 Benchmarking Knowledge Distillation
A Survey of Techniques for Optimizing Transformer Inference Jul 16, 2023 Knowledge Distillation Neural Architecture Search
A Survey on Deep Neural Network Compression: Challenges, Overview, and Solutions Oct 5, 2020 Knowledge Distillation Miscellaneous
A survey on efficient vision transformers: algorithms, techniques, and performance benchmarking Sep 5, 2023 Benchmarking Knowledge Distillation
A Survey on Green Deep Learning Nov 8, 2021 Deep Learning Knowledge Distillation