IL-NeRF: Incremental Learning for Neural Radiance Fields with Camera Pose Alignment (Dec 10, 2023) · Incremental Learning, Knowledge Distillation
Densely Distilling Cumulative Knowledge for Continual Learning (May 16, 2024) · Continual Learning
A Survey on Transformer Compression (Feb 5, 2024) · Knowledge Distillation, Mamba
Image-to-Video Re-Identification via Mutual Discriminative Knowledge Transfer (Jan 21, 2022) · Knowledge Distillation, Transfer Learning
Attention-based Knowledge Distillation in Multi-attention Tasks: The Impact of a DCT-driven Loss (May 4, 2022) · Descriptive, Knowledge Distillation
Compact CNN Models for On-device Ocular-based User Recognition in Mobile Devices (Oct 11, 2021) · Knowledge Distillation, Network Pruning
Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing (Feb 24, 2025) · Cross-Lingual Transfer, Dependency Parsing
Impossible Triangle: What's Next for Pre-trained Language Models? (Apr 13, 2022) · Data Augmentation, Few-Shot Learning
Inter-layer Knowledge Distillation for Neural Machine Translation (Aug 1, 2021) · Knowledge Distillation, Machine Translation
Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation (Oct 1, 2024) · Knowledge Distillation, Machine Translation
A Survey on Symbolic Knowledge Distillation of Large Language Models (Jul 12, 2024) · Knowledge Distillation, Survey
Improved Customer Transaction Classification using Semi-Supervised Knowledge Distillation (Feb 15, 2021) · Classification, General Classification
A Flexible Multi-Task Model for BERT Serving (Nov 16, 2021) · Knowledge Distillation, model
Improved implicit diffusion model with knowledge distillation to estimate the spatial distribution density of carbon stock in remote sensing imagery (Nov 27, 2024) · Knowledge Distillation
Improved knowledge distillation by utilizing backward pass knowledge in neural networks (Jan 27, 2023) · Knowledge Distillation, Model Compression
Designing Parameter and Compute Efficient Diffusion Transformers using Distillation (Feb 20, 2025) · Knowledge Distillation, NVIDIA Jetson Orin Nano
Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection (Feb 1, 2023) · Knowledge Distillation
Improved Knowledge Distillation via Adversarial Collaboration (Nov 29, 2021) · Knowledge Distillation
Joint Architecture and Knowledge Distillation in CNN for Chinese Text Recognition (Dec 17, 2019) · Handwritten Chinese Text Recognition, Knowledge Distillation
Efficient speech detection in environmental audio using acoustic recognition and knowledge distillation (Dec 14, 2023) · Knowledge Distillation, Model Selection
A Survey on Recent Teacher-student Learning Studies (Apr 10, 2023) · Knowledge Distillation, Survey
Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation (Dec 17, 2024) · Edge-computing, Knowledge Distillation
Batch Selection and Communication for Active Learning with Edge Labeling (Nov 14, 2023) · Active Learning, Knowledge Distillation
Improve Knowledge Distillation via Label Revision and Data Selection (Apr 3, 2024) · Knowledge Distillation, Model Compression
Active Large Language Model-based Knowledge Distillation for Session-based Recommendation (Dec 15, 2024) · Active Learning, Knowledge Distillation
Improving Acoustic Scene Classification in Low-Resource Conditions (Dec 30, 2024) · Acoustic Scene Classification, Classification
Efficient Point Cloud Classification via Offline Distillation Framework and Negative-Weight Self-Distillation Technique (Sep 3, 2024) · Data Augmentation, Knowledge Distillation
Improving Apple Object Detection with Occlusion-Enhanced Distillation (Sep 3, 2024) · Knowledge Distillation, Object
Improving Autoregressive NMT with Non-Autoregressive Model (Jul 1, 2020) · Decoder, de-en
Improving CLIP Robustness with Knowledge Distillation and Self-Training (Sep 19, 2023) · Knowledge Distillation
Efficient Open-world Reinforcement Learning via Knowledge Distillation and Autonomous Rule Discovery (Nov 24, 2023) · Deep Reinforcement Learning, Knowledge Distillation
ComKD-CLIP: Comprehensive Knowledge Distillation for Contrastive Language-Image Pre-traning Model (Aug 8, 2024) · Contrastive Learning, Knowledge Distillation
Improving Conversational Abilities of Quantized Large Language Models via Direct Preference Alignment (Jul 3, 2024) · Chatbot, Computational Efficiency
DFMSD: Dual Feature Masking Stage-wise Knowledge Distillation for Object Detection (Jul 18, 2024) · Knowledge Distillation, Object
Improving Defensive Distillation using Teacher Assistant (May 14, 2023) · Face Recognition, Knowledge Distillation
Improving De-Raining Generalization via Neural Reorganization (Jan 1, 2021) · Knowledge Distillation
Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation (Oct 28, 2023) · Knowledge Distillation, Object
CoMBO: Conflict Mitigation via Branched Optimization for Class Incremental Segmentation (Jan 1, 2025) · Knowledge Distillation, Semantic Segmentation
A Survey on Model Compression for Large Language Models (Aug 15, 2023) · Benchmarking, Knowledge Distillation
Improving Facial Landmark Detection Accuracy and Efficiency with Knowledge Distillation (Apr 9, 2024) · Emotion Recognition, Facial Landmark Detection
Improving Feature Generalizability with Multitask Learning in Class Incremental Learning (Apr 26, 2022) · Class Incremental Learning
Improving Frame-level Classifier for Word Timings with Non-peaky CTC in End-to-End Automatic Speech Recognition (Jun 9, 2023) · Automatic Speech Recognition (ASR)
Efficient Machine Translation with Model Pruning and Quantization (Nov 1, 2021) · CPU, Decoder
Noise as a Resource for Learning in Knowledge Distillation (Oct 11, 2019) · Knowledge Distillation
Improving Generalization of Pre-trained Language Models via Stochastic Weight Averaging (Dec 12, 2022) · Knowledge Distillation, Question Answering
Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning (Aug 26, 2023) · Knowledge Distillation, Model Compression
Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation (Nov 1, 2021) · Dialogue Generation, Knowledge Distillation
Combining Compressions for Multiplicative Size Scaling on Natural Language Tasks (Aug 20, 2022) · Knowledge Distillation, Neural Network Compression
ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression (May 26, 2023) · Knowledge Distillation
JEP-KD: Joint-Embedding Predictive Architecture Based Knowledge Distillation for Visual Speech Recognition (Mar 4, 2024) · Automatic Speech Recognition (ASR)