- Variational Knowledge Distillation for Disease Classification in Chest X-Rays (Mar 19, 2021). Tags: Classification, General Classification
- Variational Student: Learning Compact and Sparser Networks in Knowledge Distillation Framework (Oct 26, 2019). Tags: Knowledge Distillation, Variational Inference
- VEM^2L: A Plug-and-play Framework for Fusing Text and Structure Knowledge on Sparse Knowledge Graph Completion (Jul 4, 2022). Tags: Knowledge Distillation, Knowledge Graph Completion
- Vernacular? I Barely Know Her: Challenges with Style Control and Stereotyping (Jun 18, 2024). Tags: Knowledge Distillation
- VIC-KD: Variance-Invariance-Covariance Knowledge Distillation to Make Keyword Spotting More Robust Against Adversarial Attacks (Sep 22, 2023). Tags: Adversarial Robustness, Keyword Spotting
- VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning (Sep 27, 2023). Tags: Knowledge Distillation, Regression
- Vi-LAD: Vision-Language Attention Distillation for Socially-Aware Robot Navigation in Dynamic Environments (Mar 12, 2025). Tags: Knowledge Distillation, Motion Planning
- Vision-Based Detection of Uncooperative Targets and Components on Small Satellites (Aug 22, 2024). Tags: Knowledge Distillation
- Vision Foundation Models in Medical Image Analysis: Advances and Challenges (Feb 20, 2025). Tags: Domain Adaptation, Federated Learning
- Vision-Language Models for Edge Networks: A Comprehensive Survey (Feb 11, 2025). Tags: Autonomous Vehicles, Image Captioning
- Visualizing the embedding space to explain the effect of knowledge distillation (Oct 9, 2021). Tags: Knowledge Distillation
- Visualizing the Emergence of Intermediate Visual Patterns in DNNs (Nov 5, 2021). Tags: Knowledge Distillation
- Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (Jul 21, 2025). Tags: Image Quality Assessment, Knowledge Distillation
- Visual-Policy Learning through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks (Mar 13, 2023). Tags: Data Augmentation, Knowledge Distillation
- Visual Relationship Detection Based on Guided Proposals and Semantic Knowledge Distillation (May 28, 2018). Tags: Common Sense Reasoning, Knowledge Distillation
- Visual Relationship Detection with Internal and External Linguistic Knowledge Distillation (Jul 28, 2017). Tags: Knowledge Distillation, Relationship Detection
- ViTKD: Practical Guidelines for ViT feature knowledge distillation (Sep 6, 2022). Tags: Image Classification, Knowledge Distillation
- VL2Lite: Task-Specific Knowledge Distillation from Large Vision-Language Models to Lightweight Networks (Jan 1, 2025). Tags: Classification, Image Classification
- VLM-Assisted Continual Learning for Visual Question Answering in Self-Driving (Feb 2, 2025). Tags: Autonomous Driving, Continual Learning
- VLM-KD: Knowledge Distillation from VLM for Long-Tail Visual Recognition (Aug 29, 2024). Tags: Knowledge Distillation, Language Modeling
- VPBSD: Vessel-Pattern-Based Semi-Supervised Distillation for Efficient 3D Microscopic Cerebrovascular Segmentation (Nov 14, 2024). Tags: Brain Segmentation, Knowledge Distillation
- Wakening Past Concepts without Past Data: Class-incremental Learning from Placebos (Sep 29, 2021). Tags: Class-Incremental Learning
- Wakening Past Concepts without Past Data: Class-Incremental Learning from Online Placebos (Oct 24, 2023). Tags: Class-Incremental Learning
- Wake Vision: A Tailored Dataset and Benchmark Suite for TinyML Computer Vision Applications (May 1, 2024). Tags: Human Detection, Knowledge Distillation
- Walsh-domain Neural Network for Power Amplifier Behavioral Modelling and Digital Predistortion (Feb 15, 2024). Tags: Knowledge Distillation
- Wasserstein Contrastive Representation Distillation (Dec 15, 2020). Tags: Contrastive Learning, Knowledge Distillation
- Wavelet Knowledge Distillation: Towards Efficient Image-to-Image Translation (Mar 12, 2022). Tags: Image-to-Image Translation, Knowledge Distillation
- WAVE: Weight Template for Adaptive Initialization of Variable-sized Models (Jun 25, 2024). Tags: Knowledge Distillation, Transfer Learning
- Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation (Nov 1, 2019). Tags: Classification, Cross-Lingual Transfer
- Weakly Supervised Dense Video Captioning via Jointly Usage of Knowledge Distillation and Cross-modal Matching (May 18, 2021). Tags: Caption Generation, Cross-Modal Retrieval
- Weakly-Supervised Domain Adaptation of Deep Regression Trackers via Reinforced Knowledge Distillation (Mar 26, 2021). Tags: Domain Adaptation, Knowledge Distillation
- Weakly-supervised HOI Detection via Prior-guided Bi-level Representation Learning (Mar 2, 2023). Tags: Human-Object Interaction Detection, Knowledge Distillation
- Weakly Supervised Monocular 3D Detection with a Single-View Image (Feb 29, 2024). Tags: Knowledge Distillation, Object Localization
- Weakly Supervised Semantic Segmentation via Alternative Self-Dual Teaching (Dec 17, 2021). Tags: Knowledge Distillation, Semantic Segmentation
- Weak-to-Strong Backdoor Attack for Large Language Models (Sep 26, 2024). Tags: Backdoor Attack, Knowledge Distillation
- Wearable Accelerometer Foundation Models for Health via Knowledge Distillation (Dec 15, 2024). Tags: Activity Recognition, Cross-Modal Alignment
- WebChild 2.0: Fine-Grained Commonsense Knowledge Distillation (Jul 1, 2017). Tags: Knowledge Distillation, Semantic Parsing
- Web Content Filtering through Knowledge Distillation of Large Language Models (May 8, 2023). Tags: Knowledge Distillation
- WebUOT-1M: Advancing Deep Underwater Object Tracking with A Million-Scale Benchmark (May 30, 2024). Tags: Knowledge Distillation, Object Tracking
- WeChat Neural Machine Translation Systems for WMT20 (Oct 1, 2020). Tags: Knowledge Distillation, Machine Translation
- WeChat Neural Machine Translation Systems for WMT21 (Aug 5, 2021). Tags: Knowledge Distillation, Machine Translation
- WeClick: Weakly-Supervised Video Semantic Segmentation with Click Annotations (Jul 7, 2021). Tags: Knowledge Distillation, Model Compression
- Weight Averaging: A Simple Yet Effective Method to Overcome Catastrophic Forgetting in Automatic Speech Recognition (Oct 27, 2022). Tags: Automatic Speech Recognition (ASR)
- Weight Decay Scheduling and Knowledge Distillation for Active Learning (Aug 1, 2020). Tags: Active Learning, Knowledge Distillation
- Weight Distillation: Transferring the Knowledge in Neural Network Parameters (Sep 19, 2020). Tags: Knowledge Distillation, Machine Translation
- Weighted KL-Divergence for Document Ranking Model Refinement (Jun 10, 2024). Tags: Contrastive Learning, Document Ranking
- Weight Squeezing: Reparameterization for Compression and Fast Inference (May 30, 2020). Tags: Knowledge Distillation, Model Compression
- Robustness Challenges in Model Distillation and Pruning for Natural Language Understanding (Oct 16, 2021). Tags: Knowledge Distillation, Model Compression
- What do larger image classifiers memorise? (Oct 9, 2023). Tags: Image Classification
- What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models (Apr 6, 2024). Tags: Knowledge Distillation, Language Modeling