Unlocking Real-Time Fluorescence Lifetime Imaging: Multi-Pixel Parallelism for FPGA-Accelerated Processing · Oct 9, 2024 · Knowledge Distillation, Scheduling
Unlock the Power: Competitive Distillation for Multi-Modal Large Language Models · Nov 14, 2023 · Knowledge Distillation, Transfer Learning
Unpaired Learning for Deep Image Deraining With Rain Direction Regularizer · Jan 1, 2021 · Knowledge Distillation, Rain Removal
Unraveling Key Factors of Knowledge Distillation · Dec 14, 2023 · Knowledge Distillation, Machine Translation
Unseen Object Instance Segmentation with Fully Test-time RGB-D Embeddings Adaptation · Apr 21, 2022 · Instance Segmentation, Knowledge Distillation
Unsupervised 3D Perception with 2D Vision-Language Distillation for Autonomous Driving · Sep 25, 2023 · Autonomous Driving, Knowledge Distillation
Unsupervised Deep Digital Staining For Microscopic Cell Images Via Knowledge Distillation · Mar 3, 2023 · Colorization, Knowledge Distillation
Unsupervised Domain Adaptation for Segmentation with Black-box Source Model · Aug 16, 2022 · Domain Adaptation, Knowledge Distillation
Unsupervised Learning of Neural Networks to Explain Neural Networks (extended abstract) · Jan 21, 2019 · Knowledge Distillation, Object
Unsupervised Representation Transfer for Small Networks: I Believe I Can Distill On-the-Fly · Dec 1, 2021 · Knowledge Distillation, Linear Evaluation
Unsupervised Text Style Transfer via LLMs and Attention Masking with Multi-way Interactions · Feb 21, 2024 · In-Context Learning, Knowledge Distillation
Unveiling Context-Aware Criteria in Self-Assessing LLMs · Oct 28, 2024 · Knowledge Distillation
Unveiling Incomplete Modality Brain Tumor Segmentation: Leveraging Masked Predicted Auto-Encoder and Divergence Learning · Jun 12, 2024 · Brain Tumor Segmentation, Knowledge Distillation
Unveiling the Unseen Potential of Graph Learning through MLPs: Effective Graph Learners Using Propagation-Embracing MLPs · Nov 20, 2023 · Graph Learning, Graph Neural Network
Uplifting Range-View-based 3D Semantic Segmentation in Real-Time with Multi-Sensor Fusion · Jul 12, 2024 · 3D Semantic Segmentation, Autonomous Driving
Using Advanced LLMs to Enhance Smaller LLMs: An Interpretable Knowledge Distillation Approach · Aug 13, 2024 · Knowledge Distillation
Using a GAN to Generate Adversarial Examples to Facial Image Recognition · Nov 30, 2021 · Face Recognition, Generative Adversarial Network
Using Explainable Boosting Machine to Compare Idiographic and Nomothetic Approaches for Ecological Momentary Assessment Data · Apr 4, 2022 · Interpretable Machine Learning, Knowledge Distillation
Using Knowledge Distillation to Improve Interpretable Models in a Retail Banking Context · Sep 30, 2022 · Data Augmentation, Knowledge Distillation
Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation · Jul 29, 2021 · Knowledge Distillation, Machine Translation
Using the Past Knowledge to Improve Sentiment Classification · Nov 1, 2020 · Classification, Knowledge Distillation
V2X-VLM: End-to-End V2X Cooperative Autonomous Driving Through Large Vision-Language Models · Aug 17, 2024 · Autonomous Driving, Contrastive Learning
Vanilla Feature Distillation for Improving the Accuracy-Robustness Trade-Off in Adversarial Training · Jun 5, 2022 · Knowledge Distillation
Variational Information Distillation for Knowledge Transfer · Apr 11, 2019 · Knowledge Distillation, Transfer Learning
Variational Knowledge Distillation for Disease Classification in Chest X-Rays · Mar 19, 2021 · Classification, General Classification
Variational Student: Learning Compact and Sparser Networks in Knowledge Distillation Framework · Oct 26, 2019 · Knowledge Distillation, Variational Inference
VEM^2L: A Plug-and-play Framework for Fusing Text and Structure Knowledge on Sparse Knowledge Graph Completion · Jul 4, 2022 · Knowledge Distillation, Knowledge Graph Completion
Vernacular? I Barely Know Her: Challenges with Style Control and Stereotyping · Jun 18, 2024 · Knowledge Distillation
VIC-KD: Variance-Invariance-Covariance Knowledge Distillation to Make Keyword Spotting More Robust Against Adversarial Attacks · Sep 22, 2023 · Adversarial Robustness, Keyword Spotting
VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning · Sep 27, 2023 · Knowledge Distillation, Regression
Vi-LAD: Vision-Language Attention Distillation for Socially-Aware Robot Navigation in Dynamic Environments · Mar 12, 2025 · Knowledge Distillation, Motion Planning
Vision-Based Detection of Uncooperative Targets and Components on Small Satellites · Aug 22, 2024 · Knowledge Distillation
Vision Foundation Models in Medical Image Analysis: Advances and Challenges · Feb 20, 2025 · Domain Adaptation, Federated Learning
Vision-Language Models for Edge Networks: A Comprehensive Survey · Feb 11, 2025 · Autonomous Vehicles, Image Captioning
Visualizing the Embedding Space to Explain the Effect of Knowledge Distillation · Oct 9, 2021 · Knowledge Distillation
Visualizing the Emergence of Intermediate Visual Patterns in DNNs · Nov 5, 2021 · Knowledge Distillation
Visual-Language Model Knowledge Distillation Method for Image Quality Assessment · Jul 21, 2025 · Image Quality Assessment, Knowledge Distillation
Visual-Policy Learning through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks · Mar 13, 2023 · Data Augmentation, Knowledge Distillation
Visual Relationship Detection Based on Guided Proposals and Semantic Knowledge Distillation · May 28, 2018 · Common Sense Reasoning, Knowledge Distillation
Visual Relationship Detection with Internal and External Linguistic Knowledge Distillation · Jul 28, 2017 · Knowledge Distillation, Relationship Detection
ViTKD: Practical Guidelines for ViT Feature Knowledge Distillation · Sep 6, 2022 · Image Classification, Knowledge Distillation
VL2Lite: Task-Specific Knowledge Distillation from Large Vision-Language Models to Lightweight Networks · Jan 1, 2025 · Classification, Image Classification
VLM-Assisted Continual Learning for Visual Question Answering in Self-Driving · Feb 2, 2025 · Autonomous Driving, Continual Learning
VLM-KD: Knowledge Distillation from VLM for Long-Tail Visual Recognition · Aug 29, 2024 · Knowledge Distillation, Language Modeling
VPBSD: Vessel-Pattern-Based Semi-Supervised Distillation for Efficient 3D Microscopic Cerebrovascular Segmentation · Nov 14, 2024 · Brain Segmentation, Knowledge Distillation
Wakening Past Concepts without Past Data: Class-incremental Learning from Placebos · Sep 29, 2021 · Class-Incremental Learning
Wakening Past Concepts without Past Data: Class-Incremental Learning from Online Placebos · Oct 24, 2023 · Class-Incremental Learning
Wake Vision: A Tailored Dataset and Benchmark Suite for TinyML Computer Vision Applications · May 1, 2024 · Human Detection, Knowledge Distillation
Walsh-domain Neural Network for Power Amplifier Behavioral Modelling and Digital Predistortion · Feb 15, 2024 · Knowledge Distillation
Wasserstein Contrastive Representation Distillation · Dec 15, 2020 · Contrastive Learning, Knowledge Distillation
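Nearly every entry above carries the Knowledge Distillation tag; the common objective most of these works build on (or depart from) is the temperature-scaled distillation loss of Hinton et al. (2015), in which a student matches the teacher's softened output distribution. A minimal stdlib-only sketch, with function names that are illustrative rather than taken from any listed paper:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2, as in Hinton et al. (2015), so that gradient
    magnitudes stay comparable across temperatures. In practice this
    term is combined with a standard cross-entropy loss on the hard
    labels via a weighting coefficient.
    """
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl
```

The loss is zero when student and teacher logits induce the same softened distribution, and positive otherwise; higher temperatures expose more of the teacher's "dark knowledge" in the non-argmax classes.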