Fair Feature Importance Scores for Interpreting Tree-Based Methods and Surrogates (Oct 6, 2023). Tags: Fairness, Feature Importance. Code: unverified.
LumiNet: The Bright Side of Perceptual Knowledge Distillation (Oct 5, 2023). Tags: Classification, Knowledge Distillation. Code: available.
DED: Diagnostic Evidence Distillation for acne severity grading on face images (Oct 5, 2023). Tags: Acne Severity Grading, Diagnostic. Code: available.
Improving Knowledge Distillation with Teacher's Explanation (Oct 4, 2023). Tags: Knowledge Distillation. Code: unverified.
I^2KD-SLU: An Intra-Inter Knowledge Distillation Framework for Zero-Shot Cross-Lingual Spoken Language Understanding (Oct 4, 2023). Tags: Intent Detection, Knowledge Distillation. Code: unverified.
Heterogeneous Federated Learning Using Knowledge Codistillation (Oct 4, 2023). Tags: Federated Learning, Image Classification. Code: unverified.
Talking Models: Distill Pre-trained Knowledge to Downstream Models via Interactive Communication (Oct 4, 2023). Tags: Decoder, Knowledge Distillation. Code: unverified.
SEA: Sparse Linear Attention with Estimated Attention Mask (Oct 3, 2023). Tags: Knowledge Distillation, Language Modeling. Code: available.
Can a student Large Language Model perform as well as it's teacher? (Oct 3, 2023). Tags: Knowledge Distillation, Language Modeling. Code: unverified.
Learnable Cross-modal Knowledge Distillation for Multi-modal Learning with Missing Modality (Oct 2, 2023). Tags: Knowledge Distillation. Code: unverified.
KGEx: Explaining Knowledge Graph Embeddings via Subgraph Sampling and Knowledge Distillation (Oct 2, 2023). Tags: Knowledge Distillation, Knowledge Graph Embeddings. Code: unverified.
Towards LogiGLUE: A Brief Survey and A Benchmark for Analyzing Logical Reasoning Capabilities of Language Models (Oct 2, 2023). Tags: Knowledge Distillation, Language Modeling. Code: unverified.
Towards Fixing Clever-Hans Predictors with Counterfactual Knowledge Distillation (Oct 2, 2023). Tags: Counterfactual, Knowledge Distillation. Code: unverified.
Distilling Influences to Mitigate Prediction Churn in Graph Neural Networks (Oct 2, 2023). Tags: Knowledge Distillation, Node Classification. Code: available.
Adaptive Decoupled Pose Knowledge Distillation (Oct 1, 2023). Tags: Knowledge Distillation, Pose Estimation. Code: available.
NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation (Sep 30, 2023). Tags: Data-free Knowledge Distillation, Knowledge Distillation. Code: available.
Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression (Sep 30, 2023). Tags: Inductive Bias, Knowledge Distillation. Code: unverified.
Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation (Sep 29, 2023). Tags: Image Generation, Knowledge Distillation. Code: unverified.
Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation (Sep 29, 2023). Tags: Cross-Lingual Question Answering, Cross-Lingual Transfer. Code: available.
An Enhanced Low-Resolution Image Recognition Method for Traffic Environments (Sep 28, 2023). Tags: Computational Efficiency, Knowledge Distillation. Code: unverified.
Distill to Delete: Unlearning in Graph Networks with Knowledge Distillation (Sep 28, 2023). Tags: GPU, Graph Neural Network. Code: unverified.
Distilling ODE Solvers of Diffusion Models into Smaller Steps (Sep 28, 2023). Tags: Denoising, Knowledge Distillation. Code: unverified.
Inherit with Distillation and Evolve with Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory (Sep 27, 2023). Tags: Class-Incremental Semantic Segmentation, Contrastive Learning. Code: unverified.
DualVC 2: Dynamic Masked Convolution for Unified Streaming and Non-Streaming Voice Conversion (Sep 27, 2023). Tags: Decoder, Knowledge Distillation. Code: unverified.
VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning (Sep 27, 2023). Tags: Knowledge Distillation, Regression. Code: unverified.
Cold & Warm Net: Addressing Cold-Start Users in Recommender Systems (Sep 27, 2023). Tags: Knowledge Distillation, Meta-Learning. Code: unverified.
Contrastive Continual Multi-view Clustering with Filtered Structural Fusion (Sep 26, 2023). Tags: Clustering, Contrastive Learning. Code: unverified.
Learning Using Generated Privileged Information by Text-to-Image Diffusion Models (Sep 26, 2023). Tags: Classification, Knowledge Distillation. Code: unverified.
Event Stream-based Visual Object Tracking: A High-Resolution Benchmark Dataset and A Novel Baseline (Sep 26, 2023). Tags: Knowledge Distillation, Object Tracking. Code: available.
Noise-Tolerant Few-Shot Unsupervised Adapter for Vision-Language Models (Sep 26, 2023). Tags: Image Classification. Code: unverified.
DONNAv2: Lightweight Neural Architecture Search for Vision tasks (Sep 26, 2023). Tags: Denoising, Image Denoising. Code: unverified.
ADU-Depth: Attention-based Distillation with Uncertainty Modeling for Depth Estimation (Sep 26, 2023). Tags: 3D Geometry, Depth Estimation. Code: unverified.
DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (Sep 26, 2023). Tags: 3D Object Detection, Autonomous Driving. Code: available.
Unsupervised 3D Perception with 2D Vision-Language Distillation for Autonomous Driving (Sep 25, 2023). Tags: Autonomous Driving, Knowledge Distillation. Code: unverified.
Data Upcycling Knowledge Distillation for Image Super-Resolution (Sep 25, 2023). Tags: Image Super-Resolution, Knowledge Distillation. Code: available.
DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning (Sep 24, 2023). Tags: Data-free Knowledge Distillation, Diversity. Code: unverified.
Multivariate Prototype Representation for Domain-Generalized Incremental Learning (Sep 24, 2023). Tags: Class-Incremental Learning. Code: unverified.
Poster: Self-Supervised Quantization-Aware Knowledge Distillation (Sep 22, 2023). Tags: Knowledge Distillation, Quantization. Code: unverified.
VIC-KD: Variance-Invariance-Covariance Knowledge Distillation to Make Keyword Spotting More Robust Against Adversarial Attacks (Sep 22, 2023). Tags: Adversarial Robustness, Keyword Spotting. Code: unverified.
Triple-View Knowledge Distillation for Semi-Supervised Semantic Segmentation (Sep 22, 2023). Tags: Decoder, Feature Importance. Code: unverified.
Defending against Data-Free Model Extraction by Distributionally Robust Defensive Training (Sep 21, 2023). Tags: Knowledge Distillation, Model Extraction. Code: unverified.
Elevating Skeleton-Based Action Recognition with Efficient Multi-Modality Self-Supervision (Sep 21, 2023). Tags: Action Recognition, Knowledge Distillation. Code: available.
A Sentence Speaks a Thousand Images: Domain Generalization through Distilling CLIP with Language Guidance (Sep 21, 2023). Tags: Domain Generalization, Knowledge Distillation. Code: available.
EPTQ: Enhanced Post-Training Quantization via Hessian-guided Network-wise Optimization (Sep 20, 2023). Tags: Knowledge Distillation, Object Detection. Code: available.
Dense 2D-3D Indoor Prediction with Sound via Aligned Cross-Modal Distillation (Sep 20, 2023). Tags: 3D Scene Reconstruction, Depth Estimation. Code: available.
Language-Oriented Communication with Semantic Coding and Knowledge Distillation for Text-to-Image Generation (Sep 20, 2023). Tags: Image Generation, In-Context Learning. Code: unverified.
Weight Averaging Improves Knowledge Distillation under Domain Shift (Sep 20, 2023). Tags: Domain Generalization, Knowledge Distillation. Code: available.
Incorporating Ultrasound Tongue Images for Audio-Visual Speech Enhancement (Sep 19, 2023). Tags: Automatic Speech Recognition (ASR). Code: unverified.
Improving CLIP Robustness with Knowledge Distillation and Self-Training (Sep 19, 2023). Tags: Knowledge Distillation. Code: unverified.