Point Segment and Count: A Generalized Framework for Object Counting Jan 1, 2024 Few-shot Object Counting and Detection Knowledge Distillation
Code Code Available 2FCS: Feature Calibration and Separation for Non-Exemplar Class Incremental Learning Jan 1, 2024 class-incremental learning Class Incremental Learning
Code Code Available 1DIOD: Self-Distillation Meets Object Discovery Jan 1, 2024 Instance Segmentation Knowledge Distillation
Code Code Available 1KD-DETR: Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling Jan 1, 2024 General Knowledge Knowledge Distillation
— Unverified 0Scene-adaptive and Region-aware Multi-modal Prompt for Open Vocabulary Object Detection Jan 1, 2024 Knowledge Distillation object-detection
— Unverified 0Uncertainty-Guided Never-Ending Learning to Drive Jan 1, 2024 Autonomous Driving Continual Learning
— Unverified 0Robust Distillation via Untargeted and Targeted Intermediate Adversarial Samples Jan 1, 2024 Adversarial Robustness Knowledge Distillation
— Unverified 0Scaled Decoupled Distillation Jan 1, 2024 Knowledge Distillation
Code Code Available 2CaKDP: Category-aware Knowledge Distillation and Pruning Framework for Lightweight 3D Object Detection Jan 1, 2024 3D Object Detection Knowledge Distillation
Code Code Available 1Building Vision-Language Models on Solid Foundations with Masked Distillation Jan 1, 2024 Contrastive Learning Knowledge Distillation
— Unverified 0LiSA: LiDAR Localization with Semantic Awareness Jan 1, 2024 Knowledge Distillation Semantic Segmentation
Code Code Available 2IQ-VFI: Implicit Quadratic Motion Estimation for Video Frame Interpolation Jan 1, 2024 Knowledge Distillation Motion Estimation
— Unverified 0VkD: Improving Knowledge Distillation using Orthogonal Projections Jan 1, 2024 Image Generation Knowledge Distillation
Code Code Available 2Distribution-aware Knowledge Prototyping for Non-exemplar Lifelong Person Re-identification Jan 1, 2024 Diversity Knowledge Distillation
Code Code Available 1Distilling CLIP with Dual Guidance for Learning Discriminative Human Body Shape Representation Jan 1, 2024 Knowledge Distillation Person Re-Identification
— Unverified 0C2KD: Bridging the Modality Gap for Cross-Modal Knowledge Distillation Jan 1, 2024 Knowledge Distillation Transfer Learning
— Unverified 0Curriculum-scheduled Knowledge Distillation from Multiple Pre-trained Teachers for Multi-domain Sequential Recommendation Jan 1, 2024 Knowledge Distillation Recommendation Systems
Code Code Available 0SecFormer: Fast and Accurate Privacy-Preserving Inference for Transformer Models via SMPC Jan 1, 2024 Knowledge Distillation Privacy Preserving
Code Code Available 0Compressing Deep Image Super-resolution Models Dec 31, 2023 Image Super-Resolution Knowledge Distillation
— Unverified 0Explainability-Driven Leaf Disease Classification Using Adversarial Training and Knowledge Distillation Dec 30, 2023 Adversarial Attack Classification
— Unverified 0ClST: A Convolutional Transformer Framework for Automatic Modulation Recognition by Knowledge Distillation Dec 29, 2023 Automatic Modulation Recognition Knowledge Distillation
— Unverified 0FerKD: Surgical Label Adaptation for Efficient Distillation Dec 29, 2023 Knowledge Distillation
Code Code Available 1Temporal Knowledge Distillation for Time-Sensitive Financial Services Applications Dec 28, 2023 Anomaly Detection Fraud Detection
— Unverified 0FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning Dec 28, 2023 Diversity Federated Learning
— Unverified 0Layer Attack Unlearning: Fast and Accurate Machine Unlearning via Layer Level Attack and Knowledge Distillation Dec 28, 2023 Knowledge Distillation Machine Unlearning
— Unverified 0X Modality Assisting RGBT Object Tracking Dec 27, 2023 Knowledge Distillation Object
— Unverified 0Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning Dec 27, 2023 Continual Learning graph construction
Code Code Available 0Group Multi-View Transformer for 3D Shape Analysis with Spatial Encoding Dec 27, 2023 3D Classification 3D Shape Recognition
Code Code Available 0AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation Dec 26, 2023 Knowledge Distillation Retrieval
— Unverified 0Cloud-Device Collaborative Learning for Multimodal Large Language Models Dec 26, 2023 Device-Cloud Collaboration Knowledge Distillation
— Unverified 0Knowledge Distillation of LLM for Automatic Scoring of Science Education Assessments Dec 26, 2023 Knowledge Distillation Mathematical Reasoning
— Unverified 0Revisiting Knowledge Distillation under Distribution Shift Dec 25, 2023 Data Augmentation Diversity
Code Code Available 0Less or More From Teacher: Exploiting Trilateral Geometry For Knowledge Distillation Dec 22, 2023 Bilevel Optimization Click-Through Rate Prediction
— Unverified 0Compressing Image-to-Image Translation GANs Using Local Density Structures on Their Learned Manifold Dec 22, 2023 Density Estimation Image-to-Image Translation
— Unverified 0TinySAM: Pushing the Envelope for Efficient Segment Anything Model Dec 21, 2023 Knowledge Distillation Quantization
Code Code Available 2How to Prune Your Language Model: Recovering Accuracy on the "Sparsity May Cry'' Benchmark Dec 21, 2023 Knowledge Distillation Language Modeling
— Unverified 0Object Attribute Matters in Visual Question Answering Dec 20, 2023 Attribute Graph Neural Network
Code Code Available 0DSFormer: Effective Compression of Text-Transformers by Dense-Sparse Weight Factorization Dec 20, 2023 Knowledge Distillation Natural Language Understanding
— Unverified 0StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation Dec 20, 2023 Knowledge Distillation
Code Code Available 0Fine-Grained Knowledge Selection and Restoration for Non-Exemplar Class Incremental Learning Dec 20, 2023 class-incremental learning Class Incremental Learning
Code Code Available 0Federated Learning with Extremely Noisy Clients via Negative Distillation Dec 20, 2023 Federated Learning Knowledge Distillation
Code Code Available 1Expediting Contrastive Language-Image Pretraining via Self-distilled Encoders Dec 19, 2023 Knowledge Distillation
— Unverified 0Distilling Autoregressive Models to Obtain High-Performance Non-Autoregressive Solvers for Vehicle Routing Problems with Faster Inference Speed Dec 19, 2023 Knowledge Distillation
Code Code Available 1RadOcc: Learning Cross-Modality Occupancy Knowledge through Rendering Assisted Distillation Dec 19, 2023 Knowledge Distillation Prediction
— Unverified 0Decoupled Knowledge with Ensemble Learning for Online Distillation Dec 18, 2023 Ensemble Learning Knowledge Distillation
Code Code Available 0Your Student is Better Than Expected: Adaptive Teacher-Student Collaboration for Text-Conditional Diffusion Models Dec 17, 2023 Image Generation Knowledge Distillation
Code Code Available 1DistilVPR: Cross-Modal Knowledge Distillation for Visual Place Recognition Dec 17, 2023 Knowledge Distillation Visual Place Recognition
Code Code Available 1Mixed Distillation Helps Smaller Language Model Better Reasoning Dec 17, 2023 Knowledge Distillation Language Modeling
— Unverified 0Symmetrical Bidirectional Knowledge Alignment for Zero-Shot Sketch-Based Image Retrieval Dec 16, 2023 Image Retrieval Knowledge Distillation
Code Code Available 0Simple Image-level Classification Improves Open-vocabulary Object Detection Dec 16, 2023 Knowledge Distillation Object
Code Code Available 1