Multi-adversarial Faster-RCNN with Paradigm Teacher for Unrestricted Object Detection — Dec 11, 2022 — Domain Adaptation, Knowledge Distillation
Complete-to-Partial 4D Distillation for Self-Supervised Point Cloud Sequence Representation Learning — Dec 10, 2022 — Knowledge Distillation, Representation Learning
LEAD: Liberal Feature-based Distillation for Dense Retrieval — Dec 10, 2022 — Document Ranking, Knowledge Distillation
Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection — Dec 8, 2022 — Knowledge Distillation
Occlusion-Robust FAU Recognition by Mining Latent Space of Masked Autoencoders — Dec 8, 2022 — Knowledge Distillation
Life-long Learning for Multilingual Neural Machine Translation with Knowledge Distillation — Dec 6, 2022 — Knowledge Distillation, Machine Translation
Open World DETR: Transformer based Open World Object Detection — Dec 6, 2022 — Knowledge Distillation, Object Detection
Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging — Dec 6, 2022 — Knowledge Distillation, Model Compression
DA-CIL: Towards Domain Adaptive Class-Incremental 3D Object Detection — Dec 5, 2022 — 3D Object Detection, Class-Incremental Learning
Single image calibration using knowledge distillation approaches — Dec 5, 2022 — Camera Calibration, Incremental Learning
The RoyalFlush System for the WMT 2022 Efficiency Task — Dec 3, 2022 — Decoder, GPU
StructVPR: Distill Structural Knowledge with Weighting Samples for Visual Place Recognition — Dec 2, 2022 — Image Retrieval, Knowledge Distillation
Injecting Spatial Information for Monaural Speech Enhancement via Knowledge Distillation — Dec 2, 2022 — Knowledge Distillation, Speech Enhancement
Distilling Reasoning Capabilities into Smaller Language Models — Dec 1, 2022 — GSM8K, Knowledge Distillation
Coordinating Cross-modal Distillation for Molecular Property Prediction — Nov 30, 2022 — Graph Regression, Graph Representation Learning (code available)
Explicit Knowledge Transfer for Weakly-Supervised Code Generation — Nov 30, 2022 — Code Generation, Few-Shot Learning
HEAT: Hardware-Efficient Automatic Tensor Decomposition for Transformer Compression — Nov 30, 2022 — Efficient Exploration, Knowledge Distillation
Hint-dynamic Knowledge Distillation — Nov 30, 2022 — Knowledge Distillation
Random Copolymer inverse design system orienting on Accurate discovering of Antimicrobial peptide-mimetic copolymers — Nov 30, 2022 — Activity Prediction, Knowledge Distillation
Attention-Based Depth Distillation with 3D-Aware Positional Encoding for Monocular 3D Object Detection — Nov 30, 2022 — 3D Object Detection, Depth Estimation
Feature-domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution — Nov 29, 2022 — Image Super-Resolution, Knowledge Distillation (code available)
SgVA-CLIP: Semantic-guided Visual Adapting of Vision-Language Models for Few-shot Image Classification — Nov 28, 2022 — Few-Shot Image Classification, Few-Shot Learning
Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition — Nov 28, 2022 — Automatic Speech Recognition (ASR) (code available)
BJTU-WeChat's Systems for the WMT22 Chat Translation Task — Nov 28, 2022 — Denoising, Knowledge Distillation
Lightning Fast Video Anomaly Detection via Adversarial Knowledge Distillation — Nov 28, 2022 — Anomaly Detection, Knowledge Distillation
Class-aware Information for Logit-based Knowledge Distillation — Nov 27, 2022 — Knowledge Distillation (code available)
EPIK: Eliminating multi-model Pipelines with Knowledge-distillation — Nov 27, 2022 — Knowledge Distillation, Transliteration
SKDBERT: Compressing BERT via Stochastic Knowledge Distillation — Nov 26, 2022 — Knowledge Distillation, Language Modeling
Structural Knowledge Distillation for Object Detection — Nov 23, 2022 — Feature Importance, Knowledge Distillation
On the Transferability of Visual Features in Generalized Zero-Shot Learning — Nov 22, 2022 — Generalized Zero-Shot Learning, Knowledge Distillation
Blind Knowledge Distillation for Robust Image Classification — Nov 21, 2022 — Classification, Image Classification (code available)
Privacy in Practice: Private COVID-19 Detection in X-Ray Images (Extended Version) — Nov 21, 2022 — Knowledge Distillation, Membership Inference Attack (code available)
Understanding and Improving Knowledge Distillation for Quantization-Aware Training of Large Transformer Encoders — Nov 20, 2022 — Knowledge Distillation, Model Compression (code available)
AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation — Nov 20, 2022 — Knowledge Distillation, Self-Knowledge Distillation (code available)
Scalable Collaborative Learning via Representation Sharing — Nov 20, 2022 — Federated Learning, Knowledge Distillation
DASECount: Domain-Agnostic Sample-Efficient Wireless Indoor Crowd Counting via Few-shot Learning — Nov 18, 2022 — Crowd Counting, Few-Shot Learning
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families — Nov 17, 2022 — Knowledge Distillation, Object Detection
Knowledge distillation for fast and accurate DNA sequence correction — Nov 17, 2022 — Knowledge Distillation
D^3ETR: Decoder Distillation for Detection Transformer — Nov 17, 2022 — Decoder, Knowledge Distillation
Is Smaller Always Faster? Tradeoffs in Compressing Self-Supervised Speech Transformers — Nov 17, 2022 — Knowledge Distillation, Model Compression
Sub-Graph Learning for Spatiotemporal Forecasting via Knowledge Distillation — Nov 17, 2022 — Diversity, Graph Learning (code available)
Yield Evaluation of Citrus Fruits based on the YoloV5 compressed by Knowledge Distillation — Nov 16, 2022 — Knowledge Distillation
An Investigation of the Combination of Rehearsal and Knowledge Distillation in Continual Learning for Spoken Language Understanding — Nov 15, 2022 — Class-Incremental Learning
Instance-aware Model Ensemble With Distillation For Unsupervised Domain Adaptation — Nov 15, 2022 — Domain Adaptation, Knowledge Distillation (code available)
An Efficient Active Learning Pipeline for Legal Text Classification — Nov 15, 2022 — Active Learning, Classification
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling — Nov 15, 2022 — General Knowledge, Knowledge Distillation
Feature Correlation-guided Knowledge Transfer for Federated Self-supervised Learning — Nov 14, 2022 — Feature Correlation, Federated Learning (code available)
Structured Knowledge Distillation Towards Efficient and Compact Multi-View 3D Detection — Nov 14, 2022 — Knowledge Distillation
An Interpretable Neuron Embedding for Static Knowledge Distillation — Nov 14, 2022 — Knowledge Distillation
Long-Range Zero-Shot Generative Deep Network Quantization — Nov 13, 2022 — Knowledge Distillation, Quantization