Aware of the History: Trajectory Forecasting with the Local Behavior Data Jul 20, 2022 Knowledge Distillation Prediction
— Unverified 0Model Compression for Resource-Constrained Mobile Robots Jul 20, 2022 Knowledge Distillation model
— Unverified 0Knowledge distillation with a class-aware loss for endoscopic disease detection Jul 19, 2022 Diagnostic Knowledge Distillation
— Unverified 0Context Unaware Knowledge Distillation for Image Retrieval Jul 19, 2022 Image Retrieval Knowledge Distillation
Code Code Available 0FedX: Unsupervised Federated Learning with Cross Knowledge Distillation Jul 19, 2022 Contrastive Learning Federated Learning
Code Code Available 1Informative knowledge distillation for image anomaly segmentation Jul 19, 2022 Anomaly Detection Anomaly Segmentation
Code Code Available 1Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution Jul 18, 2022 Image Super-Resolution Knowledge Distillation
— Unverified 0Class-incremental Novel Class Discovery Jul 18, 2022 Incremental Learning Knowledge Distillation
Code Code Available 1Rethinking Data Augmentation for Robust Visual Question Answering Jul 18, 2022 Data Augmentation Knowledge Distillation
Code Code Available 1TSPipe: Learn from Teacher Faster with Pipelines Jul 17, 2022 GPU Knowledge Distillation
Code Code Available 0Subclass Knowledge Distillation with Known Subclass Labels Jul 17, 2022 Binary Classification Knowledge Distillation
— Unverified 0SSMTL++: Revisiting Self-Supervised Multi-Task Learning for Video Anomaly Detection Jul 16, 2022 Anomaly Detection Knowledge Distillation
— Unverified 0Multi-Level Branched Regularization for Federated Learning Jul 14, 2022 Federated Learning Knowledge Distillation
Code Code Available 1Dynamic Low-Resolution Distillation for Cost-Efficient End-to-End Text Spotting Jul 14, 2022 global-optimization Knowledge Distillation
— Unverified 0Rethinking Attention Mechanism in Time Series Classification Jul 14, 2022 Classification Knowledge Distillation
— Unverified 0Large-scale Knowledge Distillation with Elastic Heterogeneous Computing Resources Jul 14, 2022 Knowledge Distillation
Code Code Available 1Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models Jul 14, 2022 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0Rich Feature Distillation with Feature Affinity Module for Efficient Image Dehazing Jul 13, 2022 Contrastive Learning image-classification
— Unverified 0DSPNet: Towards Slimmable Pretrained Networks based on Discriminative Self-supervised Learning Jul 13, 2022 Knowledge Distillation Linear evaluation
— Unverified 0ProDiff: Progressive Fast Diffusion Model For High-Quality Text-to-Speech Jul 13, 2022 Denoising GPU
Code Code Available 3Re2G: Retrieve, Rerank, Generate Jul 13, 2022 Fact Checking Fact Verification
Code Code Available 1SlimSeg: Slimmable Semantic Segmentation with Boundary Supervision Jul 13, 2022 Knowledge Distillation Segmentation
— Unverified 0Distilled Non-Semantic Speech Embeddings with Binary Neural Networks for Low-Resource Devices Jul 12, 2022 Emotion Recognition Keyword Spotting
Code Code Available 0Contrastive Deep Supervision Jul 12, 2022 Contrastive Learning Fine-Grained Image Classification
Code Code Available 1Normalized Feature Distillation for Semantic Segmentation Jul 12, 2022 Knowledge Distillation Model Compression
— Unverified 0Knowledge Condensation Distillation Jul 12, 2022 Knowledge Distillation
Code Code Available 1HEAD: HEtero-Assists Distillation for Heterogeneous Object Detectors Jul 12, 2022 Knowledge Distillation Object
Code Code Available 1Cross-Architecture Knowledge Distillation Jul 12, 2022 Knowledge Distillation
— Unverified 0Fast-Vid2Vid: Spatial-Temporal Compression for Video-to-Video Synthesis Jul 11, 2022 GPU Knowledge Distillation
Code Code Available 12DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds Jul 10, 2022 3D Semantic Segmentation Autonomous Driving
Code Code Available 21st Place Solution to the EPIC-Kitchens Action Anticipation Challenge 2022 Jul 10, 2022 Action Anticipation Knowledge Distillation
— Unverified 0FairDistillation: Mitigating Stereotyping in Language Models Jul 10, 2022 Knowledge Distillation
Code Code Available 1Improving Streaming End-to-End ASR on Transformer-based Causal Models with Encoder States Revision Strategies Jul 6, 2022 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0Low-resource Low-footprint Wake-word Detection using Knowledge Distillation Jul 6, 2022 Knowledge Distillation speech-recognition
— Unverified 0PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient Jul 5, 2022 Knowledge Distillation object-detection
— Unverified 0GLANCE: Global to Local Architecture-Neutral Concept-based Explanations Jul 5, 2022 Disentanglement Feature Importance
Code Code Available 0Open-Vocabulary Multi-Label Classification via Multi-Modal Knowledge Transfer Jul 5, 2022 Image-text matching Knowledge Distillation
Code Code Available 1ACT-Net: Asymmetric Co-Teacher Network for Semi-supervised Memory-efficient Medical Image Segmentation Jul 5, 2022 Image Segmentation Knowledge Distillation
Code Code Available 0A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy Jul 5, 2022 Federated Learning Knowledge Distillation
— Unverified 0VEM^2L: A Plug-and-play Framework for Fusing Text and Structure Knowledge on Sparse Knowledge Graph Completion Jul 4, 2022 Knowledge Distillation Knowledge Graph Completion
— Unverified 0FasterAI: A Lightweight Library for Creating Sparse Neural Networks Jul 3, 2022 Knowledge Distillation
— Unverified 0PrUE: Distilling Knowledge from Sparse Teacher Networks Jul 3, 2022 Knowledge Distillation
Code Code Available 0Speech Emotion: Investigating Model Representations, Multi-Task Learning and Knowledge Distillation Jul 2, 2022 Knowledge Distillation Multi-Task Learning
— Unverified 0Lost in Distillation: A Case Study in Toxicity Modeling Jul 1, 2022 Knowledge Distillation
— Unverified 0Asynchronous Convergence in Multi-Task Learning via Knowledge Distillation from Converged Tasks Jul 1, 2022 Knowledge Distillation Multi-Task Learning
— Unverified 0KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation Jul 1, 2022 Knowledge Distillation Language Modeling
— Unverified 0Why Knowledge Distillation Amplifies Gender Bias and How to Mitigate from the Perspective of DistilBERT Jul 1, 2022 Knowledge Distillation
— Unverified 0End-to-End Simultaneous Speech Translation with Pretraining and Distillation: Huawei Noah’s System for AutoSimTranS 2022 Jul 1, 2022 Decoder Knowledge Distillation
— Unverified 0FitHuBERT: Going Thinner and Deeper for Knowledge Distillation of Speech Self-Supervised Learning Jul 1, 2022 Knowledge Distillation Phoneme Recognition
Code Code Available 1ListBERT: Learning to Rank E-commerce products with Listwise BERT Jun 30, 2022 Knowledge Distillation Learning-To-Rank
— Unverified 0