BEBERT: Efficient and Robust Binary Ensemble BERT (Oct 28, 2022). Tags: Binarization, Computational Efficiency
[Code Available] Teacher-Student Architecture for Knowledge Learning: A Survey (Oct 28, 2022). Tags: Knowledge Distillation, Multi-Task Learning
[Unverified] Completely Heterogeneous Federated Learning (Oct 28, 2022). Tags: Data-free Knowledge Distillation, Federated Learning
[Unverified] Lightweight and High-Fidelity End-to-End Text-to-Speech with Multi-Band Generation and Inverse Short-Time Fourier Transform (Oct 28, 2022). Tags: CPU, Knowledge Distillation
[Code Available] Semi-UFormer: Semi-supervised Uncertainty-aware Transformer for Image Dehazing (Oct 28, 2022). Tags: Image Dehazing, Knowledge Distillation
[Unverified] Can Current Explainability Help Provide References in Clinical Notes to Support Humans Annotate Medical Codes? (Oct 28, 2022). Tags: Knowledge Distillation, Medical Code Prediction
[Unverified] Self-Supervised Learning with Multi-View Rendering for 3D Point Cloud Analysis (Oct 28, 2022). Tags: Knowledge Distillation, Self-Supervised Learning
[Code Available] Fast DistilBERT on CPUs (Oct 27, 2022). Tags: Knowledge Distillation, Model Compression
[Unverified] A Knowledge Distillation Framework For Enhancing Ear-EEG Based Sleep Staging With Scalp-EEG Data (Oct 27, 2022). Tags: Domain Adaptation, EEG
[Code Available] QUILL: Query Intent with Large Language Models using Retrieval Augmentation and Multi-stage Distillation (Oct 27, 2022). Tags: Feature Engineering, Knowledge Distillation
[Unverified] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks (Oct 27, 2022). Tags: Knowledge Distillation, Quantization
[Unverified] Too Brittle To Touch: Comparing the Stability of Quantization and Distillation Towards Developing Lightweight Low-Resource MT Models (Oct 27, 2022). Tags: Knowledge Distillation, Machine Translation
[Code Available] Weight Averaging: A Simple Yet Effective Method to Overcome Catastrophic Forgetting in Automatic Speech Recognition (Oct 27, 2022). Tags: Automatic Speech Recognition (ASR)
[Unverified] Improved Feature Distillation via Projector Ensemble (Oct 27, 2022). Tags: Knowledge Distillation, Multi-Task Learning
[Code Available] Li3DeTr: A LiDAR based 3D Detection Transformer (Oct 27, 2022). Tags: Autonomous Driving, Decoder
[Unverified] Long-tailed Food Classification (Oct 26, 2022). Tags: Classification, Data Augmentation
[Unverified] GlobalFlowNet: Video Stabilization using Deep Distilled Global Motion Estimates (Oct 25, 2022). Tags: Knowledge Distillation, Optical Flow Estimation
[Code Available] Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision (Oct 25, 2022). Tags: Knowledge Distillation, Model Compression
[Unverified] An Effective Deep Network for Head Pose Estimation without Keypoints (Oct 25, 2022). Tags: Gaze Estimation, Head Pose Estimation
[Unverified] Referee: Reference-Free Sentence Summarization with Sharper Controllability through Symbolic Knowledge Distillation (Oct 25, 2022). Tags: Knowledge Distillation, Sentence
[Unverified] Legal-Tech Open Diaries: Lesson learned on how to develop and deploy light-weight models in the era of humongous Language Models (Oct 24, 2022). Tags: Knowledge Distillation, Model Compression
[Unverified] Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks (Oct 24, 2022). Tags: Knowledge Distillation, Transfer Learning
[Code Available] Bootstrapping meaning through listening: Unsupervised learning of spoken sentence embeddings (Oct 23, 2022). Tags: Acoustic Unit Discovery, Contrastive Learning
[Code Available] Respecting Transfer Gap in Knowledge Distillation (Oct 23, 2022). Tags: Knowledge Distillation
[Unverified] Adaptive Label Smoothing with Self-Knowledge in Natural Language Generation (Oct 22, 2022). Tags: Knowledge Distillation, Text Generation
[Unverified] Hard Gate Knowledge Distillation -- Leverage Calibration for Robust and Reliable Language Model (Oct 22, 2022). Tags: Knowledge Distillation, Language Modeling
[Unverified] Augmentation with Projection: Towards an Effective and Efficient Data Augmentation Paradigm for Distillation (Oct 21, 2022). Tags: Data Augmentation, Diversity
[Unverified] Modeling Document-level Temporal Structures for Building Temporal Dependency Graphs (Oct 21, 2022). Tags: Knowledge Distillation, Sentence
[Code Available] Distilling the Undistillable: Learning from a Nasty Teacher (Oct 21, 2022). Tags: Knowledge Distillation
[Code Available] Performance-Efficiency Trade-Offs in Adapting Language Models to Text Classification Tasks (Oct 21, 2022). Tags: Knowledge Distillation, Text Classification
[Unverified] Similarity of Neural Architectures using Adversarial Attack Transferability (Oct 20, 2022). Tags: Adversarial Attack, Diversity
[Unverified] Semi-supervised object detection based on single-stage detector for thighbone fracture localization (Oct 20, 2022). Tags: Fracture Detection, Image Augmentation
[Unverified] Toward Multiple Specialty Learners for Explaining GNNs via Online Knowledge Distillation (Oct 20, 2022). Tags: Knowledge Distillation
[Unverified] A baseline revisited: Pushing the limits of multi-segment models for context-aware translation (Oct 19, 2022). Tags: Knowledge Distillation, Translation
[Unverified] ADPS: Asymmetric Distillation Post-Segmentation for Image Anomaly Detection (Oct 19, 2022). Tags: Anomaly Detection, Anomaly Localization
[Unverified] Cross-Modal Fusion Distillation for Fine-Grained Sketch-Based Image Retrieval (Oct 19, 2022). Tags: Cross-Modal Retrieval, Image Retrieval
[Code Available] SA-MLP: Distilling Graph Knowledge from GNNs into Structure-Aware MLP (Oct 18, 2022). Tags: Knowledge Distillation, Node Classification
[Code Available] On effects of Knowledge Distillation on Transfer Learning (Oct 18, 2022). Tags: Image Classification
[Unverified] Distilling Object Detectors With Global Knowledge (Oct 17, 2022). Tags: Knowledge Distillation, Object
[Code Available] Federated Learning with Privacy-Preserving Ensemble Attention Distillation (Oct 16, 2022). Tags: Federated Learning, Image Classification
[Unverified] RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging (Oct 15, 2022). Tags: Classification, Knowledge Distillation
[Unverified] EfficientVLM: Fast and Accurate Vision-Language Models via Knowledge Distillation and Modal-adaptive Pruning (Oct 14, 2022). Tags: Caption Generation, Knowledge Distillation
[Code Available] Learning Generalizable Models for Vehicle Routing Problems via Knowledge Distillation (Oct 14, 2022). Tags: Knowledge Distillation
[Code Available] Improving generalizability of distilled self-supervised speech processing models under distorted settings (Oct 14, 2022). Tags: Knowledge Distillation
[Code Available] Knowledge Distillation approach towards Melanoma Detection (Oct 14, 2022). Tags: Knowledge Distillation
[Code Available] You Can Have Your Data and Balance It Too: Towards Balanced and Efficient Multilingual Models (Oct 13, 2022). Tags: Cross-Lingual Transfer, Knowledge Distillation
[Unverified] Probabilistic Integration of Object Level Annotations in Chest X-ray Classification (Oct 13, 2022). Tags: Knowledge Distillation, Variational Inference
[Unverified] Efficient Knowledge Distillation from Model Checkpoints (Oct 12, 2022). Tags: Knowledge Distillation
[Code Available] Boosting Graph Neural Networks via Adaptive Knowledge Distillation (Oct 12, 2022). Tags: Graph Classification, Graph Mining
[Unverified] Integrating Translation Memories into Non-Autoregressive Machine Translation (Oct 12, 2022). Tags: Knowledge Distillation, Machine Translation