The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework (Jul 11, 2023). Tags: Knowledge Distillation, Pseudo Label
The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes (Oct 11, 2022). Tags: Active Learning, Knowledge Distillation
The USYD-JD Speech Translation System for IWSLT 2021 (Jul 24, 2021). Tags: Knowledge Distillation, NMT
The Xiaomi Text-to-Text Simultaneous Speech Translation System for IWSLT 2022 (May 1, 2022). Tags: Automatic Speech Recognition (ASR)
Three Factors to Improve Out-of-Distribution Detection (Aug 2, 2023). Tags: Contrastive Learning, Knowledge Distillation
TIMA: Text-Image Mutual Awareness for Balancing Zero-Shot Adversarial Robustness and Generalization Ability (May 27, 2024). Tags: Adversarial Robustness, Knowledge Distillation
TimeDistill: Efficient Long-Term Time Series Forecasting with MLP via Cross-Architecture Distillation (Feb 20, 2025). Tags: Data Augmentation, Knowledge Distillation
TinyM^2Net-V3: Memory-Aware Compressed Multimodal Deep Neural Networks for Sustainable Edge Deployment (May 20, 2024). Tags: Knowledge Distillation, Model Compression
XtremeDistil: Multi-stage Distillation for Massive Multilingual Models (Apr 12, 2020). Tags: Knowledge Distillation, Named Entity Recognition
TinyViT: Fast Pretraining Distillation for Small Vision Transformers (Jul 21, 2022). Tags: Image Classification, Knowledge Distillation
TIP: Typifying the Interpretability of Procedures (Jun 9, 2017). Tags: Knowledge Distillation
TKD: Temporal Knowledge Distillation for Active Perception (Mar 4, 2019). Tags: Knowledge Distillation, Object
ToDi: Token-wise Distillation via Fine-Grained Divergence Control (May 22, 2025). Tags: Instruction Following, Knowledge Distillation
TODM: Train Once Deploy Many Efficient Supernet-Based RNN-T Compression For On-device ASR Models (Sep 5, 2023). Tags: Automatic Speech Recognition (ASR)
Tokenizing Electron Cloud in Protein-Ligand Interaction Learning (May 25, 2025). Tags: Knowledge Distillation, Prediction
Token-Level Ensemble Distillation for Grapheme-to-Phoneme Conversion (Apr 6, 2019). Tags: Automatic Speech Recognition (ASR)
Topic Modeling for Maternal Health Using Reddit (Apr 1, 2021). Tags: Knowledge Distillation
Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data (Jul 7, 2024). Tags: Activity Recognition, Deep Learning
Topology Distillation for Recommender System (Jun 16, 2021). Tags: Knowledge Distillation, Model Compression
torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation (Nov 25, 2020). Tags: Image Classification, Instance Segmentation
torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP (Oct 26, 2023). Tags: Image Classification
To Smooth or not to Smooth? On Compatibility between Label Smoothing and Knowledge Distillation (Sep 29, 2021). Tags: Image Classification
Toward Data-centric Directed Graph Learning: An Entropy-driven Approach (May 2, 2025). Tags: Graph Learning, Knowledge Distillation
Toward Efficient Deep Spiking Neuron Networks: A Survey on Compression (Jun 3, 2024). Tags: Knowledge Distillation, Quantization
Toward Fair Graph Neural Networks Via Dual-Teacher Knowledge Distillation (Nov 30, 2024). Tags: Fairness, Graph Representation Learning
Toward Model-centric Heterogeneous Federated Graph Learning: A Knowledge-driven Approach (Jan 22, 2025). Tags: Diversity, Graph Learning
Toward Multiple Specialty Learners for Explaining GNNs via Online Knowledge Distillation (Oct 20, 2022). Tags: Knowledge Distillation
Towards a better understanding of Vector Quantized Autoencoders (May 1, 2019). Tags: Knowledge Distillation, Machine Translation
Towards Active Participant-Centric Vertical Federated Learning: Some Representations May Be All You Need (Oct 23, 2024). Tags: All, Federated Learning
Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval (Mar 16, 2023). Tags: Image Retrieval, Knowledge Distillation
Towards a Unified Foundation Model: Jointly Pre-Training Transformers on Unpaired Images and Text (Dec 14, 2021). Tags: Image Classification
Towards a Unified View of Affinity-Based Knowledge Distillation (Sep 30, 2022). Tags: Image Classification
Towards a Universal Continuous Knowledge Base (Dec 25, 2020). Tags: Knowledge Distillation, Text Classification
Towards Better Query Classification with Multi-Expert Knowledge Condensation in JD Ads Search (Aug 2, 2023). Tags: Knowledge Distillation
Reconsidering Learning Objectives in Unbiased Recommendation with Unobserved Confounders (Jun 7, 2022). Tags: Generalization Bounds, Knowledge Distillation
Towards Building Secure UAV Navigation with FHE-aware Knowledge Distillation (Nov 1, 2024). Tags: Knowledge Distillation, Reinforcement Learning (RL)
Towards Collaborative Fairness in Federated Learning Under Imbalanced Covariate Shift (Jul 11, 2025). Tags: Collaborative Fairness, Fairness
Towards Comparable Knowledge Distillation in Semantic Image Segmentation (Sep 7, 2023). Tags: Image Segmentation, Knowledge Distillation
Towards Cross-modality Medical Image Segmentation with Online Mutual Knowledge Distillation (Oct 4, 2020). Tags: Cardiac Segmentation, Image Segmentation
Towards Developing a Multilingual and Code-Mixed Visual Question Answering System by Knowledge Distillation (Sep 10, 2021). Tags: Knowledge Distillation, Question Answering
Towards domain generalisation in ASR with elitist sampling and ensemble knowledge distillation (Mar 1, 2023). Tags: Domain Adaptation, Knowledge Distillation
Towards Efficient Task-Driven Model Reprogramming with Foundation Models (Apr 5, 2023). Tags: Knowledge Distillation, Transfer Learning
Towards Explaining Autonomy with Verbalised Decision Tree States (Sep 28, 2022). Tags: Knowledge Distillation
Towards Expressive Speaking Style Modelling with Hierarchical Context Information for Mandarin Speech Synthesis (Mar 23, 2022). Tags: Expressive Speech Synthesis, Knowledge Distillation
Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation (Sep 29, 2023). Tags: Image Generation, Knowledge Distillation
Towards Fixing Clever-Hans Predictors with Counterfactual Knowledge Distillation (Oct 2, 2023). Tags: Counterfactual, Knowledge Distillation
Towards Full Utilization on Mask Task for Distilling PLMs into NMT (Sep 17, 2021). Tags: Knowledge Distillation, Machine Translation
Towards General and Fast Video Derain via Knowledge Distillation (Aug 10, 2023). Tags: Decoder, Knowledge Distillation
CAM-loss: Towards Learning Spatially Discriminative Feature Representations (Sep 3, 2021). Tags: Few-Shot Learning, Image Classification