Toward Efficient Deep Spiking Neuron Networks: A Survey On Compression (Jun 3, 2024): Knowledge Distillation, Quantization
Toward Fair Graph Neural Networks Via Dual-Teacher Knowledge Distillation (Nov 30, 2024): Fairness, Graph Representation Learning
Toward Model-centric Heterogeneous Federated Graph Learning: A Knowledge-driven Approach (Jan 22, 2025): Diversity, Graph Learning
Toward Multiple Specialty Learners for Explaining GNNs via Online Knowledge Distillation (Oct 20, 2022): Knowledge Distillation
Towards a better understanding of Vector Quantized Autoencoders (May 1, 2019): Knowledge Distillation, Machine Translation
Towards Active Participant-Centric Vertical Federated Learning: Some Representations May Be All You Need (Oct 23, 2024): Federated Learning
Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval (Mar 16, 2023): Image Retrieval, Knowledge Distillation
Towards a Unified Foundation Model: Jointly Pre-Training Transformers on Unpaired Images and Text (Dec 14, 2021): Image Classification
Towards a Unified View of Affinity-Based Knowledge Distillation (Sep 30, 2022): Image Classification
Towards a Universal Continuous Knowledge Base (Dec 25, 2020): Knowledge Distillation, Text Classification
Towards Better Query Classification with Multi-Expert Knowledge Condensation in JD Ads Search (Aug 2, 2023): Knowledge Distillation
Reconsidering Learning Objectives in Unbiased Recommendation with Unobserved Confounders (Jun 7, 2022): Generalization Bounds, Knowledge Distillation
Towards Building Secure UAV Navigation with FHE-aware Knowledge Distillation (Nov 1, 2024): Knowledge Distillation, Reinforcement Learning (RL)
Towards Collaborative Fairness in Federated Learning Under Imbalanced Covariate Shift (Jul 11, 2025): Collaborative Fairness, Fairness
Towards Comparable Knowledge Distillation in Semantic Image Segmentation (Sep 7, 2023): Image Segmentation, Knowledge Distillation
Towards Cross-modality Medical Image Segmentation with Online Mutual Knowledge Distillation (Oct 4, 2020): Cardiac Segmentation, Image Segmentation
Towards Developing a Multilingual and Code-Mixed Visual Question Answering System by Knowledge Distillation (Sep 10, 2021): Knowledge Distillation, Question Answering
Towards domain generalisation in ASR with elitist sampling and ensemble knowledge distillation (Mar 1, 2023): Domain Adaptation, Knowledge Distillation
Towards Efficient Task-Driven Model Reprogramming with Foundation Models (Apr 5, 2023): Knowledge Distillation, Transfer Learning
Towards Explaining Autonomy with Verbalised Decision Tree States (Sep 28, 2022): Knowledge Distillation
Towards Expressive Speaking Style Modelling with Hierarchical Context Information for Mandarin Speech Synthesis (Mar 23, 2022): Expressive Speech Synthesis, Knowledge Distillation
Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation (Sep 29, 2023): Image Generation, Knowledge Distillation
Towards Fixing Clever-Hans Predictors with Counterfactual Knowledge Distillation (Oct 2, 2023): Counterfactual, Knowledge Distillation
Towards Full Utilization on Mask Task for Distilling PLMs into NMT (Sep 17, 2021): Knowledge Distillation, Machine Translation
Towards General and Fast Video Derain via Knowledge Distillation (Aug 10, 2023): Decoder, Knowledge Distillation
CAM-loss: Towards Learning Spatially Discriminative Feature Representations (Sep 3, 2021): Few-Shot Learning, Image Classification
Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion (Nov 8, 2024): Data-free Knowledge Distillation, Knowledge Distillation
Towards LogiGLUE: A Brief Survey and A Benchmark for Analyzing Logical Reasoning Capabilities of Language Models (Oct 2, 2023): Knowledge Distillation, Language Modelling
Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts (Aug 31, 2023): Contrastive Learning, Graph Classification
Towards Making Deep Transfer Learning Never Hurt (Nov 18, 2019): Knowledge Distillation
Towards Model Agnostic Federated Learning Using Knowledge Distillation (Oct 28, 2021): Federated Learning, Knowledge Distillation
Towards Non-task-specific Distillation of BERT via Sentence Representation Approximation (Apr 7, 2020): Knowledge Distillation, Sentence
Towards On-Board Panoptic Segmentation of Multispectral Satellite Images (Apr 5, 2022): Knowledge Distillation, Panoptic Segmentation
Towards Optimal Trade-offs in Knowledge Distillation for CNNs and Vision Transformers at the Edge (Jun 25, 2024): Knowledge Distillation
Towards Oracle Knowledge Distillation with Neural Architecture Search (Nov 29, 2019): Image Classification
Towards Personalized Federated Learning via Comprehensive Knowledge Distillation (Nov 6, 2024): Federated Learning, Knowledge Distillation
Towards Robust Classification with Image Quality Assessment (Apr 14, 2020): Classification, General Classification
Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach (Oct 17, 2024): Earth Observation, Federated Learning
Towards Scalable and Generalizable Earth Observation Data Mining via Foundation Model Composition (Jun 25, 2025): Earth Observation, Knowledge Distillation
Towards Scalable & Efficient Interaction-Aware Planning in Autonomous Vehicles using Knowledge Distillation (Apr 2, 2024): Autonomous Vehicles, Decision Making
Towards Streaming Egocentric Action Anticipation (Oct 11, 2021): Action Anticipation, Knowledge Distillation
SOCRATES: Text-based Human Search and Approach using a Robot Dog (Feb 10, 2023): Knowledge Distillation
Towards Unconstrained 2D Pose Estimation of the Human Spine (Apr 10, 2025): 2D Pose Estimation, Active Learning
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning (Dec 17, 2020): Deep Learning, Knowledge Distillation
Towards Understanding Knowledge Distillation (May 27, 2021): Knowledge Distillation, Transfer Learning
Do we need Label Regularization to Fine-tune Pre-trained Language Models? (May 25, 2022): Knowledge Distillation, Model Compression
Towards Unsupervised Crowd Counting via Regression-Detection Bi-knowledge Transfer (Aug 12, 2020): Crowd Counting, Knowledge Distillation
Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture (Feb 19, 2025): Knowledge Distillation
Towards Zero-Shot Knowledge Distillation for Natural Language Processing (Dec 31, 2020): Knowledge Distillation, Model Compression
Toxicity Detection can be Sensitive to the Conversational Context (Nov 19, 2021): Data Augmentation, Knowledge Distillation