BEExformer: A Fast Inferencing Transformer Architecture via Binarization with Multiple Early Exits Dec 6, 2024 Binarization Knowledge Distillation
BERM: Training the Balanced and Extractable Representation for Matching to Improve Generalization Ability of Dense Retrieval May 18, 2023 Information Retrieval Knowledge Distillation
BERT Learns to Teach: Knowledge Distillation with Meta Learning Aug 17, 2021 Knowledge Distillation Meta-Learning
BeSound: Bluetooth-Based Position Estimation Enhancing with Cross-Modality Distillation Apr 24, 2024 Knowledge Distillation Position
Better Knowledge Enhancement for Privacy-Preserving Cross-Project Defect Prediction Dec 23, 2024 Federated Learning Knowledge Distillation
Beyond Classification: Knowledge Distillation using Multi-Object Impressions Oct 27, 2021 Classification Knowledge Distillation
Beyond Task Vectors: Selective Task Arithmetic Based on Importance Metrics Nov 25, 2024 Knowledge Distillation Multi-Task Learning
Beyond the Tip of Efficiency: Uncovering the Submerged Threats of Jailbreak Attacks in Small Language Models Feb 27, 2025 Knowledge Distillation Model Compression
Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer Oct 9, 2020 Decoder Image Classification
Bi-CryptoNets: Leveraging Different-Level Privacy for Encrypted Inference Feb 2, 2024 Knowledge Distillation Privacy Preserving
Bilateral Memory Consolidation for Continual Learning Jan 1, 2023 Continual Learning Knowledge Distillation
BiM-VFI: Bidirectional Motion Field-Guided Frame Interpolation for Video with Non-uniform Motions Jan 1, 2025 Knowledge Distillation Motion Estimation
Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks Apr 19, 2023 Knowledge Distillation
BJTU-WeChat's Systems for the WMT22 Chat Translation Task Nov 28, 2022 Denoising Knowledge Distillation
Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack May 3, 2021 Knowledge Distillation Self-Knowledge Distillation
Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation May 13, 2023 Domain Adaptation Knowledge Distillation
Block-wise Intermediate Representation Training for Model Compression Oct 20, 2018 Knowledge Distillation model
BLSP-KD: Bootstrapping Language-Speech Pre-training via Knowledge Distillation May 29, 2024 Instruction Following Knowledge Distillation
BOLT: Bootstrap Long Chain-of-Thought in Language Models without Distillation Feb 6, 2025 In-Context Learning Knowledge Distillation
Boosting Accuracy and Robustness of Student Models via Adaptive Adversarial Distillation Jan 1, 2023 Adversarial Robustness Knowledge Distillation
BoostingBERT: Integrating Multi-Class Boosting into BERT for NLP Tasks Sep 13, 2020 Ensemble Learning Knowledge Distillation
Boosting Contrastive Learning with Relation Knowledge Distillation Dec 8, 2021 Contrastive Learning Knowledge Distillation
Boosting Graph Neural Networks via Adaptive Knowledge Distillation Oct 12, 2022 Graph Classification Graph Mining
Boosting Lossless Speculative Decoding via Feature Sampling and Partial Alignment Distillation Aug 28, 2024 Knowledge Distillation Language Modelling
Boosting Self-Supervision for Single-View Scene Completion via Knowledge Distillation Apr 11, 2024 Depth Estimation Depth Prediction
Boost Vision Transformer with GPU-Friendly Sparsity and Quantization May 18, 2023 Benchmarking GPU
BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping Jun 8, 2023 Denoising Knowledge Distillation
Bootstrapped Representation Learning for Skeleton-Based Action Recognition Feb 4, 2022 Action Recognition Data Augmentation
Bootstrapping Chest CT Image Understanding by Distilling Knowledge from X-ray Expert Models Apr 7, 2024 Contrastive Learning Diagnostic
Improving Neural Ranking via Lossless Knowledge Distillation Sep 30, 2021 Knowledge Distillation Learning-To-Rank
Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction Jan 24, 2024 Implicit Relations Instance Segmentation
Breaking the Modality Barrier: Universal Embedding Learning with Multimodal LLMs Apr 24, 2025 Image-text Retrieval Instruction Following
Breaking the trade-off in personalized speech enhancement with cross-task knowledge distillation Nov 5, 2022 Knowledge Distillation Speech Enhancement
Bridge the Gap between Past and Future: Siamese Model Optimization for Context-Aware Document Ranking May 20, 2025 Document Ranking Information Retrieval
Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation Nov 23, 2023 Dimensionality Reduction Image Classification
Bridging Fairness and Environmental Sustainability in Natural Language Processing Nov 8, 2022 Dimensionality Reduction Fairness
Bridging the gap between Human Action Recognition and Online Action Detection Jan 21, 2021 Action Detection Action Recognition
Bridging the Gap Between Patient-specific and Patient-independent Seizure Prediction via Knowledge Distillation Feb 25, 2022 Knowledge Distillation Prediction
Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation Nov 1, 2020 Decoder Dialogue Generation
Bridging the Gap: Unpacking the Hidden Challenges in Knowledge Distillation for Online Ranking Systems Aug 26, 2024 Knowledge Distillation Recommendation Systems
Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation May 19, 2025 Knowledge Distillation Prediction
Bring the Power of Diffusion Model to Defect Detection Aug 25, 2024 Defect Detection Denoising
Brittle Features May Help Anomaly Detection Apr 21, 2021 Anomaly Detection Knowledge Distillation
BS-PLCNet 2: Two-stage Band-split Packet Loss Concealment Network with Intra-model Knowledge Distillation Jun 10, 2024 Knowledge Distillation Packet Loss Concealment
Multihop: Leveraging Complex Models to Learn Accurate Simple Models Sep 14, 2021 Explainable artificial intelligence Knowledge Distillation
Building a Few-Shot Cross-Domain Multilingual NLU Model for Customer Care Jun 4, 2025 Intent Detection Knowledge Distillation
Building a Multi-domain Neural Machine Translation Model using Knowledge Distillation Apr 15, 2020 Domain Adaptation Knowledge Distillation
Building Lightweight Semantic Segmentation Models for Aerial Images Using Dual Relation Distillation Jun 25, 2025 Knowledge Distillation Relation
Building Vision-Language Models on Solid Foundations with Masked Distillation Jan 1, 2024 Contrastive Learning Knowledge Distillation
C2KD: Bridging the Modality Gap for Cross-Modal Knowledge Distillation Jan 1, 2024 Knowledge Distillation Transfer Learning