InfantCryNet: A Data-driven Framework for Intelligent Analysis of Infant Cries Sep 29, 2024 Knowledge Distillation Model Compression
InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation Jun 25, 2024 Knowledge Distillation
Information Extraction from Heterogeneous Documents without Ground Truth Labels using Synthetic Label Generation and Knowledge Distillation Nov 22, 2024 Anomaly Detection Document Understanding
Information-Theoretic GAN Compression with Variational Energy-based Model Mar 28, 2023 Image Enhancement Knowledge Distillation
Inherit with Distillation and Evolve with Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory Sep 27, 2023 Class-Incremental Semantic Segmentation Contrastive Learning
InhibiDistilbert: Knowledge Distillation for a ReLU and Addition-based Transformer Mar 20, 2025 Knowledge Distillation Model Compression
Initial Classifier Weights Replay for Memoryless Class Incremental Learning Aug 31, 2020 Class-Incremental Learning
Injecting Explainability and Lightweight Design into Weakly Supervised Video Anomaly Detection Systems Dec 28, 2024 Anomaly Detection Binary Classification
Injecting Spatial Information for Monaural Speech Enhancement via Knowledge Distillation Dec 2, 2022 Knowledge Distillation Speech Enhancement
Inplace knowledge distillation with teacher assistant for improved training of flexible deep neural networks May 18, 2021 Image Classification
In-situ animal behavior classification using knowledge distillation and fixed-point quantization Sep 9, 2022 Classification Knowledge Distillation
Instance-aware Model Ensemble With Distillation For Unsupervised Domain Adaptation Nov 15, 2022 Domain Adaptation Knowledge Distillation
In Teacher We Trust: Learning Compressed Models for Pedestrian Detection Dec 1, 2016 Knowledge Distillation Pedestrian Detection
Integrated Multi-Level Knowledge Distillation for Enhanced Speaker Verification Sep 14, 2024 Knowledge Distillation Speaker Verification
Integrating Arithmetic Learning Improves Mathematical Reasoning in Smaller Models Feb 18, 2025 Data Augmentation GSM8K
Integrating ChatGPT into Secure Hospital Networks: A Case Study on Improving Radiology Report Analysis Feb 14, 2024 Contrastive Learning Knowledge Distillation
Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding Apr 15, 2021 Intent Classification
Interactive DualChecker for Mitigating Hallucinations in Distilling Large Language Models Aug 22, 2024 In-Context Learning Knowledge Distillation
Interactive Knowledge Distillation Jul 3, 2020 Image Classification
Interactive Multi-fidelity Learning for Cost-effective Adaptation of Language Model with Sparse Human Supervision Oct 31, 2023 Informativeness Knowledge Distillation
Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition Nov 28, 2022 Automatic Speech Recognition (ASR)
Intermediate Distillation: Data-Efficient Distillation from Black-Box LLMs for Information Retrieval Jun 18, 2024 Information Retrieval Knowledge Distillation
Interpretable discovery of new semiconductors with machine learning Jan 12, 2021 BIG-bench Machine Learning Knowledge Distillation
Interpretable Foreground Object Search As Knowledge Distillation Jul 20, 2020 Knowledge Distillation Object
Interpretable Traces, Unexpected Outcomes: Investigating the Disconnect in Trace-Based Knowledge Distillation May 20, 2025 Information Retrieval Knowledge Distillation
Interruption-Aware Cooperative Perception for V2X Communication-Aided Autonomous Driving Apr 24, 2023 Autonomous Driving Autonomous Vehicles
Intrinsic Image Decomposition for Robust Self-supervised Monocular Depth Estimation on Reflective Surfaces Mar 28, 2025 Depth Estimation Depth Prediction
Introspective Learning by Distilling Knowledge from Online Self-explanation Sep 19, 2020 Knowledge Distillation
Intuitive Access to Smartphone Settings Using Relevance Model Trained by Contrastive Learning Jul 15, 2023 Contrastive Learning Knowledge Distillation
Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models Feb 27, 2025 Knowledge Distillation Self-Knowledge Distillation
IP-MOT: Instance Prompt Learning for Cross-Domain Multi-Object Tracking Oct 30, 2024 Knowledge Distillation Language Modelling
IQ-VFI: Implicit Quadratic Motion Estimation for Video Frame Interpolation Jan 1, 2024 Knowledge Distillation Motion Estimation
Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study Apr 1, 2021 Image Classification
Is LLM the Silver Bullet to Low-Resource Languages Machine Translation? Mar 31, 2025 Articles Knowledge Distillation
Isotonic Data Augmentation for Knowledge Distillation Jul 3, 2021 Attribute Data Augmentation
ISP Distillation Jan 25, 2021 Knowledge Distillation Object Recognition
Iterative Dual Domain Adaptation for Neural Machine Translation Dec 16, 2019 Domain Adaptation Knowledge Distillation
Iterative Graph Self-Distillation Oct 23, 2020 Contrastive Learning Graph Learning
Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition Feb 4, 2022 Classification Knowledge Distillation
JEP-KD: Joint-Embedding Predictive Architecture Based Knowledge Distillation for Visual Speech Recognition Mar 4, 2024 Automatic Speech Recognition (ASR)
Inter-layer Knowledge Distillation for Neural Machine Translation (基于层间知识蒸馏的神经机器翻译) Aug 1, 2021 Knowledge Distillation Machine Translation
Joint Architecture and Knowledge Distillation in CNN for Chinese Text Recognition Dec 17, 2019 Handwritten Chinese Text Recognition Knowledge Distillation
Joint-DetNAS: Upgrade Your Detector with NAS, Pruning and Dynamic Distillation May 27, 2021 Knowledge Distillation Neural Architecture Search
Joint Diffusion models in Continual Learning Nov 12, 2024 Continual Learning Knowledge Distillation
Joint Feature Distribution Alignment Learning for NIR-VIS and VIS-VIS Face Recognition Apr 25, 2022 Face Recognition Heterogeneous Face Recognition
Joint Input and Output Coordination for Class-Incremental Learning Sep 9, 2024 Class-Incremental Learning
Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment Jan 25, 2022 Benchmarking Entity Alignment
Joint Optimization of Streaming and Non-Streaming Automatic Speech Recognition with Multi-Decoder and Knowledge Distillation May 22, 2024 Automatic Speech Recognition (ASR)
Joint Semantic Knowledge Distillation and Masked Acoustic Modeling for Full-band Speech Restoration with Improved Intelligibility Sep 14, 2024 Knowledge Distillation Language Modeling
Joint Speech Activity and Overlap Detection with Multi-Exit Architecture Sep 24, 2022 Action Detection Activity Detection