Kernel Modulation: A Parameter-Efficient Method for Training Convolutional Neural Networks Mar 29, 2022 Meta-Learning Model Compression
KIMERA: Injecting Domain Knowledge into Vacant Transformer Heads Sep 29, 2021 Information Retrieval Model Compression
KMIR: A Benchmark for Evaluating Knowledge Memorization, Identification and Reasoning Abilities of Language Models Feb 28, 2022 General Knowledge Memorization
Knowledge Distillation: A Survey Jun 9, 2020 Knowledge Distillation Model Compression
Knowledge Distillation Based Semantic Communications For Multiple Users Nov 23, 2023 Decoder Knowledge Distillation
Knowledge Distillation Beyond Model Compression Jul 3, 2020 Knowledge Distillation model
Knowledge Distillation for Image Restoration: Simultaneous Learning from Degraded and Clean Images Jan 16, 2025 Decoder Image Reconstruction
Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation Dec 9, 2021 image-classification Image Classification
Knowledge Distillation for Object Detection: from generic to remote sensing datasets Jul 18, 2023 Knowledge Distillation Model Compression
Knowledge Distillation for Oriented Object Detection on Aerial Images Jun 20, 2022 Knowledge Distillation Model Compression
Knowledge Distillation for Swedish NER models: A Search for Performance and Efficiency May 1, 2021 Knowledge Distillation Model Compression
Knowledge Distillation in Federated Learning: a Survey on Long Lasting Challenges and New Solutions Jun 16, 2024 Federated Learning Knowledge Distillation
Knowledge Distillation in Vision Transformers: A Critical Review Feb 4, 2023 Decoder image-classification
Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher Oct 20, 2020 Knowledge Distillation Model Compression
Knowledge Distillation on Graphs: A Survey Feb 1, 2023 Knowledge Distillation Model Compression
Knowledge distillation via adaptive instance normalization Mar 9, 2020 Knowledge Distillation Model Compression
Knowledge distillation via softmax regression representation learning Jan 1, 2021 Knowledge Distillation Model Compression
Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution Apr 3, 2024 Image Super-Resolution Knowledge Distillation
Representative Teacher Keys for Knowledge Distillation Model Compression Based on Attention Mechanism for Image Classification Jun 26, 2022 GPU image-classification
Know What You Don't Need: Single-Shot Meta-Pruning for Attention Heads Nov 7, 2020 Informativeness Meta-Learning
KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation Sep 13, 2021 Knowledge Distillation Language Modeling
KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation Jul 1, 2022 Knowledge Distillation Language Modeling
Kronecker Decomposition for GPT Compression Oct 15, 2021 Knowledge Distillation Language Modeling
L4Q: Parameter Efficient Quantization-Aware Fine-Tuning on Large Language Models Feb 7, 2024 Few-Shot Learning In-Context Learning
LadaBERT: Lightweight Adaptation of BERT through Hybrid Model Compression Apr 8, 2020 Blocking Knowledge Distillation
Language model compression with weighted low-rank factorization Jun 30, 2022 Language Modeling Language Modelling
Large Language Model Compression via the Nested Activation-Aware Decomposition Mar 21, 2025 Language Modeling Language Modelling
Large receptive field strategy and important feature extraction strategy in 3D object detection Jan 22, 2024 3D Object Detection Autonomous Driving
Large-Scale Generative Data-Free Distillation Dec 10, 2020 Knowledge Distillation Model Compression
LatentLLM: Attention-Aware Joint Tensor Compression May 23, 2025 Model Compression Tensor Decomposition
LayerCollapse: Adaptive compression of neural networks Nov 29, 2023 Computational Efficiency image-classification
Layer-specific Optimization for Mixed Data Flow with Mixed Precision in FPGA Design for CNN-based Object Detectors Sep 3, 2020 Bayesian Optimization Model Compression
LCQ: Low-Rank Codebook based Quantization for Large Language Models May 31, 2024 Model Compression Quantization
Learning a Neural Diff for Speech Models Aug 3, 2021 Automatic Speech Recognition Automatic Speech Recognition (ASR)
Learning-Based Symbol Level Precoding: A Memory-Efficient Unsupervised Learning Approach Nov 15, 2021 Model Compression
Learning Compressed Embeddings for On-Device Inference Mar 18, 2022 Model Compression Recommendation Systems
Learning Disentangled Representation with Mutual Information Maximization for Real-Time UAV Tracking Aug 20, 2023 CPU Model Compression
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning Sep 29, 2021 Image Super-Resolution Knowledge Distillation
Learning Efficient Object Detection Models with Knowledge Distillation Dec 1, 2017 Knowledge Distillation Model Compression
Learning by Sampling and Compressing: Efficient Graph Representation Learning with Extremely Limited Annotations Mar 13, 2020 Graph Embedding Graph Representation Learning
MixMix: All You Need for Data-Free Compression Are Feature and Data Mixing Nov 19, 2020 All Knowledge Distillation
Learning Interpretation with Explainable Knowledge Distillation Nov 12, 2021 Knowledge Distillation Model Compression
Learning Low-Rank Approximation for CNNs May 24, 2019 Model Compression
Learning Low-Rank Representations for Model Compression Nov 21, 2022 Clustering model
Model Compression Method for S4 with Diagonal State Space Layers using Balanced Truncation Feb 25, 2024 Model Compression
Learning to Collide: Recommendation System Model Compression with Learned Hash Functions Mar 28, 2022 Model Compression
Learning to Prune Deep Neural Networks via Reinforcement Learning Jul 9, 2020 Deep Reinforcement Learning Model Compression
Legal-Tech Open Diaries: Lessons learned on how to develop and deploy light-weight models in the era of humongous Language Models Oct 24, 2022 Knowledge Distillation Model Compression
LegoDNN: Block-grained Scaling of Deep Neural Networks for Mobile Vision Dec 18, 2021 Knowledge Distillation Model Compression
Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging Dec 6, 2022 Knowledge Distillation Model Compression