Wired Perspectives: Multi-View Wire Art Embraces Generative AI | Nov 26, 2023 | Knowledge Distillation | Unverified (0)
Unlearning via Sparse Representations | Nov 26, 2023 | Knowledge Distillation | Unverified (0)
Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification | Nov 26, 2023 | Knowledge Distillation, Self-Knowledge Distillation | Unverified (0)
Cosine Similarity Knowledge Distillation for Individual Class Information Transfer | Nov 24, 2023 | Knowledge Distillation, Model Compression | Unverified (0)
Maximizing Discrimination Capability of Knowledge Distillation with Energy Function | Nov 24, 2023 | Data Augmentation, Knowledge Distillation | Unverified (0)
Efficient Open-world Reinforcement Learning via Knowledge Distillation and Autonomous Rule Discovery | Nov 24, 2023 | Deep Reinforcement Learning, Knowledge Distillation | Unverified (0)
Pseudo-label Correction for Instance-dependent Noise Using Teacher-student Framework | Nov 24, 2023 | Knowledge Distillation, Pseudo Label | Unverified (0)
Knowledge Distillation Based Semantic Communications For Multiple Users | Nov 23, 2023 | Decoder, Knowledge Distillation | Unverified (0)
Efficient and Robust Jet Tagging at the LHC with Knowledge Distillation | Nov 23, 2023 | Inductive Bias, Jet Tagging | Code Available (0)
Some Like It Small: Czech Semantic Embedding Models for Industry Applications | Nov 23, 2023 | Image Retrieval, Knowledge Distillation | Code Available (1)
Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation | Nov 23, 2023 | Dimensionality Reduction, Image Classification | Unverified (0)
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning | Nov 23, 2023 | Data Augmentation, Knowledge Distillation | Unverified (0)
Education distillation: getting student models to learn in schools | Nov 23, 2023 | Incremental Learning, Knowledge Distillation | Unverified (0)
Efficient Transformer Knowledge Distillation: A Performance Review | Nov 22, 2023 | Knowledge Distillation, Model Compression | Unverified (0)
EA-KD: Entropy-based Adaptive Knowledge Distillation | Nov 22, 2023 | Image Classification | Unverified (0)
Point, Segment and Count: A Generalized Framework for Object Counting | Nov 21, 2023 | Knowledge Distillation, Object | Code Available (1)
HoVer-UNet: Accelerating HoVerNet with UNet-based multi-class nuclei segmentation via knowledge distillation | Nov 21, 2023 | Instance Segmentation, Knowledge Distillation | Code Available (1)
FreeKD: Knowledge Distillation via Semantic Frequency Prompt | Nov 20, 2023 | Knowledge Distillation | Code Available (1)
Unveiling the Unseen Potential of Graph Learning through MLPs: Effective Graph Learners Using Propagation-Embracing MLPs | Nov 20, 2023 | Graph Learning, Graph Neural Network | Unverified (0)
Expanding Scene Graph Boundaries: Fully Open-vocabulary Scene Graph Generation via Visual-Concept Alignment and Retention | Nov 18, 2023 | Concept Alignment, Graph Generation | Code Available (1)
LightBTSeg: A lightweight breast tumor segmentation model using ultrasound images via dual-path joint knowledge distillation | Nov 18, 2023 | Knowledge Distillation, Lesion Detection | Unverified (0)
Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers | Nov 17, 2023 | Knowledge Distillation | Unverified (0)
Semi-supervised ViT knowledge distillation network with style transfer normalization for colorectal liver metastases survival prediction | Nov 17, 2023 | Generative Adversarial Network, Knowledge Distillation | Unverified (0)
A Knowledge Distillation Approach for Sepsis Outcome Prediction from Multivariate Clinical Time Series | Nov 16, 2023 | Knowledge Distillation, Time Series | Unverified (0)
Multistage Collaborative Knowledge Distillation from a Large Language Model for Semi-Supervised Sequence Generation | Nov 15, 2023 | Constituency Parsing, Knowledge Distillation | Code Available (0)
Unlock the Power: Competitive Distillation for Multi-Modal Large Language Models | Nov 14, 2023 | Knowledge Distillation, Transfer Learning | Unverified (0)
Distilling the Unknown to Unveil Certainty | Nov 14, 2023 | Knowledge Distillation, Out of Distribution (OOD) Detection | Code Available (0)
Batch Selection and Communication for Active Learning with Edge Labeling | Nov 14, 2023 | Active Learning, Knowledge Distillation | Unverified (0)
Teach me with a Whisper: Enhancing Large Language Models for Analyzing Spoken Transcripts using Speech Embeddings | Nov 13, 2023 | Knowledge Distillation, Language Modeling | Unverified (0)
On Elastic Language Models | Nov 13, 2023 | Information Retrieval, Knowledge Distillation | Unverified (0)
Quantized Distillation: Optimizing Driver Activity Recognition Models for Resource-Constrained Environments | Nov 10, 2023 | Activity Recognition, Autonomous Driving | Code Available (1)
DONUT-hole: DONUT Sparsification by Harnessing Knowledge and Optimizing Learning Efficiency | Nov 9, 2023 | Document Understanding, Key Information Extraction | Unverified (0)
Text Representation Distillation via Information Bottleneck Principle | Nov 9, 2023 | Knowledge Distillation, Retrieval | Code Available (0)
Object-centric Cross-modal Feature Distillation for Event-based Object Detection | Nov 9, 2023 | Knowledge Distillation, Object | Unverified (0)
Bridging Dimensions: Confident Reachability for High-Dimensional Controllers | Nov 8, 2023 | Knowledge Distillation, OpenAI Gym | Code Available (0)
Preference-Consistent Knowledge Distillation for Recommender System | Nov 8, 2023 | Knowledge Distillation, Recommendation Systems | Code Available (0)
What is Lost in Knowledge Distillation? | Nov 7, 2023 | Knowledge Distillation, Model Compression | Unverified (0)
Supervised domain adaptation for building extraction from off-nadir aerial images | Nov 7, 2023 | Domain Adaptation, Earth Observation | Unverified (0)
Data exploitation: multi-task learning of object detection and semantic segmentation on partially annotated data | Nov 7, 2023 | Knowledge Distillation, Multi-Task Learning | Code Available (0)
Reducing Spatial Fitting Error in Distillation of Denoising Diffusion Models | Nov 7, 2023 | Attribute, Denoising | Code Available (0)
Asymmetric Masked Distillation for Pre-Training Small Foundation Models | Nov 6, 2023 | Action Classification, Action Recognition | Code Available (0)
Co-training and Co-distillation for Quality Improvement and Compression of Language Models | Nov 6, 2023 | Data Augmentation, Knowledge Distillation | Unverified (0)
Cross-Level Distillation and Feature Denoising for Cross-Domain Few-Shot Classification | Nov 4, 2023 | Classification, Cross-Domain Few-Shot | Code Available (1)
After-Stroke Arm Paresis Detection using Kinematic Data | Nov 3, 2023 | Action Classification, Knowledge Distillation | Unverified (0)
Comparative Knowledge Distillation | Nov 3, 2023 | Data Augmentation, Knowledge Distillation | Code Available (0)
Data-Free Distillation of Language Model by Text-to-Text Transfer | Nov 3, 2023 | Data-free Knowledge Distillation, Diversity | Unverified (0)
Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models | Nov 2, 2023 | Data Augmentation, Domain Generalization | Code Available (1)
An Efficient Detection and Control System for Underwater Docking using Machine Learning and Realistic Simulation: A Comprehensive Approach | Nov 2, 2023 | Generative Adversarial Network, Image-to-Image Translation | Unverified (0)
Multilingual DistilWhisper: Efficient Distillation of Multi-task Speech Models via Language-Specific Experts | Nov 2, 2023 | Automatic Speech Recognition (ASR) | Code Available (1)
Implicit Chain of Thought Reasoning via Knowledge Distillation | Nov 2, 2023 | Knowledge Distillation, Math | Code Available (1)