SOTAVerified

Active Learning

Active Learning is a supervised machine learning paradigm that aims to achieve strong performance with fewer training examples. A predictor is trained iteratively, and in each iteration the current predictor is used to select the training examples most likely to lead to better model configurations while also improving the accuracy of the prediction model.

Source: Polystore++: Accelerated Polystore System for Heterogeneous Workloads
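The iterative loop described above can be sketched as pool-based active learning with uncertainty sampling. This is a minimal illustration, not the method of any listed paper: the synthetic dataset, the logistic-regression learner, the seed-set size, and the five-round budget are all assumptions chosen for the example.

```python
# Sketch of pool-based active learning with uncertainty sampling.
# All concrete choices (data, model, budget) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic unlabeled pool: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# Small labeled seed set containing both classes.
labeled = list(range(5)) + list(range(200, 205))
unlabeled = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression()
for _ in range(5):  # five acquisition rounds
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[unlabeled])
    # Uncertainty sampling: query the pool point whose most likely
    # class has the lowest predicted probability.
    query = unlabeled[int(np.argmin(probs.max(axis=1)))]
    labeled.append(query)   # the "oracle" reveals y[query]
    unlabeled.remove(query)

accuracy = model.score(X, y)
```

Other acquisition rules (entropy, margin, query-by-committee, diversity-based methods such as Core-set) drop into the same loop by replacing the `argmin` selection line.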

Papers

Showing 1551–1575 of 3073 papers

Title | Status | Hype
Approximate Bayesian Computation with Domain Expert in the Loop | Code | 0
Competition over data: how does data purchase affect users? | | 0
TrustAL: Trustworthy Active Learning using Knowledge Distillation | | 0
Challenges and Opportunities for Machine Learning Classification of Behavior and Mental State from Images | | 0
Optimal Data Selection: An Online Distributed View | Code | 0
Cold Start Active Learning Strategies in the Context of Imbalanced Classification | | 0
Little Help Makes a Big Difference: Leveraging Active Learning to Improve Unsupervised Time Series Anomaly Detection | | 0
How Low Can We Go? Pixel Annotation for Semantic Segmentation | | 0
DebtFree: Minimizing Labeling Cost in Self-Admitted Technical Debt Identification using Semi-Supervised Learning | | 0
ADAPT: An Open-Source sUAS Payload for Real-Time Disaster Prediction and Response with AI | | 0
Keeping Deep Lithography Simulators Updated: Global-Local Shape-Based Novelty Detection and Active Learning | | 0
Analytic Mutual Information in Bayesian Neural Networks | | 0
Active Learning Polynomial Threshold Functions | | 0
HC4: A New Suite of Test Collections for Ad Hoc CLIR | Code | 0
Partition-Based Active Learning for Graph Neural Networks | Code | 0
Batch versus Sequential Active Learning for Recommender Systems | | 0
Efficient Sampling-Based Bayesian Active Learning for synaptic characterization | | 0
Optimizing Active Learning for Low Annotation Budgets | | 0
Improving the quality control of seismic data through active learning | | 0
Improving Data Augmentation in Low-resource Question Answering with Active Learning in Multiple Stages | | 0
AcTune: Uncertainty-Aware Active Self-Training for Active Fine-Tuning of Pretrained Language Models | | 0
Active Gradual Machine Learning for Entity Resolution | Code | 0
Is More Data Better? Using Transformers-Based Active Learning for Efficient and Effective Detection of Abusive Language | | 0
Cost-Effective Training in Low-Resource Neural Machine Translation | | 0
Beyond Simple Meta-Learning: Multi-Purpose Models for Multi-Domain, Active and Continual Few-Shot Learning | | 0
Page 63 of 123

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | TypiClust | Accuracy | 93.2 | — | Unverified
2 | PT4AL | Accuracy | 93.1 | — | Unverified
3 | Learning Loss | Accuracy | 91.01 | — | Unverified
4 | CoreGCN | Accuracy | 90.7 | — | Unverified
5 | Core-set | Accuracy | 89.92 | — | Unverified
6 | Random Baseline (ResNet18) | Accuracy | 88.45 | — | Unverified
7 | Random Baseline (VGG16) | Accuracy | 85.09 | — | Unverified