SOTAVerified

Computational Efficiency

Methods and optimizations that reduce the computational resources (e.g., time, memory, or power) needed for model training and inference. This includes techniques that streamline processing, optimize algorithms, or leverage hardware to improve performance without compromising accuracy.
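As a minimal illustration of this category (not tied to any paper listed below), one classic way to cut compute cost is to trade a little memory for a lot of time via memoization. The sketch below uses Python's standard-library `functools.lru_cache` on a deliberately naive recursive function:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # cache every result: trades memory for time
def fib(n: int) -> int:
    """Naive recursion is O(2^n) calls; caching reduces it to O(n)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # -> 832040, computed with only ~31 distinct calls
```

The same idea, avoiding redundant computation by reusing intermediate results, underlies many of the training- and inference-time optimizations cataloged here (e.g., KV caching in transformer inference).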

Papers

Showing 421–430 of 4891 papers

Title | Status | Hype
Probabilistic Emulation of the Community Radiative Transfer Model Using Machine Learning | — | 0
High-performance training and inference for deep equivariant interatomic potentials | Code | 4
Observability conditions for neural state-space models with eigenvalues and their roots of unity | — | 0
Research on Cloud Platform Network Traffic Monitoring and Anomaly Detection System based on Large Language Models | — | 0
LLMs meet Federated Learning for Scalable and Secure IoT Management | — | 0
A LoRA-Based Approach to Fine-Tuning LLMs for Educational Guidance in Resource-Constrained Settings | Code | 0
SUPRA: Subspace Parameterized Attention for Neural Operator on General Domains | — | 0
Simulating biochemical reactions: The Linear Noise Approximation can capture non-linear dynamics | — | 0
Feature Selection via GANs (GANFS): Enhancing Machine Learning Models for DDoS Mitigation | — | 0
DistilQwen2.5: Industrial Practices of Training Distilled Open Lightweight Language Models | — | 0
Page 43 of 490

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ViTaL | Hamming Loss | 0.05 | — | Unverified