SOTAVerified

SHAKTI: A 2.5 Billion Parameter Small Language Model Optimized for Edge AI and Low-Resource Environments

2024-10-15

Syed Abdul Gaffar Shakhadri, Kruthika KR, Rakshit Aralimatti


Abstract

We introduce Shakti, a 2.5 billion parameter language model specifically optimized for resource-constrained environments such as edge devices, including smartphones, wearables, and IoT systems. Shakti combines high-performance NLP with optimized efficiency and precision, making it ideal for real-time AI applications where computational resources and memory are limited. With support for vernacular languages and domain-specific tasks, Shakti excels in industries such as healthcare, finance, and customer service. Benchmark evaluations demonstrate that Shakti performs competitively against larger models while maintaining low latency and on-device efficiency, positioning it as a leading solution for edge AI.

Benchmark Results

| Dataset    | Model             | Metric   | Claimed | Verified | Status     |
|------------|-------------------|----------|---------|----------|------------|
| BBH        | Shakti-LLM (2.5B) | Accuracy | 58.2    |          | Unverified |
| BoolQ      | Shakti-LLM (2.5B) | Accuracy | 61.1    |          | Unverified |
| HellaSwag  | Shakti-LLM (2.5B) | Accuracy | 52.4    |          | Unverified |
| MedQA      | Shakti-LLM (2.5B) | Accuracy | 60.3    |          | Unverified |
| MMLU       | qwen-LLM 7B       | Accuracy | 71.8    |          | Unverified |
| PIQA       | Shakti-LLM (2.5B) | Accuracy | 86.2    |          | Unverified |
| TriviaQA   | Shakti-LLM (2.5B) | EM       | 58.2    |          | Unverified |
| TruthfulQA | Shakti-LLM (2.5B) | Accuracy | 68.4    |          | Unverified |
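The table reports two metric types: Accuracy (fraction of exactly correct answers, used for the multiple-choice benchmarks) and EM (exact match after answer normalization, the standard metric for TriviaQA). As a minimal sketch of how a reproduction might score model outputs — the function names and the normalization steps below are illustrative assumptions, not code from the paper:

```python
import re
import string
from typing import List


def accuracy(predictions: List[str], references: List[str]) -> float:
    """Fraction of predictions that match the reference label verbatim."""
    assert len(predictions) == len(references)
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(predictions)


def normalize(text: str) -> str:
    """Common EM normalization: lowercase, drop punctuation and articles."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())


def exact_match(predictions: List[str], references: List[str]) -> float:
    """Fraction of predictions equal to the reference after normalization."""
    assert len(predictions) == len(references)
    correct = sum(
        normalize(p) == normalize(r) for p, r in zip(predictions, references)
    )
    return correct / len(predictions)


# Toy example: 3 of 4 answers correct.
print(accuracy(["yes", "no", "yes", "no"], ["yes", "no", "no", "no"]))  # 0.75
# Normalization lets surface variants still count as an exact match.
print(exact_match(["The Eiffel Tower."], ["eiffel tower"]))  # 1.0
```

Scores of this form, multiplied by 100, would be compared against the "Claimed" column to fill in "Verified".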

Reproductions