
Mediapipe and CNNs for Real-Time ASL Gesture Recognition

2023-05-09

Rupesh Kumar, Ashutosh Bajpai, Ayush Sinha

Abstract

This paper describes a real-time system for recognizing American Sign Language (ASL) gestures using modern computer vision and machine learning techniques. The proposed method uses the MediaPipe library for feature extraction and a Convolutional Neural Network (CNN) for ASL gesture classification. Experimental results show that the system recognizes all ASL alphabet signs with an accuracy of 99.95%, indicating its potential for use in communication aids for people with hearing impairments. The approach can also be applied to other sign languages with similar hand gestures, potentially improving the quality of life for people with hearing loss. Overall, the study demonstrates the effectiveness of combining MediaPipe and a CNN for real-time sign language recognition, making a contribution to the fields of computer vision and machine learning.
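The pipeline described above pairs MediaPipe hand-landmark extraction with a CNN classifier. A common preprocessing step for such systems is converting MediaPipe's 21 (x, y, z) hand landmarks into a normalized feature vector before classification. The sketch below illustrates that step only; the function name and the exact normalization are illustrative assumptions, not the paper's stated implementation:

```python
def landmarks_to_features(landmarks):
    """Flatten 21 MediaPipe hand landmarks into a 63-value feature vector.

    Landmarks are translated so the wrist (landmark 0 in MediaPipe's hand
    model) sits at the origin, then scaled so the farthest landmark lies at
    distance 1. This makes the features invariant to hand position and size.
    (Illustrative preprocessing -- the paper's exact features may differ.)
    """
    wx, wy, wz = landmarks[0]  # wrist landmark
    rel = [(x - wx, y - wy, z - wz) for (x, y, z) in landmarks]
    scale = max((x * x + y * y + z * z) ** 0.5 for (x, y, z) in rel) or 1.0
    return [coord / scale for point in rel for coord in point]
```

A vector like this (one per video frame) would then be fed to the CNN for alphabet classification, which is what allows the system to run in real time: landmark extraction reduces each frame to 63 numbers instead of a full image.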
