LangAI: Real-Time Sign Language Recognition and Translation using Convolutional Neural Networks and Image Processing Algorithms
Abstract – People with hearing disabilities face severe communication barriers and are often unable to share information, ideas, and emotions. Currently, the most effective way to bridge this barrier is hand-gesture-based sign language, but signing requires interpreters to translate gestures into spoken words, and when an interpreter is unavailable, people with hearing disabilities remain unheard by the rest of the population. This research develops LangAI, a smart mobile application that efficiently and accurately translates American Sign Language into the English alphabet in real time, reducing the dependence of people with hearing disabilities on sign language interpreters. The application uses a fine-tuned VGG-16 CNN model, selected through a comparative analysis of five CNN architectures in which it achieved the highest validation accuracy (96.14%) and F1 score (0.975). LangAI's supplemental features include a learning tab that connects users to open-source sign language learning resources and a mindfulness tab where users can learn facts about the history of sign language and other communication methods. LangAI can significantly aid people with hearing disabilities around the world: it addresses the shortcomings of current solutions and research, creating direct positive human impact with the potential to improve the lives of countless people who are often marginalized because of their disability.
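The abstract reports both validation accuracy and an F1 score for the selected VGG-16 model. As a quick reference, the F1 score is the harmonic mean of precision and recall; the sketch below uses hypothetical confusion counts (the paper's actual per-class counts are not given in the abstract):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall,
    computed from true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: 95 true positives, 5 false positives, 3 false negatives
score = f1_score(95, 5, 3)
```

A high F1 alongside high accuracy indicates the model is not simply favoring majority classes, which matters for a balanced alphabet-recognition task like this one.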