Application of Artificial Intelligence to Assist People with Hearing Impairments Learn American Sign Language

ASL Application

This application aims to aid the deaf community and those around them by teaching ASL, teaching spoken English, and translating between the two languages.

People

Undergraduate Student: MacKenzie Meier

Mentor: Dr. Nick Boltin

Introduction

Nearly 466 million individuals globally are functionally deaf (432 million adults, 34 million children) [1]. Almost 90% of deaf children are born to hearing parents who do not know sign language, and many never learn it [2]. While roughly 80% of these children improve their hearing function through a cochlear implant, their language skills lag those of normal-hearing children by about 50%, largely because they miss the foundation for language in their early years [3,4,5]. Research also shows that hearing loss can lead to cognitive decline in geriatric populations. However, when individuals with age-related hearing loss received increased sensory input, researchers observed a reversal of cognitive decline and improved brain plasticity [6,7]. It is plausible that learning sign language, by increasing visual sensory input, could likewise increase synaptic plasticity in the brain and help reverse cognitive decline.

It is evident that sign language can mitigate cognitive delay in children and potentially reverse the cognitive decline associated with age-related hearing loss in the elderly [5,7]. Yet the signing community still faces challenges. A human interpreter is often needed to accompany signers in social settings, but one is not always feasible or available. In this work, we propose a comprehensive application called ASL Buddy that takes on these challenges. The app will use artificial intelligence and biometrics to teach sign language to children and adults while also serving as a translation tool.

Current Technology

The idea of using artificial intelligence to teach, interpret, and translate American Sign Language (ASL) is being explored across the United States. One application already released to the public is Ace ASL, the first AI-based ASL app on the market. It scans the learner's movements during instruction and provides feedback on whether a sign was performed correctly. However, Ace ASL is limited in that it currently recognizes only the alphabet and numbers. Researchers at Michigan State University have created another application, DeepASL, which uses the Leap Motion hand-tracking sensor to translate ASL to English. While this is a novel idea, its usability is severely limited: ASL is much more than hand movements. It integrates body language and facial expressions with hand gestures, so any developed software should track the hands, face, and body. ASL Buddy is designed to overcome these challenges by capturing biometrics of the face, body, and hands to ensure translation accuracy and become the #1 AI-driven software assistant for the deaf community.
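
To illustrate one way such whole-body tracking could work, the minimal sketch below captures face, body, and hand landmarks from a webcam using the off-the-shelf MediaPipe Holistic tracker. This is a hypothetical example of the general approach, not ASL Buddy's actual pipeline; the choice of library, confidence thresholds, and downstream model are assumptions.

```python
# Minimal sketch (assumed tooling): extract face, body, and hand landmarks
# per video frame with MediaPipe Holistic. Not the project's real pipeline.
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

cap = cv2.VideoCapture(0)  # default webcam
with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        # Each attribute holds normalized (x, y, z) landmarks,
        # or None if that body part was not detected in the frame.
        face = results.face_landmarks         # facial expression cues
        pose = results.pose_landmarks         # body posture
        left = results.left_hand_landmarks    # hand shape / gesture
        right = results.right_hand_landmarks

        if face and pose and (left or right):
            print("captured face, body, and hand landmarks for this frame")

cap.release()
```

A downstream sign-recognition model would then consume these per-frame landmark sequences, for example by flattening them into feature vectors and feeding them to a sequence model trained on labeled signs.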