Application of Artificial Intelligence to Assist People with Hearing Impairments Learn American Sign Language


ASL Application

This application aims to aid the deaf community and those around them by providing a pathway to teach ASL, teach spoken English, and translate between the two languages.

People

Undergraduate Student: MacKenzie Meier

Mentor: Dr. Nick Boltin

Introduction

Nearly 466 million individuals globally are functionally deaf (432 million adults, 34 million children) [1]. Almost 90% of deaf children are born to hearing parents who do not know sign language, and many never learn it [2]. While 80% of these children will improve hearing function through a cochlear implant, their language skills are delayed by about 50% compared to children with typical hearing, simply because they miss the foundation for language in their early years [3,4,5]. Research also shows that hearing loss can lead to cognitive decline in geriatric populations. However, when individuals with age-related hearing loss received increased sensory input, researchers observed a reversal of cognitive decline and improved brain plasticity [6,7]. This suggests that learning sign language, which increases visual sensory input, may likewise increase synaptic plasticity in the brain and help reverse cognitive decline.

It is evident that sign language can improve cognitive delay in children and potentially reverse the cognitive decline associated with age-related hearing loss in the elderly [5,7]. Yet the signing community still faces challenges: a human interpreter is often required in social settings, but one is not always available or practical. In this work, we propose a comprehensive application called ASL Buddy© that aims to address these challenges. The app will use artificial intelligence and biometrics to teach sign language to children and adults while also serving as a translation tool.

Current Technology

The idea of using artificial intelligence to teach, interpret, and translate American Sign Language (ASL) is being explored across the United States. One application already released to the public is Ace ASL, the first artificial intelligence-based app to hit the market. It scans movements during instruction and provides feedback on whether the sign was performed correctly. However, Ace ASL is limited in that it currently recognizes only the alphabet and numbers. Researchers at Michigan State University have created another application, DeepASL, which uses Leap Motion, a hand-tracking sensor, to translate ASL to English. While this is a novel idea, its usability is severely limited: ASL is much more than hand movements. ASL integrates body language and facial expressions with hand gestures. Therefore, any developed software should track the hands, face, and body.
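
To illustrate what tracking all three channels could look like, below is a minimal sketch using Google's MediaPipe Holistic model, which returns hand, face, and body landmarks from ordinary video. MediaPipe is only an illustrative assumption here; neither Ace ASL nor DeepASL is documented as using it, and this page describes ASL Buddy©'s pipeline only at the design level.

import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

def stream_landmarks(camera_index=0):
    """Yield per-frame hand, face, and body landmarks from a webcam."""
    cap = cv2.VideoCapture(camera_index)
    # The Holistic model tracks pose, face, and both hands in one pass.
    with mp_holistic.Holistic(min_detection_confidence=0.5,
                              min_tracking_confidence=0.5) as holistic:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            yield {
                "left_hand": results.left_hand_landmarks,
                "right_hand": results.right_hand_landmarks,
                "face": results.face_landmarks,
                "body": results.pose_landmarks,
            }
    cap.release()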

How ASL Buddy© Will Overcome These Limitations

ASL Buddy© is designed to overcome these challenges by capturing biometrics of the face, body, and hands to ensure translation accuracy and become the #1 AI-driven software assistant for the deaf community. It will also be more comprehensive, offering both teaching and translation in American Sign Language (ASL) and English.

Application Development

The development of ASL Buddy© will proceed in four main phases: 1) performing a thorough literature review to stay informed about the deaf community, 2) developing a prototype using wireframe software, 3) running a series of human-computer interaction tests to gather feedback, and 4) developing a software prototype.

Integrated App Design

Below is the flowchart defining the integrated app design of ASL Buddy©. It shows the generalized steps from first opening the application, through navigation of the app and its lessons, to the point where artificial intelligence input is captured, and finally to the AI logic and feedback.
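
As a concrete reading of that flow, the sketch below shows how a lesson screen might hand captured landmarks to the AI logic and turn its prediction into feedback. The classifier and its predict method are hypothetical placeholders; this page does not specify ASL Buddy©'s model, so this sketches only the control flow.

from dataclasses import dataclass

@dataclass
class Feedback:
    correct: bool
    message: str

def run_lesson_attempt(target_sign, classifier, landmark_frames):
    """Compare the learner's attempt against the lesson's target sign.

    `classifier` is assumed to expose predict(frames) -> (label, confidence);
    `landmark_frames` is the hand/face/body sequence captured during the attempt.
    """
    predicted, confidence = classifier.predict(landmark_frames)
    if predicted == target_sign and confidence >= 0.8:
        return Feedback(True, f"Nice work, '{target_sign}' was signed correctly.")
    return Feedback(False,
                    f"That looked like '{predicted}' ({confidence:.0%} confidence). "
                    f"Try '{target_sign}' again.")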

Wireframe Of Application

Below is the wireframe of the application, showing the welcome pages, extension bar navigation, settings, profile, and the app contents. For a better view, visit this link: https://emailsc-my.sharepoint.com/:b:/g/personal/meierml_email_sc_edu/EXX22lduaKdElkQaUH6nwQ4B5M-NimKBYDjuitU6nz2gBQ?e=noRcUW

Human-Computer Interaction Testing

In the project's third phase, we will begin human factors testing of the wireframe prototype using task-specific personas. The personas cover a variety of ages, backgrounds, and needs for this application. Each persona will be assigned to a willing study participant, who will navigate the application while completing the tasks described in the persona. The participant will then rate the usability of the app and provide feedback on what could be improved. In addition, a panel of professionals in human-computer interaction, speech pathology, and ASL instruction will review the application. Feedback from the HCI professionals and the testing participants will be analyzed, and the necessary edits will be made to the prototype.
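
For example, if the usability ratings are collected with the standard 10-item System Usability Scale (an assumption; this page does not name a questionnaire), each participant's score could be computed as in the sketch below.

def sus_score(responses):
    """Convert ten 1-5 Likert responses into a 0-100 SUS score.

    Odd-numbered items (indices 0, 2, ...) are positively worded and
    contribute (response - 1); even-numbered items contribute
    (5 - response). The summed contributions are scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("The SUS requires exactly 10 responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Example: one fairly positive participant.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 2]))  # -> 82.5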

References

1. World Health Organization. (n.d.). Deafness and hearing loss. World Health Organization. Retrieved May 25, 2022, from https://www.who.int/news-room/fact-sheets/detail/deafness-and-hearing-loss

2. Dougherty, E. (2017, March 6). Studying language acquisition in deaf children. The Brink, Boston University. Retrieved May 13, 2022, from https://www.bu.edu/articles/2017/asl-language-acquisition/

3. Humphries, T., Kushalnagar, P., Mathur, G., Napoli, D. J., Padden, C., Rathmann, C., & Smith, S. R. (2012). Language acquisition for deaf children: Reducing the harms of zero tolerance to the use of alternative approaches. Harm Reduction Journal, 9(1). https://doi.org/10.1186/1477-7517-9-16

4. Ostojić, S., Djoković, S., Dimić, N., & Mikić, B. (2011). Cochlear implant: Speech and language development in deaf and hard of hearing children following implantation. Vojnosanitetski Pregled, 68(4), 349-352. https://doi.org/10.2298/vsp1104349o

5. Schiff, N. B., & Ventry, I. M. (1976). Communication problems in hearing children of deaf parents. Journal of Speech and Hearing Disorders, 41(3), 348–358. https://doi.org/10.1044/jshd.4103.348

6. Manahan-Vaughan, D., Shchyglo, O., Feldmann, M., & Beckmann, D. (2020). Hippocampal synaptic plasticity, spatial memory, and neurotransmitter receptor expression are profoundly altered by gradual loss of hearing ability. Cerebral Cortex. https://doi.org/10.1093/cercor/bhaa061

7. Hewitt, D. (2017). Age-related hearing loss and cognitive decline: You haven't heard the half of it. Frontiers in Aging Neuroscience, 9. https://doi.org/10.3389/fnagi.2017.00112