
Revealing sign language for individuals who are deaf and mute through the utilization of neural networks.

  • Additional Information
    • Abstract:
      The article focuses on developing automated systems that use neural networks to translate the sign language of deaf and mute individuals into conventional text. The researchers built models that achieved high accuracy rates using three algorithms: Random Forest (98%), Logistic Regression (99%), and Decision Tree (91%). The study highlights the importance of sign language as an essential means of communication for more than 70 million people worldwide, and aims to promote inclusivity by enabling those unfamiliar with sign language to understand its signs. The methodology involved training the models on a dataset of images, demonstrating significant potential for improving communication between deaf and hearing individuals. [Extracted from the article]
    • Abstract:
      Living without effective communication poses significant challenges for humans. Individuals employ various methods to convey thoughts and ideas between sender and receiver. Two of the most prevalent means of communication are verbal speech, which relies on auditory perception, and non-verbal communication through gestures involving bodily movements such as hand gestures and facial expressions. Sign language, categorized as a gestural language, is a unique form of communication that relies on visual perception for understanding and expression. While many individuals incorporate gestures into their communication, for deaf individuals sign language is often their primary and essential means of communication. Individuals who are deaf and mute rely on communication to interact with others, gain knowledge, and engage with their surroundings. Sign language serves as a crucial link that reduces the distance between them and the broader society. To enhance this communication, we created models able to identify sign language gestures and translate them into conventional text. By training these models on a dataset using neural networks, remarkable outcomes were attained. This technology enables individuals without prior knowledge of sign language to understand the intentions and messages of individuals with disabilities, fostering greater inclusivity and accessibility in our society. Three algorithms were used in this work, and the findings show very good outcomes: Random Forest at 98%, Logistic Regression at 99%, and Decision Tree at 91%. [ABSTRACT FROM AUTHOR]
    • Abstract:
      Copyright of Albaydaa University Journal is the property of Albaydha University Journal and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
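The abstract describes training Random Forest, Logistic Regression, and Decision Tree classifiers on a dataset of gesture images. A minimal sketch of that pipeline is shown below; since the paper's sign-language dataset is not available here, scikit-learn's bundled digits images stand in for flattened gesture images, and the reported accuracies will differ from the paper's figures.

```python
# Hedged sketch of the three-classifier comparison described in the abstract.
# Assumption: the digits dataset substitutes for the paper's gesture images.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Each row of X is one flattened 8x8 image; y holds the class labels.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

models = {
    "Random Forest": RandomForestClassifier(random_state=0),
    "Logistic Regression": LogisticRegression(max_iter=5000),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)                      # train on the image vectors
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.2%}")
```

On the stand-in data the same ordering the paper reports (ensemble and linear models outperforming a single decision tree) typically emerges, which is why comparing all three on one held-out split is a reasonable experimental design.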