Publication Type : Conference Paper
Publisher : IEEE
Source : 2024 12th International Conference on Intelligent Systems and Embedded Design (ISED)
Url : https://doi.org/10.1109/ised63599.2024.10957074
Campus : Coimbatore
School : School of Artificial Intelligence
Year : 2024
Abstract : Effective communication is fundamental to human interaction, yet approximately 6.1% of the global population is deprived of this ability due to damaged vocal cords. This impairment hinders their capacity to share thoughts and emotions, isolating them from their community. Traditional methods such as hand gestures and sign language offer a solution, but the lack of universal knowledge of these signs creates further obstacles. To bridge this communication gap, a novel system is introduced that uses an Arduino Mega and flex sensors as the hardware foundation and the K-Nearest Neighbors (KNN) machine learning algorithm as the software core. This model facilitates the recognition and interpretation of hand gestures. The approach promises to dismantle the barriers faced by those with speech impairments, fostering understanding in everyday interactions.
Cite this Research Publication : Surya Ha, K S Venkatram, Vishal Seshadri B, Sanggit Saaran K C S, Malathi M, Rahul Satheesh, "Hand Gesture-Driven Speech Aid for Mute Individuals," 2024 12th International Conference on Intelligent Systems and Embedded Design (ISED), IEEE, 2024, https://doi.org/10.1109/ised63599.2024.10957074
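
The abstract describes a pipeline in which flex-sensor readings from an Arduino Mega are classified by a KNN model. The sketch below is a minimal illustration of that idea only: the gesture labels, the assumption of five flex sensors with 10-bit ADC readings, the synthetic data, and the choice of k = 5 are all illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch: classify flex-sensor bend patterns with KNN.
# Gesture labels, sensor count, value ranges, and data are assumptions,
# not details from the published system.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assume 5 flex sensors (one per finger), each giving a 10-bit ADC reading (0-1023).
# Each hypothetical gesture is modeled as noise around a template bend pattern.
gesture_templates = {
    "hello":     [200, 210, 220, 215, 205],
    "thank_you": [800, 790, 300, 310, 805],
    "water":     [300, 820, 810, 815, 290],
}

X, y = [], []
for label, template in gesture_templates.items():
    # 60 noisy samples per gesture to mimic repeated recordings of the same sign.
    samples = rng.normal(loc=template, scale=25.0, size=(60, 5))
    X.append(samples)
    y.extend([label] * 60)
X = np.vstack(X)
y = np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# KNN classifier, as named in the abstract; k=5 is an assumed hyperparameter.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("held-out accuracy:", knn.score(X_test, y_test))

# A new glove reading would be classified like this; the predicted label could
# then drive a text-to-speech stage to produce audible output.
new_reading = np.array([[205, 208, 225, 210, 200]])
print("predicted gesture:", knn.predict(new_reading)[0])
```

In such a setup the microcontroller would typically only stream sensor values, while classification and speech output run on a host device; whether the published system partitions the work this way is not stated in the abstract.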