Research Article

Towards Adaptive Real-Time Hand Sign Language Interfaces Addressing User Independence, Pose and Occlusion Variability with Machine Learning

by Leena Chandrashekar, Sanjay S.B., Raghavendra M. Hegde, Samarth Shinnur, Samrudh B.R.
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Issue 66
Published: December 2025
DOI: 10.5120/ijca2025926116

Leena Chandrashekar, Sanjay S.B., Raghavendra M. Hegde, Samarth Shinnur, Samrudh B.R. Towards Adaptive Real-Time Hand Sign Language Interfaces Addressing User Independence, Pose and Occlusion Variability with Machine Learning. International Journal of Computer Applications. 187, 66 (December 2025), 41-51. DOI=10.5120/ijca2025926116

                        @article{ 10.5120/ijca2025926116,
                        author  = { Leena Chandrashekar and Sanjay S.B. and Raghavendra M. Hegde and Samarth Shinnur and Samrudh B.R. },
                        title   = { Towards Adaptive Real-Time Hand Sign Language Interfaces Addressing User Independence, Pose and Occlusion Variability with Machine Learning },
                        journal = { International Journal of Computer Applications },
                        year    = { 2025 },
                        volume  = { 187 },
                        number  = { 66 },
                        pages   = { 41-51 },
                        doi     = { 10.5120/ijca2025926116 },
                        publisher = { Foundation of Computer Science (FCS), NY, USA }
                        }
                        %0 Journal Article
                        %D 2025
                        %A Leena Chandrashekar
                        %A Sanjay S.B.
                        %A Raghavendra M. Hegde
                        %A Samarth Shinnur
                        %A Samrudh B.R.
                        %T Towards Adaptive Real-Time Hand Sign Language Interfaces Addressing User Independence, Pose and Occlusion Variability with Machine Learning
                        %J International Journal of Computer Applications
                        %V 187
                        %N 66
                        %P 41-51
                        %R 10.5120/ijca2025926116
                        %I Foundation of Computer Science (FCS), NY, USA
Abstract

Human-Computer Interaction (HCI) has become a critical component of navigating and connecting with the digital world as technology has advanced. Hand gesture recognition has received substantial interest as a natural and intuitive communication interface. This paper describes the design and implementation of a real-time hand gesture recognition system for HCI, with a special focus on assisting people with hearing and speech impairments. The proposed system captures hand gestures with a real-time video camera and builds a bespoke dataset made robust by accounting for user, pose, and occlusion variability. A Convolutional Neural Network (CNN) is used for feature extraction, with 21 key features identified for each hand gesture. These features are then classified with a Random Forest method, which achieves an overall accuracy of 94.58% across several instances. Recognized gestures are translated into text and speech, enabling efficient and convenient communication. The system also allows gestures to be combined into complete sentences commonly used in everyday interactions. Performance assessment under different lighting conditions yields a PSNR of 3 to 4.27 dB, suggesting robustness to illumination fluctuations. A graphical user interface (GUI) with a feedback system supports seamless two-way interaction, improving usability and accessibility.
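The classification stage described in the abstract (21 features per hand gesture fed to a Random Forest) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark-extraction step is assumed and replaced with synthetic 21-point hand-landmark vectors, and all class counts and parameters are hypothetical.

```python
# Sketch: classify hand-gesture feature vectors with a Random Forest.
# Each sample stands in for 21 hand landmarks x 3 coordinates = 63 features;
# real extraction (e.g. from a CNN or hand-pose estimator) is assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_gestures, samples_per_gesture = 5, 40  # hypothetical gesture classes

# Synthetic landmark features: each gesture class clusters around its own mean.
X = np.vstack([
    rng.normal(loc=g, scale=0.1, size=(samples_per_gesture, 21 * 3))
    for g in range(n_gestures)
])
y = np.repeat(np.arange(n_gestures), samples_per_gesture)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

On real landmark data the accuracy would depend on user, pose, and occlusion variability; the synthetic clusters here are deliberately well separated only to make the pipeline runnable end to end.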

Index Terms
Computer Science
Information Sciences
Keywords

Human-Computer Interaction, Sign Language, Hand Gesture Recognition, User and Pose Independence, Occlusion Variability
