Hand-Gesture Detection Using Principal Component Analysis (PCA) and Adaptive Neuro-Fuzzy Inference System (ANFIS)

Anif Hanifa Setianingrum, Arifa Fauzia, Dzul Fadli Rahman

Abstract


Sign language is a non-verbal language that Deaf people rely on to communicate with their social environment. A problem in two-way communication using sign language is misunderstanding when new terms must be taught to deaf and mute people. To minimize such misunderstandings, a system is needed that can help correct hand gestures so that new terms are not misinterpreted during teaching. Principal Component Analysis (PCA) is used for feature extraction; several optimality properties of PCA have been identified: the variance of the extracted features is maximized, the extracted features are uncorrelated, PCA finds the best linear approximation in the mean-square sense, and it maximizes the information contained in the extracted features. Classification uses the Adaptive Neuro-Fuzzy Inference System (ANFIS) method. Experiments with different image sizes show that the highest accuracy, 76.20%, was obtained with an image size of 449x449, while the lowest accuracy, 52.38%, was obtained with image sizes of 57x57 and 45x45. Image size therefore influences the accuracy of hand-gesture prediction: the smaller the input size, the lower the accuracy, as shown by the decreasing accuracy values across the four scenarios studied.
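A minimal Python/NumPy sketch of the PCA feature-extraction step described above follows. It is an illustration only, not the authors' implementation: the function name, the number of retained components, and the 449x449 image size in the example are assumptions, and the ANFIS classification stage is omitted.

import numpy as np

def pca_features(images, n_components=8):
    """Project flattened grayscale images onto the leading principal components.

    images: array of shape (n_samples, height*width), one flattened image per row.
    Returns features of shape (n_samples, n_components).
    """
    # Center the data; on centered data the principal components maximize the
    # variance of the extracted features and leave them uncorrelated.
    mean = images.mean(axis=0)
    centered = images - mean

    # Thin SVD of the centered data: the rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)

    # Keeping only the leading components gives the best linear approximation
    # of the data in the mean-square sense for the chosen dimensionality.
    return centered @ vt[:n_components].T

# Example with random stand-in data: ten 449x449 "images" reduced to 8 features.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_images = rng.random((10, 449 * 449))
    print(pca_features(fake_images).shape)  # (10, 8)

In such a pipeline, the projected features would then be passed to the ANFIS classifier for hand-gesture prediction.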


Keywords


Hand Signal; PCA; ANFIS; Simulation






DOI: https://doi.org/10.15408/jti.v15i1.24869



Copyright (c) 2022 Anif Hanifa Setianingrum, Arifa Fauzia, Dzul Fadli Rahman

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.



