ANALYSIS AND DATA PROCESSING SYSTEMS


Recognition of Russian and Indian sign languages used by the deaf people

Issue No 2-3 (79) April - September 2020
Authors:

Elakkiya R.,
Grif Mikhail G.,
Prikhodko Alexey L.,
Bakaev Maxim A.
DOI: http://dx.doi.org/10.17212/1814-1196-2020-2-3-57-76
Abstract

In our paper, we consider approaches to the recognition of the sign languages used by deaf people in Russia and India. A structure of the recognition system for individual gestures is proposed based on the identification of their five components: configuration, orientation, localization, movement, and non-manual markers. We review the methods applied to the recognition of both individual gestures and continuous Indian and Russian sign languages; in particular, we consider the problem of building sign language corpora as well as sets of training data (datasets). We note the similarity of certain individual gestures in the Russian and Indian sign languages and specify the structure of a local dataset for static gestures of the Russian sign language. For this dataset, 927 video files with static one-handed gestures were collected and converted to JSON using the OpenPose library. After analyzing the 21 points of the skeletal model of the right hand, the reliability obtained for the chosen points was 0.61, which was found to be insufficient. We note that the recognition of individual gestures, and of sign speech in general, is complicated by the need for accurate tracking of the various components of the gestures, which are performed quite quickly and are often obscured by overlapping hands and faces. To address this problem, we further propose an approach based on the development of a biologically inspired neural network that processes visual information similarly to the human visual cortex: identification of lines, construction of edges, detection of movements, identification of geometric shapes, and determination of the direction and speed of object movement. We are currently testing the biologically inspired neural network proposed by A.V. Kugaevskikh on video files from the Russian sign language dataset.
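As a minimal sketch of the dataset preparation step described above, the Python snippet below parses per-frame OpenPose JSON output, extracts the 21 right-hand keypoints with their detection confidences, and averages the confidences as a rough analogue of the reliability estimate mentioned in the abstract. The keys "people" and "hand_right_keypoints_2d" follow OpenPose's documented output format; the folder layout and file names are illustrative assumptions, not the authors' actual pipeline.

```python
import json
from pathlib import Path

# OpenPose writes one JSON file per frame; each entry of "people" contains
# "hand_right_keypoints_2d": a flat list of 21 * (x, y, confidence) values.
NUM_HAND_POINTS = 21


def right_hand_keypoints(json_path: Path):
    """Return 21 (x, y, confidence) triplets for the right hand,
    or None if no person / no hand was detected in the frame."""
    with json_path.open(encoding="utf-8") as f:
        frame = json.load(f)
    people = frame.get("people", [])
    if not people:
        return None
    flat = people[0].get("hand_right_keypoints_2d", [])
    if len(flat) < 3 * NUM_HAND_POINTS:
        return None
    return [tuple(flat[i:i + 3]) for i in range(0, 3 * NUM_HAND_POINTS, 3)]


def mean_confidence(frames_dir: Path) -> float:
    """Average OpenPose confidence over all right-hand keypoints found in a
    folder of per-frame JSON files (a rough analogue of the reliability
    estimate discussed in the abstract)."""
    scores = []
    for path in sorted(frames_dir.glob("*_keypoints.json")):  # OpenPose default naming
        points = right_hand_keypoints(path)
        if points is not None:
            scores.extend(conf for _, _, conf in points)
    return sum(scores) / len(scores) if scores else 0.0


if __name__ == "__main__":
    # Hypothetical layout: one folder of OpenPose JSON output per gesture video.
    folder = Path("openpose_out/gesture_001")
    if folder.is_dir():
        print(f"mean right-hand keypoint confidence: {mean_confidence(folder):.2f}")
    else:
        print(f"no OpenPose output found at {folder}")
```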


Keywords: Russian sign language, Indian sign language, gesture recognition, deaf sign components, artificial neural network, machine learning, training data sets

References

1. Indian Sign Language Research and Training Centre (ISLRTC). History. – URL: http://www.islrtc.nic.in/history-0 (accessed: 13.10.2020).



2. A multilingual multimedia Indian sign language dictionary tool / T. Dasgupta, S. Shukla, S. Kumar, S. Diwakar, A. Basu // The 6th Workshop on Asian Language Resources (ALR 6): Proceedings of the Workshop. – Hyderabad, India, 2008. – P. 57–64.



3. ISL dictionary launch / Indian Sign Language Research and Training Centre. – URL: http://www.islrtc.nic.in/isl-dictionary-launch (accessed: 13.10.2020).



4. Tavari N.V., Deorankar A.V., Chatur P.N. Hand gesture recognition of Indian sign language to aid physically impaired people // International Journal of Engineering Research and Applications. – 2014. – Spec. iss. ICIAC, vol. 5. – P. 60–66.



5. Vasishta M., Woodward J., Santis S. de. An introduction to Indian sign language: (Focus on Delhi). – New Delhi, India: All India Federation of the Deaf, 1980. – 176 p.



6. Indian sign language dictionary. – URL: http://indiansignlanguage.org/dictionary/ (accessed: 13.10.2020).



7. Korol'kova O.O. Determining the size of the "Complete dictionary of the Russian sign language" // Sovremennye issledovaniya sotsial'nykh problem. – 2014. – No. 3 (19). – P. 69–74. (In Russian).



8. Video dictionary of the Russian sign language // Institute of Social Rehabilitation of NSTU: website. – Novosibirsk, 2011. – URL: http://www.nisor.ru/snews/oa-/ (accessed: 13.10.2020). (In Russian).



9. Geil'man I.F. Specific means of communication of the deaf: dactylology and mimicry: in 4 parts. – Leningrad: VOG, 1975–1979. (In Russian).



10. Dictionary of the Russian sign language / V.Z. Bazoev et al. – Moscow: Flinta, 2009. – 525 p. (In Russian).



11. Fradkina R.N. Talking hands: a thematic dictionary of the sign language of the deaf of Russia. – Moscow: MosgorVOG, 2001. – 598 p. (In Russian).



12. Korol'kova O.O. Features of homonymy and polysemy in the Russian sign language (based on the video dictionary of the Russian sign language) // V mire nauchnykh otkrytii. – 2013. – No. 5-1 (41). – P. 169–184. (In Russian).



13. Korol'kova O.O. Features of Russian sign language gestures whose names are homonyms of the Russian language // V mire nauchnykh otkrytii. – 2015. – No. 7-8 (67). – P. 2931–2942. (In Russian).



14. Tripathi K., Baranwal N., Nandi G.C. Continuous dynamic Indian Sign Language gesture recognition with invariant backgrounds // 2015 International Conference on Advances in Computing, Communications and Informatics (ICACCI). – Kochi, India, 2015. – P. 2211–2216.



15. Rekha J., Bhattacharya J., Majumder S. Shape, texture and local movement hand gesture features for Indian sign language recognition // 3rd International Conference on Trendz in Information Sciences & Computing (TISC 2011). – Chennai, India, 2011. – P. 30–35.



16. Lilha H., Shivmurthy D. Evaluation of features for automated transcription of dual-handed sign language alphabets // 2011 International Conference on Image Information Processing. – Shimla, India, 2011. – P. 1–5.



17. Adithya V., Vinod P.R., Gopalakrishnan U. Artificial neural network based method for Indian sign language recognition // 2013 IEEE Conference on Information & Communication Technologies. – Thuckalay, Tamil Nadu, India, 2013. – P. 1080–1085.



18. Dixit K., Jalal A.S. Automatic Indian sign language recognition system // 2013 3rd IEEE International Advance Computing Conference (IACC). – Ghaziabad, India, 2013. – P. 883–887.



19. Sahoo A.K., Ravulakollu K.K. Vision based Indian sign language character recognition // Journal of Theoretical & Applied Information Technology. – 2014. – Vol. 67, iss. 3.



20. Indian Sign Language gesture classification as single or double handed gestures / A. Singh, S. Arora, P. Shukla, A. Mittal // 2015 Third International Conference on Image Information Processing (ICIIP). – Waknaghat, India, 2015. – P. 378–381.



21. Gangrade J., Bharti J., Mulye A. Recognition of Indian Sign Language using ORB with bag of visual words by Kinect Sensor // IETE Journal of Research. – 2020. – 15 March. – P. 1–5. – DOI: 10.1080/03772063.2020.1739569.



22. Bhuyan M.K., Ghosh D., Bora P.K. Continuous hand gesture segmentation and co-articulation detection // Computer vision, graphics and image processing: 5th Indian conference, ICVGIP 2006, Madurai, India, December 13–16, 2006: proceedings. – Berlin; New York: Springer, 2006. – P. 564–575.



23. Li H., Greenspan M. Segmentation and recognition of continuous gestures // 2007 IEEE International Conference on Image Processing. – 2007. – Vol. 1. – P. I-365–I-368.



24. Bhuyan M.K., Bora P.K., Ghosh D. Trajectory guided recognition of hand gestures having only global motions // World Academy of Science, Engineering and Technology. – 2008. – Vol. 2, N 9. – P. 2012–2023.



25. Kishore P.V., Kumar P.R. Segment, track, extract, recognize and convert sign language videos to voice/text // International Journal of Advanced Computer Science and Applications. – 2012. – Vol. 3, N 6. – P. 35–47.



26. Nanivadekar P.A., Kulkarni V. Indian sign language recognition: database creation, hand tracking and segmentation // 2014 International Conference on Circuits, Systems, Communication and Information Technology Applications (CSCITA). – Mumbai, India, 2014. – P. 358–363.



27. 4-Camera model for sign language recognition using elliptical Fourier descriptors and ANN / P.V. Kishore, M.V. Prasad, C.R. Prasad, R. Rahul // 2015 International Conference on Signal Processing and Communication Engineering Systems. – Guntur, India, 2015. – P. 34–38.



28. Athira P.K., Sruthi C.J., Lijiya A. A signer independent sign language recognition with co-articulation elimination from live videos: an Indian scenario // Journal of King Saud University – Computer and Information Sciences. – 2019. – DOI: 10.1016/j.jksuci.2019.05.002.



29. Indian sign language recognition system using new fusion based edge operator / M.V. Prasad, P.V. Kishore, E.K. Kumar, D.A. Kumar // Journal of Theoretical & Applied Information Technology. – 2016. – Vol. 88 (3). – P. 574–584.



30. Bhuyan M.K., Ghosh D., Bora P.K. A framework of hand gesture recognition with applications to sign language // 2006 Annual IEEE India Conference. – New Delhi, India, 2006. – P. 1–6.



31. Agrawal S.C., Jalal A.S., Bhatnagar C. Recognition of Indian Sign Language using feature fusion // 2012 4th International Conference on Intelligent Human Computer Interaction (IHCI). – Kharagpur, India, 2012. – P. 1–5.



32. Joshi G., Vig R., Singh S. Analysis of Zernike moment-based features for sign language recognition // Intelligent Communication, Control and Devices. – Singapore: Springer, 2018. – P. 1335–1343.



33. S3DRGF: spatial 3-D relational geometric features for 3-D sign language representation and recognition / D.A. Kumar, A.S. Sastry, P.V. Kishore, E.K. Kumar, M.T. Kumar // IEEE Signal Processing Letters. – 2019. – Vol. 26 (1). – P. 169–173.



34. Kaur B., Joshi G., Vig R. Identification of ISL alphabets using discrete orthogonal moments // Wireless Personal Communications. – 2017. – Vol. 95 (4). – P. 4823–4845.



35. Raheja J.L., Mishra A., Chaudhary A. Indian Sign Language recognition using SVM // Pattern Recognition and Image Analysis. – 2016. – Vol. 26 (2). – P. 434–441.



36. A multimodal framework for sensor based sign language recognition / P. Kumar, H. Gauba, P.P. Roy, D.P. Dogra // Neurocomputing. – 2017. – Vol. 259. – P. 21–38.



37. Joshi G., Vig R., Singh S. DCA-based unimodal feature-level fusion of orthogonal moments for Indian sign language dataset // IET Computer Vision. – 2018. – Vol. 12 (5). – P. 570–577.



38. A depth-based Indian Sign Language recognition using Microsoft Kinect / T. Raghuveera, R. Deepthi, R. Mangalashri, R. Akshaya // Sadhana. – 2020. – Vol. 45, N 1. – P. 34.



39. Grif M.G., Prihodko A.L. Approach to the Sign language gesture recognition framework based on HamNoSys analysis // Actual Problems of Electronic Instrument Engineering (APEIE-2018): proceedings. – Novosibirsk, 2018. – Vol. 1, pt. 4. – P. 426–429. – DOI: 10.1109/APEIE.2018.8545086.



40. Grif M.G., Lukoyanychev A.V. Gesture localization in the test mode in the integral system of sign language training // Journal of Physics: Conference Series. – 2019. – Vol. 1333. – P. 032023.



41. Börstell C. Differential object marking in sign languages // Glossa: a Journal of General Linguistics. – 2019. – Vol. 4 (1).



42. Polinsky M. Sign languages in the context of heritage language: a new direction in language research // Sign Language Studies. – 2018. – Vol. 18 (3). – P. 412–428.



43. Ryumin D., Karpov A.A. Towards automatic recognition of sign language gestures using Kinect 2.0 // International Conference on Universal Access in Human-Computer Interaction. – Cham: Springer, 2017. – P. 89–101.



44. Sign language numeral gestures recognition using convolutional neural network / I. Gruber, D. Ryumin, M. Hrúz, A. Karpov // Interactive Collaborative Robotics. – Cham: Springer, 2018. – P. 70–77.



45. Rozaliev V.L. Automation of human hand recognition using Kinect for sign language translation // Izvestiya Volgogradskogo gosudarstvennogo tekhnicheskogo universiteta. – 2015. – No. 6 (163). – P. 74–78. (In Russian).



46. Recognition of fingerspelling gestures of the Russian language of the deaf / N.S. Dorofeev, V.L. Rozaliev, Yu.A. Orlova, A.N. Soloshenko // Izvestiya Volgogradskogo gosudarstvennogo tekhnicheskogo universiteta. – 2013. – No. 14 (117). – P. 42–45. (In Russian).



47. Konstantinov V.M., Orlova Yu.A., Rozaliev V.L. Development of a 3D model of the human body using MS Kinect // Izvestiya Volgogradskogo gosudarstvennogo tekhnicheskogo universiteta. – 2015. – No. 6 (163). – P. 65–69. (In Russian).



48. Klimov A.S., Rozaliev V.L., Orlova Yu.A. Automation of building a volumetric model of the human head // Izvestiya Volgogradskogo gosudarstvennogo tekhnicheskogo universiteta. – 2014. – No. 25 (152). – P. 67–71. (In Russian).



49. Fan N.Kh., Spitsyn V.G. Real-time hand shape recognition in a video sequence based on SURF descriptors and a neural network // Elektromagnitnye volny i elektronnye sistemy. – 2012. – Vol. 17, No. 7. – P. 31–39. (In Russian).



50. IIITA-ROBITA Indian Sign Language Gesture Database. – URL: https://robita.iiita.ac.in/dataset.php (accessed: 14.10.2020).



51. Ansari Z.A., Harit G. Nearest neighbour classification of Indian sign language gestures using Kinect camera // Sadhana. – 2016. – Vol. 41 (2). – P. 161–182.



52. Singha J., Das K. Recognition of Indian sign language in live video // arXiv preprint. – arXiv:1306.1301, 2013.



53. Elakkiya R., Vanitha V. Interactive real time fuzzy class level gesture similarity measure based sign language recognition using artificial neural networks // Journal of Intelligent & Fuzzy Systems. – 2019. – Vol. 37, N 5. – P. 6855–6864.



54. Elakkiya R., Selvamani K. Enhanced dynamic programming approach for subunit modelling to handle segmentation and recognition ambiguities in sign language // Journal of Parallel and Distributed Computing. – 2018. – Vol. 117. – P. 246–255.



55. Kugaevskikh A.V., Sogreshilin A.A. Analyzing the efficiency of segment boundary detection using neural networks // Optoelectronics, Instrumentation and Data Processing. – 2019. – Vol. 55, N 4. – P. 414–422. – DOI: 10.3103/S8756699019040137.



56. Visual cortex // Wikipedia. – URL: https://en.wikipedia.org/wiki/Visual_cortex (accessed: 14.10.2020).



57. Jones J.P., Palmer L.A. An evaluation of the two-dimensional Gabor filter model of simple receptive fields in cat striate cortex // Journal of Neurophysiology. – 1987. – Vol. 58 (6). – P. 1233–1258.



58. Two-streams hypothesis // Wikipedia. – URL: https://en.wikipedia.org/wiki/Two-streams_hypothesis (accessed: 14.10.2020).

For citation:

Elakkiya R., Grif M.G., Prikhodko A.L., Bakaev M.A. Recognition of Russian and Indian sign languages used by the deaf people. Nauchnyi vestnik Novosibirskogo gosudarstvennogo tekhnicheskogo universiteta = Science bulletin of the Novosibirsk state technical university, 2020, no. 2–3 (79), pp. 57–76. DOI: 10.17212/1814-1196-2020-2-3-57-76.
