Hand Sign Interpretation through Virtual Reality Data Processing

  • Teja Endra Eng Tju, Universitas Budi Luhur
  • Muhammad Umar Shalih, Universitas Budi Luhur

Abstract

This research lays the groundwork for further advancements in VR technology, aiming to develop devices capable of interpreting sign language into speech via intelligent systems. The uniqueness of this study lies in using the Meta Quest 2 VR device to gather primary hand sign data, which was subsequently classified using machine learning techniques to evaluate the device's proficiency in interpreting hand signs. The initial stages focused on collecting hand sign data from the VR device and processing it to capture sign patterns and characteristics effectively. A total of 1021 data points, comprising ten distinct hand sign gestures, were collected using a simple application developed with the Unity Editor. Each data point contained 14 parameters from both hands, aligned with the headset so that body rotation would not distort the recorded hand movements and the data accurately reflected the user's facing direction. Data processing applied padding techniques to standardize the varied sequence lengths resulting from differing recording durations. The interpretation algorithm was developed using Recurrent Neural Networks tailored to the characteristics of the data. Evaluation metrics encompassed accuracy, validation accuracy, loss, validation loss, and the confusion matrix. Over 15 epochs, validation accuracy stabilized at 0.9951, demonstrating consistent performance on unseen data. The implications of this research serve as a foundation for further studies on VR devices and other wearable gadgets that can function as sign language interpreters.
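As an illustration of the padding step described in the abstract, the following is a minimal sketch using Keras's pad_sequences. The recording lengths, array shapes, and variable names here are hypothetical placeholders, not values taken from the paper; only the 14-parameter frame format comes from the abstract.

```python
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Hypothetical example: each recording is a sequence of frames, where every
# frame holds the 14 hand parameters captured from both hands. Recording
# lengths vary, so all sequences are zero-padded to the longest one.
recordings = [np.random.rand(np.random.randint(40, 120), 14) for _ in range(5)]

max_len = max(len(seq) for seq in recordings)
padded = pad_sequences(recordings, maxlen=max_len, dtype="float32",
                       padding="post", value=0.0)

print(padded.shape)  # (num_recordings, max_len, 14)
```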
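A recurrent classifier over such padded sequences could then look like the sketch below. The abstract specifies only that Recurrent Neural Networks were used with the listed metrics; the LSTM variant, layer sizes, optimizer, and the synthetic training data are assumptions made for illustration.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import confusion_matrix

NUM_CLASSES = 10   # ten distinct hand sign gestures (from the abstract)
NUM_FEATURES = 14  # hand parameters per frame (from the abstract)

# A Masking layer lets the recurrent layer skip the zero-padded frames.
model = tf.keras.Sequential([
    tf.keras.layers.Masking(mask_value=0.0, input_shape=(None, NUM_FEATURES)),
    tf.keras.layers.LSTM(64),  # assumed RNN variant and size
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder data standing in for the 1021 padded recordings and labels.
x = np.random.rand(1021, 120, NUM_FEATURES).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=1021)

# fit() reports the same metrics the paper evaluates: accuracy,
# validation accuracy, loss, and validation loss, over 15 epochs.
history = model.fit(x, y, epochs=15, validation_split=0.2, verbose=0)

# Confusion matrix over the (here, synthetic) predictions.
y_pred = model.predict(x, verbose=0).argmax(axis=1)
print(confusion_matrix(y, y_pred))
```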

Published
2024-06-03
How to Cite
Tju, T. E. E., & Shalih, M. U. (2024). Hand Sign Interpretation through Virtual Reality Data Processing. Jurnal Ilmu Komputer Dan Informasi, 17(2), 185-194. https://doi.org/10.21609/jiki.v17i2.1280