Publication:
Sinhala Sign Language Interpreter Optimized for Real-Time Implementation on a Mobile Device

Abstract

This paper proposes a framework for a vision-based Sinhala Sign Language interpreter targeted for implementation on a portable device and optimized for real-time use. The translator is aimed at enabling conversation between a hearing-impaired individual and a non-signing individual. The scope covers both static and dynamic signs, portrayed using the right hand. Skin segmentation and contour extraction, followed by a combination of hand detection and tracking algorithms, isolate the signing hand against varied background conditions. A Convolutional Neural Network model was developed to extract and classify the features of the chosen static signs. A standard, expandable dataset of Sinhala static signs was prepared for this task. Dynamic signs were modeled as a tree data structure over sequences of static signs. The model was optimized using motion-based temporal segmentation between consecutive signs to minimize processing overhead. The interpreter recorded an average accuracy of 99.5% on the static sign dataset and 81.2% on the combined dataset of static and dynamic signs. A response time of 333 ms between the occurrence and prediction of a sign demonstrates the effectiveness of the framework for real-time use.
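
The hand-isolation front end described above (skin segmentation, contour extraction, then detection and tracking) can be sketched as follows. This is a minimal illustration assuming OpenCV and a hypothetical HSV skin-tone range; the paper does not publish its thresholds or tracking logic.

    import cv2
    import numpy as np

    def isolate_hand(frame_bgr):
        """Crop the likely signing hand from a camera frame (illustrative only)."""
        # Work in HSV, where skin tones cluster more tightly than in BGR.
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Hypothetical skin-tone bounds; real deployments tune these per dataset.
        mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
        # Morphological opening suppresses speckle noise before contour extraction.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        # Assume the signing hand is the largest skin-coloured blob in view.
        hand = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(hand)
        return frame_bgr[y:y + h, x:x + w]

The cropped region would then be passed to the CNN classifier; tracking across frames (not shown) lets the detector run less often, which helps keep the response time low.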
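
The abstract models dynamic signs as a tree over sequences of static signs, so that each classified static sign advances the match by one step. The node layout is not given in the record, so this prefix-tree sketch with made-up sign labels is only one plausible reading; motion-based temporal segmentation would decide when the next static sign is fed in.

    class SignNode:
        """One node per static sign; a root-to-node path spells a dynamic sign."""
        def __init__(self):
            self.children = {}        # static-sign label -> SignNode
            self.dynamic_sign = None  # label emitted when a sequence ends here

    class DynamicSignTree:
        def __init__(self):
            self.root = SignNode()
            self.cursor = self.root   # current position in the tree

        def insert(self, static_labels, dynamic_label):
            node = self.root
            for label in static_labels:
                node = node.children.setdefault(label, SignNode())
            node.dynamic_sign = dynamic_label

        def feed(self, static_label):
            """Advance by one classified static sign; return a completed dynamic sign, if any."""
            nxt = self.cursor.children.get(static_label)
            if nxt is None:
                # Sequence broken: restart matching from the root with the current sign.
                nxt = self.root.children.get(static_label, self.root)
            if nxt.dynamic_sign is not None:
                self.cursor = self.root  # sequence complete; reset for the next sign
                return nxt.dynamic_sign
            self.cursor = nxt
            return None

    tree = DynamicSignTree()
    tree.insert(["open_palm", "wave"], "hello")  # illustrative labels, not from the paper
    assert tree.feed("open_palm") is None
    assert tree.feed("wave") == "hello"

Sharing prefixes between dynamic signs keeps per-frame matching to a single dictionary lookup, which suits the real-time constraint the abstract emphasizes.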

Keywords

Sinhala Sign Language, Interpreter, Optimized, Real-Time, Implementation, Mobile Device

Citation

I. D. V. J. Dhanawansa and R. M. T. P. Rajakaruna, "Sinhala Sign Language Interpreter Optimized for Real-Time Implementation on a Mobile Device," 2021 10th International Conference on Information and Automation for Sustainability (ICIAfS), 2021, pp. 422-427, doi: 10.1109/ICIAfS52090.2021.9605996.
