Engineers at the Artificial Intelligence Research Center of Near East University's Faculty of Engineering have developed a software program that turns sign language into written text
Date Added: 21 February 2019, 11:04
Last Updated Date: 19 November 2020, 14:14

Translating sign language into text with technology is an innovative goal pursued for the benefit of humankind. Near East University, a leading institution of quality education, science and innovative breakthroughs, has once again achieved a first. Engineers at the Artificial Intelligence Research Center of the Faculty of Engineering have developed a software program that turns sign language into text. The program will enable people with hearing and speaking disabilities to communicate with hearing people who do not know sign language.

The Directorate of Press and Public Relations Office of Near East University announced that a new software program, developed by engineers at the Artificial Intelligence Research Center of the Faculty of Engineering, translates the sign language used by people with hearing disabilities into letters and text for hearing people who do not understand it.

Considering both the limited communication options for people with hearing disabilities who use sign language as their primary means of communication, and the challenges a deaf person faces when conversing with people who do not know sign language, NEU engineers developed a sign-recognition software program that translates sign language into text.

The Program has learning ability...
NEU engineers trained the program to detect the movements of a person's hands and convert these signs into text. First, they used a 3D camera to track and record hand movements, and trained the software to detect the hands and signs in each image using object detection, a powerful deep learning technique. Then, each hand movement in the image was matched with a letter using object classification, and these pairings were taught to the computer. Using almost 5,000 hand images and a data set of approximately 50,000 images for the 28 letters corresponding to the hand movements, the system was trained to identify the hand and convert the signs into written text. As a result, the program can detect the hand movements of someone signing in front of the camera and convert those signs into written text.
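The final step described above can be sketched in code. This is a hypothetical illustration, not the NEU implementation: assume the detector and classifier (not shown here) emit one predicted letter per video frame; since the same sign is held over many frames, consecutive predictions must be collapsed into single letters to build the output text. The function name and the `min_run` threshold are illustrative assumptions.

```python
def frames_to_text(frame_letters, min_run=3):
    """Collapse a stream of per-frame letter predictions into written text.

    A letter is accepted only when it is predicted for at least min_run
    consecutive frames, which filters out one-off misclassifications
    that occur while the hand moves between signs.
    """
    text = []
    run_char, run_len = None, 0
    for ch in frame_letters:
        if ch == run_char:
            run_len += 1
        else:
            run_char, run_len = ch, 1
        if run_len == min_run:  # each stable run contributes exactly one letter
            text.append(ch)
    return "".join(text)
```

For example, `frames_to_text("AAABBBCCC")` yields `"ABC"`, while a single noisy frame, as in `"AABAACCC"`, is filtered out and only the stable `"C"` survives.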

Successful results have been achieved by using the Inception algorithm…
The program developed by the Near East University engineers is the result of a unique study, as it is based on the Inception algorithm, a state-of-the-art convolutional network architecture widely used as a benchmark for image recognition and detection.

The NEU engineers stated that, to detect the hand in an image, they used the Single Shot MultiBox Detector (SSD), one of the newest and fastest object detection algorithms, which is also used in autonomous cars. For object classification and feature extraction they used the Inception algorithm, a convolutional network architecture that had not been used in this field of research before.
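The two-stage design described above can be sketched as follows. This is an illustrative skeleton under stated assumptions, not the NEU source: the trained SSD detector and Inception classifier are represented only by the shape of their outputs (candidate boxes with confidence scores, and a 28-way score vector). The confidence threshold and the two label names beyond A–Z are assumptions for illustration.

```python
# 28 letter classes, matching the count stated in the article; the two
# labels beyond A-Z are an assumption for illustration.
LETTERS = [chr(ord("A") + i) for i in range(26)] + ["SPACE", "NOTHING"]

def best_hand_box(detections, conf_threshold=0.5):
    """An SSD-style detector emits many (box, confidence) candidates per
    frame; keep the single most confident box above the threshold, or
    None if no candidate is confident enough."""
    kept = [(box, conf) for box, conf in detections if conf >= conf_threshold]
    return max(kept, key=lambda d: d[1])[0] if kept else None

def scores_to_letter(scores):
    """Map the classifier's 28 output scores for the cropped hand region
    to the highest-scoring letter label."""
    return LETTERS[max(range(len(scores)), key=scores.__getitem__)]
```

Per frame, the detector's best box would be cropped from the image, passed through the classifier, and its score vector mapped to a letter by `scores_to_letter`.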

The use of the app on mobile phones is being tested...
Murat Arslan, one of the engineers who developed the software, underlined that testing of the app in a computer environment had been completed, and stated that they were now testing its use on mobile phones. He noted that they aimed to facilitate communication for people whose primary means of communication is sign language. "When we conclude the tests successfully, the mobile phone will, by means of our novel program, convert sign language into written text, and a hearing person who does not know sign language will be able to understand what a person with a hearing disability is saying. Next, we aim to develop a program capable of converting written text and spoken language into sign language. Thus, communication will be ensured between hearing people and people with hearing disabilities," he said.

Work to serve humanity is being carried out...
NEU engineers have developed many software programs: for autonomous cars, for enabling disabled people to control the mouse cursor with head and eye movements, for remote object control by hand, for detecting epilepsy from brain signals, and for detecting driver drowsiness in vehicles. They stated that they are continuing to conduct research studies to serve humanity, and that they would share the results of these studies through scientific publications and conferences under the name of Near East University and the Applied Artificial Intelligence Center.