Google’s engineers have made significant progress in their efforts to interpret sign language using artificial intelligence. For the first time, a system capable of translating the language has been built into a smartphone, thanks to a new algorithm.
Several million people around the world use sign language as their only form of expression. Google’s research teams have recently developed a system based on artificial intelligence (AI) designed to translate sign language, as reported by the tech website Presse-Citron. The new system is capable of interpreting hand movements in real time using a smartphone camera.
An extensive database
This is the first time this innovative technology has been built into a smartphone. In designing the tool, researchers studied a database of more than 30,000 images of hand signs, taken in a variety of lighting conditions.
They then trained the programme to recognise the different signs one by one. Using AI, the programme learned to compare each sign with all the others, the aim being to interpret and translate signed communication.
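The idea of "comparing each sign with all the others" can be illustrated with a minimal sketch. The real system uses a trained neural network over camera frames; here, as a purely hypothetical stand-in, each hand pose is reduced to a small feature vector and matched against reference vectors by nearest-neighbour distance (all names and values below are invented for illustration):

```python
import math

# Toy reference feature vectors for three hand signs (invented values).
# A real system would learn these representations from thousands of images.
REFERENCE_SIGNS = {
    "A": [0.1, 0.1, 0.1, 0.1, 0.1],
    "B": [0.9, 0.9, 0.9, 0.9, 0.9],
    "C": [0.5, 0.2, 0.5, 0.2, 0.5],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features):
    """Return the reference sign whose vector is closest to the observation."""
    return min(REFERENCE_SIGNS, key=lambda s: distance(REFERENCE_SIGNS[s], features))

# An observed hand pose close to the reference for "B" is labelled "B".
print(classify([0.85, 0.9, 0.95, 0.88, 0.9]))  # → B
```

This nearest-neighbour matching is only a sketch of the comparison step; the accuracy reported by Google comes from training on the large, varied image database described above.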
Further development needed
However, sign language can also be expressed using facial expressions, a facet the Google algorithm cannot yet handle. Writing on the company’s blog, the research team acknowledged that the software is only a “basis for the comprehension of sign language”.
The developers are now working on the algorithm to enable it to analyse facial expressions. The tool they have created has been released as open source, which allows them to invite other companies and developers to build on the technology.