Engineers from the CNRS in Grenoble (France) are teaching Nina, the robot they created four years ago, to adopt human social behaviour. The robot is being trained with deep learning algorithms and by imitating interactions between human and machine.
Four years ago, in the heart of the French National Centre for Scientific Research (CNRS), the Grenoble Images Parole Signal Automatique laboratory (GIPSA-lab) in Grenoble (France) created Nina, a 1.02-metre-tall humanoid robot. It talks through its mouth and has articulated lips, as well as a jaw and ears. Nina also has brown eyes that look immense in the centre of its small head. Its eyes are covered with moving eyelids and will soon be topped with eyebrows.
A robot that is still learning
Beyond its humanoid appearance, what makes Nina so unusual is that it has been in training for the last four years. Using speech, expressions and gestures, the robot is learning “how to behave in a socially acceptable manner”, explains Gérard Bailly, director of research at the CNRS.
But Nina is not alone in its training. The machine is controlled by a CNRS engineer, who steps inside the robot by donning a virtual reality headset. They can thus hear the sounds picked up by the robot’s ears as well as see the images recorded by cameras in its eyes. A special system also makes Nina copy the movements of the human “pilot”, who can activate at will all fifty motors that move the robot, and can also make it talk.
When robots base their behaviour on humans
The little robot is not autonomous just yet, but this could change. “Just like a human child learns from adults, we would like our robot to imitate the interactions it sees when being piloted by a human brain,” hopes Frédéric Elisei, the CNRS engineer who brings Nina to life. And, to do this, the specialists rely on deep learning methods and algorithms.
The engineers expose the robot to many repeated exchanges between human and machine. The aim is to create “rich and natural interactions” that can be used as models, so that Nina ends up acting like a human, knowing what to do and who to look at depending on the context.
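The approach described above, recording what a human pilot does in each situation and later reproducing the closest recorded behaviour, is a form of learning by imitation. The toy sketch below illustrates the general idea only; the function names, the observation encoding and the data are all invented for illustration and are not GIPSA-lab’s actual code.

```python
# Illustrative sketch of learning by imitation: store (observation, action)
# pairs recorded while a human pilots the robot, then answer new situations
# with the action whose recorded observation is closest.

def train(demonstrations):
    """Keep the recorded (observation, action) pairs as the 'policy'."""
    return list(demonstrations)

def act(policy, observation):
    """Return the action of the nearest recorded observation."""
    def distance(pair):
        obs, _ = pair
        return sum((a - b) ** 2 for a, b in zip(obs, observation))
    _, action = min(policy, key=distance)
    return action

# Toy demonstrations: observation = (speaker_direction, speech_detected),
# action = where the robot should look. Values are invented for the example.
demos = [
    ((-1.0, 1.0), "look_left"),   # pilot turned towards a speaker on the left
    ((1.0, 1.0), "look_right"),   # pilot turned towards a speaker on the right
    ((0.0, 0.0), "look_ahead"),   # nobody speaking: pilot faced forward
]

policy = train(demos)
print(act(policy, (-0.8, 1.0)))  # prints "look_left"
```

Real systems replace the nearest-neighbour lookup with a trained neural network, which is where the article’s “deep learning methods” come in, but the principle is the same: behaviour is learned from examples of human-piloted interaction rather than hand-coded.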
A need to connect with human reactions
Because, as during face-to-face meetings between humans, “we need to be able to guess what the robot’s intentions are, what it is thinking, and this is communicated through small glances and head movements, which show what its points of interest are,” says Frédéric Elisei.