Friends of the Nonverbal Communication Blog, this week we present the paper “Expressing robot personality through talking body language”, by Zabala, U.; Rodríguez, I.; Martínez-Otzeta, J. M. and Lazkano, E. (2021), in which the authors investigate whether social robots can correctly express emotions through their body language.

Robotics is not an invention from science fiction movies; it is a reality that is more present in our lives each day.

This week we will talk about social robots. These robots are designed, for instance, to help dependent or sick people, or even to accompany people in situations of isolation.

As they are robots whose aim is to interact with people, it is very important for them to master the nuances of human communication, and this includes verbal and nonverbal language.

Moreover, they must be capable of expressing affection or perceiving human emotions and possessing distinctive personalities. In this way, they could create social bonds with people.

That is why the authors try to improve this aspect of robot communication. They made a series of changes to the robot’s software and mechanics in order to coordinate its body with what it wants to say or convey.

Within nonverbal language, as we already know, many areas exist. We have gestures, postures, and movements of the body and face to convey information about the emotions we feel. Therefore, we see that making a robot’s body language resemble a human’s is not an easy task.

The authors carried out a series of changes to the chosen robot in order to achieve this goal.

On the one hand, we have the importance of body movements. They need to be coordinated with speech. That is why the authors adapted the speed at which gestures are executed to the intended emotion, in order to convey it better.

For instance, if the emotion is understood as positive, the gesture will be executed more briskly than when the emotion is negative or neutral.

The authors also modified head movements. When a neutral emotion is portrayed, the robot’s head simply looks forward. In other situations the robot tilts its head: with positive emotions the head points upwards, while with negative emotions it goes downwards.

The same would happen with the chest.
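The mapping described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the idea, not the authors’ actual implementation; the function name, the speed multipliers, and the pitch angles are all assumptions.

```python
# Hypothetical sketch of the emotion-to-body-language mapping described in
# the post: livelier gestures and upward head/chest tilt for positive
# emotions, slower gestures and downward tilt for negative ones.
# All numeric values are illustrative assumptions.

def body_language_params(valence: str) -> dict:
    """Return gesture speed and head/chest pitch for an emotion type."""
    if valence == "positive":
        # Brisker gestures; negative pitch = tilted upwards (assumed convention)
        return {"gesture_speed": 1.3, "head_pitch_deg": -15, "chest_pitch_deg": -10}
    if valence == "negative":
        # Slower gestures; positive pitch = tilted downwards
        return {"gesture_speed": 0.7, "head_pitch_deg": 15, "chest_pitch_deg": 10}
    # Neutral: normal speed, looking straight ahead
    return {"gesture_speed": 1.0, "head_pitch_deg": 0, "chest_pitch_deg": 0}

print(body_language_params("positive"))
```

A real robot controller would feed these parameters into its motion engine; here they are just returned as a dictionary to make the mapping explicit.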

We already know how important facial expression is in nonverbal communication. In the case of the robot used in this study, the only facial feature the programmers can manipulate is the eyes.

The authors decided to install LED lights in the robot’s eyes. These can be programmed in different ways: the color intensity changes according to the intensity of the emotion the robot wants to convey.

Therefore, each of the three emotion types had its own color, displayed at varying intensities: negative emotions appeared in blue, neutral ones in gray, and positive ones in yellow.
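The color scheme above can be expressed as a simple lookup plus intensity scaling. This is a hypothetical sketch: the RGB values and the linear brightness scaling are assumptions, not details from the paper.

```python
# Hypothetical sketch of the eye-LED scheme described in the post:
# blue = negative, gray = neutral, yellow = positive, with brightness
# scaled by emotion intensity. RGB values and scaling rule are assumed.

EYE_COLORS = {
    "negative": (0, 0, 255),      # blue
    "neutral":  (128, 128, 128),  # gray
    "positive": (255, 255, 0),    # yellow
}

def eye_led_rgb(valence: str, intensity: float) -> tuple:
    """Scale the base color for an emotion type by intensity in [0, 1]."""
    r, g, b = EYE_COLORS[valence]
    return (round(r * intensity), round(g * intensity), round(b * intensity))

print(eye_led_rgb("negative", 0.5))  # a dimmer blue
```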

Paralinguistics was another area the authors wanted to explore. People modulate the intonation of their voices according to the context and add emphasis to their speech. Moreover, intonation is also correlated with the speaker’s mood.

One limitation of the robot used is that it does not provide a way to directly control voice intonation. However, it is possible to adjust some voice parameters, such as volume and speed.
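Adjusting those two parameters per emotion type could look something like the sketch below. The settings table, the function name, and the formatted command are all hypothetical; the post only tells us that volume and speed are adjustable.

```python
# Hypothetical sketch: per-emotion voice settings for the two parameters
# the post says the robot exposes (volume and speed). The specific values
# and the tts() command format are assumptions, not the authors' API.

VOICE_SETTINGS = {
    "positive": {"volume": 0.9, "speed": 110},  # louder, slightly faster
    "neutral":  {"volume": 0.7, "speed": 100},  # baseline
    "negative": {"volume": 0.5, "speed": 85},   # softer, slower
}

def say(text: str, valence: str) -> str:
    """Format a hypothetical text-to-speech command with emotion settings."""
    s = VOICE_SETTINGS[valence]
    return f"tts(text={text!r}, volume={s['volume']}, speed={s['speed']})"

print(say("Hello!", "negative"))
```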

To check whether the changes to the robot produced the desired results, the authors carried out two tests.

First, they had the robot read a definition taken from Wikipedia, manipulating its behavior so that its body language cycled through the three proposed emotion types (negative, neutral, positive).

Afterwards, the robot read a book and the authors tried to adapt its body language to what was happening in the different chapters. For example, in sad scenes the robot showed negative emotions; the same pairing held for happy scenes and positive emotions.

The authors considered their experiments a success, because with the configured changes, the robot’s emotions could be understood by paying attention to its body language.

In this way, it is easier to create a personality for the robot, which in turn makes it easier to establish bonds or relationships with people.

A limitation of this study is that the robot reacts with emotions only during long stretches of speech. In other words, it does not seem to react as successfully to short sentences.

The authors point out that they will address these limitations. They also note the need for a public evaluation: a performance in front of real people.

In this way, we would know whether the authors have achieved a robot that shows emotions through both its verbal and nonverbal behavior.

In addition, they were seeking to emulate human behavior, so if the results in front of people are positive, this objective will have been achieved.

