
Friends of the Nonverbal Communication Blog, this week we present the paper “Online communication and body language” by Paradisi, P.; Raglianti, M. and Sebastiani, L. (2021), in which the authors discuss some hypotheses about the changes that online communication is bringing to nonverbal communication.

The progress of digital technologies is having a deep impact on interpersonal communication.

The emergence of Covid-19 highlighted the need to rely more heavily on digital technologies in order to move interpersonal relationships online. Because physical isolation was required, we were also forced to adapt very quickly.

Therefore, the natural modality of face-to-face interaction today is often replaced by interactions through online communication platforms.

In fact, these types of platforms are now used much more routinely for meetings, courses, etc., in all kinds of contexts: work, education, and in general any activity that involves social interaction.

Even older people, who previously were only marginal users of these technologies, were forced to use them as their only opportunity to maintain social contact with those close to them.

This new way of communicating has greatly expanded the possibilities of social interaction by overcoming the limitations of time and space. However, it has also modified the rules of communication, for example, those related to proxemics.

How is this? When we communicate through online video platforms, the distance that separates us from the on-screen image of our interlocutor is a few tens of centimeters, which is less than the typical distance between people in a face-to-face conversation.

Such closeness would presuppose an intimacy between people that does not really exist and a mutual predisposition to the potential use of the tactile channel (handshake, hug, etc.).

The problems noted suggest that online communication changes are complex and should be studied.

Body language is crucial both in nonverbal communication based on emotions and in social interactions based on cognition. Therefore, it is foreseeable that the extensive use of online technologies may have important effects on cognitive processes, not only those related to educational activities, but also those related to emotional relationships in social life.

An example proposed by the authors is “dance therapy.” In this therapy, body movements are used to promote personal and social well-being. The social component of bodily interaction plays a crucial role here: the therapy plays with distances, perspectives, and reciprocity, creating a communicative context in which movement takes place.

Previous studies have shown that online meditation is compatible with the idea of working on oneself, but the same does not hold for interactions with the other members of the group.

The authors suggest that “human touch” plays a crucial role in establishing a sense of closeness between people; in addition, it facilitates affiliative behavior and social bonding. In fact, previous studies have shown a close relationship between pleasant social touch and the release of oxytocin (a modulator of social behavior and emotions).

The sense of smell is also involved in human nonverbal social communication; in fact, through it, we may inadvertently transmit personal information. And this sense would also be impaired by online communication.

Therefore, the authors conclude that in online social interactions smell and touch are absent, visual stimuli are limited to 2D perception while auditory stimuli are practically unchanged, the relationship between perceived distance and intimacy is altered, and there are no direct bodily interactions.

When interacting online, people cannot pick up most of the relevant features of the environment or of others’ bodily behavior, and so cannot adapt their own behavior accordingly.

These changes can undermine the emotional and empathic aspects of interpersonal communication.

A better understanding of these aspects might require a partial revision of the classical theories of communication, to consider the new modalities introduced by online interactions.

An open question, which authors consider for further investigation, is the quantification of perceived virtual distances in online interactions.

Although it may seem that only negative points are observed, the authors encourage us to approach the matter differently: we should not think about what we lose, but about what lies ahead and what is new in this unexplored context.

Friends of the Nonverbal Communication Club, this week we present the paper “Nonverbal Communication in Virtual Reality: Nodding as a social signal in virtual interactions”, by Aburumman, N.; Gillies, M.; Ward, J. A. and Hamilton, A. F. C. (2022), in which the authors carry out a series of experiments to examine how nodding affects users’ perception of human avatars in virtual reality contexts.

Face-to-face interaction is a central part of human life, used to convey ideas, share information, understand others’ intentions and emotions, build trust, make decisions…. 

An important goal for computational science researchers is the design of virtual environments, including virtual humans and immersive virtual reality contexts, that can simulate a real face-to-face conversation. It is also an important goal for researchers in psychology to understand how humans behave during interactions and to test theories about which aspects of these interactions are most meaningful.

Whether in a physical or virtual setting, human communication involves both verbal exchanges and nonverbal behaviors.

Nonverbal communication is an effective and expressive tool for sending and receiving social signals, one that humans had been using for thousands of years before the ability to communicate with words developed. Therefore, both the analysis and the synthesis of nonverbal communication are an essential part of human-computer interaction research.

Although physical communication is still more powerful, modern communication is often mediated by technology, and it takes place virtually.

Virtual reality is a digital form of communication that can facilitate the creation of immersive real-time interaction and enhance social presence in virtual environments. 

In the present study, virtual reality was employed in the experiments as the authors felt that it had unparalleled potential to impact the future of numerous sectors, such as virtual conferencing, education, consulting, social rehabilitation, medical care….

They also included nonverbal communication, which covers aspects as varied as nodding the head, maintaining eye contact, leaning forward or backward, and body orientation, among many others. In particular, nodding plays an important role in regulating an interaction, signaling who should take the floor, for example, or whether or not someone is interested in a particular topic.

This type of signaling is commonly referred to as backchannelling, and often occurs to send subtle messages in a face-to-face interaction. Including this element in virtual environments, therefore, can be very important to make the interlocutor feel comfortable and heard.

In this paper, the authors conduct several experiments involving virtual interaction between a human-controlled avatar and a virtual human whose behavior is controlled by a computer program. In these experiments, the authors focus on four types of nonverbal cues that are very important in human face-to-face interaction: blinking, head nodding, facial expressions, and gaze shifting. In addition, they specifically manipulated the nodding behavior of two different virtual humans.

The experiments were conducted at the social interaction laboratory at University College London. Data were collected from 21 participants, of whom 15 were female and 6 were male, with an average age of 24 years.

The style of the virtual avatars was unrealistic, cartoon-like, as this type of virtual human is preferred over more realistic ones.

In the first task, participants were told that they would have a conversation with two different virtual humans in virtual reality and discuss a series of facts about some U.S. states. The participant first met one virtual human (Anna), who introduced herself and asked the participant to introduce him/herself. Anna then delivered a 45-55 second monologue in which she read facts about a U.S. state, and for the next 35-45 seconds Anna and the participant discussed them. After that, the process was repeated in reverse. In total, the participant completed four rounds with Anna and four with the other virtual human, Beth.

The authors designed these two virtual humans to produce identical blinks, facial expressions, and gaze shifts. The only difference between them was that one displayed naturalistic nodding behavior that depended on its partner’s actions, while the other exhibited only preconfigured head movements.
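The paper’s control code is not reproduced here, but the contrast between the two conditions can be pictured with a minimal sketch. Everything below (the class name, thresholds, and probabilities) is a hypothetical illustration, not the authors’ implementation: the “naturalistic” agent nods shortly after it detects a pause in the participant’s speech, while the “preconfigured” agent nods on a fixed schedule regardless of what the participant does.

```python
import random

class NoddingController:
    """Illustrative sketch of the two nodding conditions:
    responsive back-channel nods vs. a fixed, scripted schedule."""

    def __init__(self, mode, nod_interval=4.0, pause_threshold=0.6):
        self.mode = mode                        # "naturalistic" or "preconfigured"
        self.nod_interval = nod_interval        # seconds between scripted nods
        self.pause_threshold = pause_threshold  # silence (s) treated as a pause
        self._last_nod = 0.0

    def update(self, t, seconds_since_partner_spoke):
        """Called every frame with the current time and how long the
        participant has been silent; returns True if the avatar should nod."""
        if self.mode == "preconfigured":
            # Scripted head movement: nod at fixed intervals, ignoring the partner.
            if t - self._last_nod >= self.nod_interval:
                self._last_nod = t
                return True
            return False
        # Naturalistic back-channel: nod when the partner pauses briefly,
        # with some randomness so the timing does not look mechanical.
        if seconds_since_partner_spoke >= self.pause_threshold and t - self._last_nod > 1.5:
            self._last_nod = t
            return random.random() < 0.8  # occasionally skip a nod
        return False
```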

The second task used a virtual maze to implicitly measure the participant’s proximity, trust, and attraction to the virtual humans. 

The virtual humans Anna and Beth were placed at decision points in the maze, and the participant could choose to approach one or the other for advice on how to complete the activity.

A positive impact of naturalistic nodding was found: participants liked and trusted the virtual human who nodded in this way more, rating her significantly higher than the other virtual human.

When participants were asked which virtual human had paid more attention to what they were saying, opinions continued along these lines, and the virtual human with the naturalistic nod was perceived as more engaged in the interaction.

Furthermore, in the maze experiment, participants were closer to the virtual human who nodded more. 

These results support the claim that mimicry functions as a kind of social glue, and that by copying another person’s actions it is possible to generate trust and sympathy. 

Future studies could test how this extends to other types of conversation and other social groups, for example, by introducing the variable of gender. 

If you want to know more about nonverbal behavior and how it affects personal relationships, visit our Master of Science in Nonverbal and Deceptive Behavior, which you can take in English or Spanish, with special grants for readers of the Nonverbal Communication Blog.

Friends of the Nonverbal Communication Blog, this week we present the paper “Robot lecture for enhancing presentation in lecture” by Ishino, T.; Goto, M. and Kashihara, A. (2022), in which the authors carry out an experiment to determine whether the use of robots with specialized nonverbal communication skills is positive and beneficial for students’ learning in class.

For some years now, the use of robots, especially small ones, has been spreading in various contexts such as care, nursing, education, guidance, hospitality… and moreover, people’s interest in implementing robots in some of these areas is growing exponentially, especially in education. 

In this article, authors focus on the use of communication robots to give lectures or short lessons in small classes. 

In a lecture, it is generally very important to present the contents with slides to support the oral presentation, so that a better and easier understanding by the students is achieved. This requires teachers to control the students’ attention, both to the slides and to the oral presentation, and this must be done by means of many non-verbal elements: the eyes, gestures, paralanguage, etcetera. 

For example, if teachers want to draw students’ attention to an important point on a particular slide, they should turn their face towards the presentation and point to it with a clear gesture at the same time.

On the other hand, nonverbal behavior that is histrionic, excessive, or unnecessary would prevent students from keeping their attention on understanding the content. Consequently, it is essential for teachers or lecturers to have some training in nonverbal communication.

However, even for experienced communicators, it is not so easy to make proper use of the learned tools of nonverbal communication and maintain it throughout the lecture. And if we bear in mind that there are also inexperienced people who do not know the effective techniques in this type of situation, the matter becomes more complex. 

Those with less experience tend to concentrate more on oral explanation and leave aside non-verbal communication. As a result, the learning process for students will be more difficult. 

The authors propose the use of robots to give lectures, replacing human teachers. The aim of the experiment was to reproduce nonverbal behavior as appropriate as possible so that students would pay attention to the most important content of the lecture.

The robot reproduced the presentation that was part of the supporting material of the lecture or class, and directed its face and gestures accordingly. 

The study compares the effectiveness of human-delivered and robot-delivered lectures in terms of student learning. 

The participants were 36 university students. Three different video lectures lasting 5-6 minutes were prepared. 

The results showed that the robots had difficulty performing accurate speaker behavior, due to their obvious limitations (they are not human beings), but their behavior was still recognizable.

A pointing gesture performed by a human teacher needs to indicate a precise location; if it is imprecise, it can confuse students and make them lose attention. A robot’s pointing gesture tends to be firmer, so students immediately look in the direction pointed.

However, to make up for the possible shortcomings of robots in terms of gestures, the authors propose using laser pointers or visual effects on the slides.

As a point that also needs improvement, the authors mention that the robot needs to recognize students’ learning and behavioral states in the classroom. For example, if some students feel that the lecture is difficult, the robot will have to display different nonverbal behavior that helps change this perception.

The results are positive in terms of attention for the lectures given by the robot, possibly because of the novelty factor, although the authors also note that the lectures were short, which may have worked in the robot’s favor. For this reason, the authors propose hybrid models in which robots introduce certain topics and human teachers explain the complex parts or those that require a less “technological” touch.

In the future, authors intend to learn more about the applications of robots in the field of education. In the meantime, they invite other researchers to investigate the subject, in order to include more and more of this type of technology in our lives. 

If you want to know more about nonverbal behavior and how it influences our personal relationships, visit our Nonverbal Communication Certificate, a 100% online program certified by Heritage University (Washington) with special discounts for readers of the Nonverbal Communication Blog.

Friends of the Nonverbal Communication Blog, this week we present the paper “Survey On Emotional Body Gesture Recognition” by Noroozi, F.; Kaminska, D.; Corneanu, C. P.; Sapinski, T.; Escalera, S. and Anbarjafari, G. (2018), in which the authors provide a brief review of some of the systems used for recognizing and decoding body gestures.

We know that nonverbal language plays an indispensable role in our daily communication. Moreover, we constantly emit changing nonverbal cues through body movements and facial expressions.

Although body language is a significant aspect of human social psychology, the first studies on it did not become popular until the 1960s.

But probably the work considered most important was published much earlier, in the 19th century: “The Expression of the Emotions in Man and Animals” by Darwin. He observed, for example, that people around the world used facial expressions in similar ways.

This was later studied by Paul Ekman, who, with Friesen, developed the Facial Action Coding System (FACS) to classify human facial expressions.

Such is the role of nonverbal communication that many researchers agree that it is body movements that allow relationships and bonds to be formed, not words.

Gestures would be one of the most important forms of nonverbal communication. They include movements of the hands, head, and other parts of the body that allow people to communicate their feelings and emotions.

Most of the basic gestures are the same all over the world: when we are happy, we smile; when we are angry, we frown.

Head position also reveals a lot of information about emotional state. For example, people tend to speak more if the listener encourages them by nodding. If the chin is raised, it can mean that the person is showing superiority or even arrogance, while exposing the neck can be interpreted as a sign of submission.

We point out, as always, the need to consider the context and different parts of the body to correctly interpret the emotional state.

Although emotions can be expressed in different ways, their automatic recognition has focused mainly on facial expressions and speech, leaving work on gestures, body movements, and posture in the background.

In this article, authors attempt to provide an overview of the techniques for automatic recognition of emotions from body gestures.

We are referring to digital, technology-based recognition systems. To use them, one must first draw on a database, either publicly accessible or private, of images or videos.

The first step is to detect people’s bodies as a whole and subtract the background. The pose is then detected and tracked, to reduce irrelevant variation in the data caused by pose. Finally, an adequate representation of the data must be built, and techniques applied to map it to emotional categories.
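Expressed as code, that pipeline might be organized roughly as in the sketch below. This is only an illustrative outline under our own assumptions: each stage is passed in as a function because the survey reviews many concrete methods for every step, and none of the names here correspond to a real library API.

```python
from dataclasses import dataclass
from typing import Any, Callable, Sequence

@dataclass
class EmotionPrediction:
    label: str        # e.g. "anger", "joy"
    confidence: float

def recognize_body_emotion(
    frames: Sequence[Any],
    detect_people: Callable,   # person detection + background subtraction
    estimate_pose: Callable,   # pose estimation and tracking over time
    represent: Callable,       # feature representation of the movement
    classify: Callable,        # mapping features to an emotion category
) -> EmotionPrediction:
    """Outline of the pipeline described above; each stage is injected
    because many concrete techniques exist for every step."""
    person_regions = detect_people(frames)      # 1. find bodies, drop background
    pose_sequence = estimate_pose(person_regions)  # 2. detect and track the pose
    features = represent(pose_sequence)         # 3. build an adequate representation
    label, confidence = classify(features)      # 4. map it to an emotional category
    return EmotionPrediction(label, confidence)
```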

Most of the data available in public databases contain acted expressions, which show clear and undistorted emotions. However, some researchers report that these do not reflect real-world conditions. For this reason, many experts recommend using movies, reality shows, or live programs, where the quality of the material may not be optimal but the behavior is much more genuine.

The applications of emotional body gesture recognition are mainly of three types.

First, there are those systems that detect the emotions of users.

Second, there are animated conversational agents, real or virtual, such as robots or avatars that are expected to act like humans.

Finally, there are the systems that can be applied in videotelephony, videoconferencing, stress monitoring tools, violence detection or video surveillance, among other areas.

Automatic recognition systems can draw on face, voice, and body gesture information at the same time. Therefore, if a system can combine the emotional and social aspects of the context and make decisions based on the available cues, it can be a useful assistant for humans.
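One simple way to picture such a combination is late fusion: each channel (face, voice, body) produces its own emotion scores, and the system merges them, optionally weighting channels by how reliable they seem in the current context. The sketch below illustrates that general idea only; it is not the method used in the survey, and the categories and weights are invented for the example.

```python
def fuse_modalities(scores_by_modality, weights=None):
    """Late fusion of per-modality emotion scores.

    scores_by_modality: dict mapping each modality to its emotion scores.
    weights: optional per-modality reliability weights (default: equal).
    Returns the emotion with the highest combined score and all fused scores.
    """
    weights = weights or {m: 1.0 for m in scores_by_modality}
    combined = {}
    for modality, scores in scores_by_modality.items():
        for emotion, score in scores.items():
            combined[emotion] = combined.get(emotion, 0.0) + weights[modality] * score
    total_weight = sum(weights[m] for m in scores_by_modality)
    combined = {e: s / total_weight for e, s in combined.items()}
    return max(combined, key=combined.get), combined

# Example with invented numbers: the body channel reinforces what the face suggests.
scores = {
    "face":  {"anger": 0.7, "joy": 0.1, "neutral": 0.2},
    "voice": {"anger": 0.5, "joy": 0.2, "neutral": 0.3},
    "body":  {"anger": 0.6, "joy": 0.1, "neutral": 0.3},
}
print(fuse_modalities(scores))  # -> ('anger', {...})
```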

An example of pose estimation and tracking, in this case for several people, is “ArtTrack”. This software achieves cutting-edge results using a technique capable of detecting body joints and associating them with the right person.

This model is especially useful for articulated pose tracking; it therefore makes it possible to solve the association problem when several people appear in the same scene.

However, in general, current representations remain superficial. Although experts have recently begun to give them the depth and relevance needed for affect recognition, there is still a long way to go.

One limitation is the scarcity of body gestures and multimedia affective data. Another would be the lack of consensus regarding the interpretation of gestures.

In general, for a comprehensive human affective analysis from body language, body gesture recognition must learn from emotional facial recognition.

Friends of the Nonverbal Communication Blog, this week we present the paper “Nonverbal Behaviors ‘Speak’ Relational Messages of Dominance, Trust, and Composure” by Burgoon, J. K.; Wang, X.; Chen, X.; Pentland, S. J. and Dunbar, N. E. (2021), in which the authors ask what the nonverbal signals of dominance, trust, and composure are and, moreover, how easy it is for technological devices to perceive them.

We know that, thanks to nonverbal signals, we can interpret the messages we receive in interpersonal relationships more accurately, as well as send our own more effectively.

Such is their importance that, without them, our communication skills would be greatly diminished.

Until a few years ago, we only had human observation to study nonverbal behavior, but with the development of new technologies, it seems that this type of behavior could be investigated more objectively and in greater detail.

The authors of this paper chose three relevant dimensions: dominance-submission, composure-nervousness, and trust-distrust.

In addition, they propose to detect them with the help of technological devices. In this way, authors can check to what extent we can trust technology in this matter.

We will talk about dominance-submission first. Dominance is one of the most recognized human personality and behavior traits in personal relationships.

The authors point out some nonverbal behaviors that may be related to dominance. Regarding prosody, for instance: silence, a lower vocal pitch, loudness, or a rapid speech rate.

Previous studies have reported that facial expressions such as lowered brows or a non-smiling mouth are associated with perceived dominance too.

Regarding body movements, the contraction of the body and gaze avoidance would be associated with the opposite extreme of dominance, that is, submission.

In dominant people, more expansive body postures appear, with upward inclinations of the head.

On the other hand, we have composure-nervousness. Generally, when levels of composure or calm increase during interactions, more positive outcomes appear. For instance, manager composure leads to increased employee satisfaction.

People with this trait are thought to have a pleasant facial expression, to frequently show emotion in their voices, to be expressive, and to talk a lot. They hold their head and body in a relaxed posture and tend to be relaxed in general.

Regarding prosody, they tend to have a lower tone of voice, a contained and relaxed laugh and a moderate volume.

On the other hand, it is considered that people who are nervous are more rigid, tense, tend to avoid eye contact and have a higher tone of voice.

Finally, there is trust-distrust. This is expressed in interpersonal relationships and usually appears in the form of reciprocity, convergence, and synchrony when two or more people interact. However, it seems difficult to associate this dimension with specific nonverbal behaviors objectively.

To carry out their study and find out whether technological devices can be used to study these states objectively and precisely, the authors designed an experiment.

A total of 379 people participated. The authors used board games with a certain role-playing component, in which volunteers interacted in small groups.

These groups were divided into two, creating a rivalry between them within the same game.

Participants’ faces, gestures, and body movements were recorded with cameras and microphones built into the devices placed in front of each of them.

Later, after watching the recordings of their companions, participants were asked to rate the others on the dimensions of dominance-submission, composure-nervousness, and trust-distrust, according to a list of relevant factors.

Results showed that dominance was associated with the majority of factors (101/150). According to the results, the perception of dominance was associated with a louder voice, more expressive facial behavior, more head movements, and longer speaking turns.

On the other hand, we have the nonverbal signals of nervousness and composure. The authors originally expected nervousness to be signaled by a higher-pitched voice, but the results did not support this idea. However, they did confirm others, such as that nervous people tend to have a more rigid body posture.

Finally, we have trust-distrust, which was the most difficult to detect. No facial expression or body movement was found to indicate that an individual was dealing with a trustworthy person.

Authors suggest this happened because none of the participants knew each other and, therefore, it was very difficult to establish a relationship of trust or mistrust in such a short time.

In a nutshell, the results tell us that, although technological devices allow us to register nonverbal behaviors objectively, human judgment is still needed so that nothing is overlooked.

An important advance of this study is that, for recognizing these dimensions, groups of people were used instead of pairs, which has been the most common way of carrying out this type of study.

The authors propose to continue improving technological resources in order to enhance their performance and, in the future, to use them as 100% accurate nonverbal detectors.

Friends of the Nonverbal Communication Blog, this week we present the paper “Expressing robot personality through talking body language”, by Zabala, U.; Rodríguez, I.; Martínez-Otzeta, J. M. and Lazkano, E. (2021), in which authors investigate whether it is possible for social robots to express emotions correctly through their body language.

Robotics is not an invention from science-fiction movies; it is a reality that is more present in our lives every day.

This week we will talk about social robots. This kind of robot is designed, for instance, to help dependent or sick people, or even to keep people company in circumstances of isolation.

As they are robots whose aim is to interact with people, it is very important for them to master the nuances of human communication, and this includes verbal and nonverbal language.

Moreover, they must be capable of expressing affection or perceiving human emotions and possessing distinctive personalities. In this way, they could create social bonds with people.

That is why the authors try to improve this aspect of robot communication. They made a series of changes to the robot’s software and mechanics in order to coordinate its body with what it wants to say or transmit.

Within nonverbal language, as we already know, there are many areas. We use gestures, postures, and movements of the body and face to convey information about the emotions we feel. Making a robot’s body language resemble a human’s is therefore not an easy task.

Authors carry out a series of changes in the chosen robot in order to achieve their purpose.

On the one hand, there is the importance of body movements, which need to be coordinated with speech. That is why the authors adapted the speed of gesture execution to the intended emotion, in order to convey it better.

For instance, if the emotion is understood as positive, the gesture is executed more briskly than when the emotion is negative or neutral.

The authors also modified head movements. When a neutral emotion is portrayed, the robot’s head simply looks forward. In other situations the robot tilts its head: for positive emotions the head is directed upwards, while for negative emotions it goes downwards.

The same would happen with the chest.

We already know how important facial expression is in nonverbal communication. In the case of the robot used in this study, the only facial feature the programmers can manipulate is the eyes.

The authors decided to install LED lights in the robot’s eyes, which can be programmed in different ways. The color intensity changes according to the intensity of the emotion the robot wants to convey.

Thus, there were colors of different intensities for each of the three types of emotion: negative emotions appeared in blue, neutral ones in gray, and positive ones in yellow.

Paralinguistics was an area that authors also wanted to explore. People modulate the intonation of their voices according to the context and add emphasis to their speeches. Plus, intonation is also correlated with the speaker’s mood.

One limitation of the robot used is that it does not provide a way to directly control voice intonation, but it is possible to adjust some voice parameters, such as volume and speed.
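Taken together, the mappings described in the last few paragraphs can be summarized in a small sketch. The parameter names and values below are illustrative assumptions, not the robot’s real API: positive emotions get livelier gestures, an upward head tilt, yellow eyes, and a slightly louder and faster voice; negative emotions the reverse; neutral stays in the middle, with the emotion’s intensity scaling the tilt and the brightness of the eye LEDs.

```python
# Hypothetical expressive parameters per emotion category, summarizing the
# mappings described above (values and API are illustrative only).
EMOTION_PROFILES = {
    "positive": {"gesture_speed": 1.3, "head_pitch_deg": +15, "eye_color": "yellow",
                 "voice_volume": 1.1, "voice_speed": 1.1},
    "neutral":  {"gesture_speed": 1.0, "head_pitch_deg": 0,   "eye_color": "gray",
                 "voice_volume": 1.0, "voice_speed": 1.0},
    "negative": {"gesture_speed": 0.7, "head_pitch_deg": -15, "eye_color": "blue",
                 "voice_volume": 0.9, "voice_speed": 0.9},
}

def expression_parameters(emotion: str, intensity: float) -> dict:
    """Return the expressive settings for one utterance.

    `intensity` (0..1) scales how far the head tilts and how bright the
    eye LEDs glow, echoing the idea that color intensity tracks emotion strength.
    """
    profile = dict(EMOTION_PROFILES[emotion])
    profile["head_pitch_deg"] *= intensity
    profile["eye_brightness"] = 0.3 + 0.7 * intensity   # dimmer for weak emotions
    return profile

# Example: a strongly positive passage of the text being read aloud.
print(expression_parameters("positive", intensity=0.9))
```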

To check whether the changes to the robot produced the desired results, the authors carried out two tests.

First, they made the robot read a definition taken from Wikipedia, manipulating its functioning so that its body language flowed through the three types of emotions proposed (negative, neutral, positive).

Afterwards, the robot read a book, and the authors tried to adapt its body language to what was happening in the different chapters. For example, when sad scenes appeared, the robot showed negative emotions; the same happened with happy scenes and positive emotions.

The authors considered their experiments a success because, with the configured changes, the robot’s emotions could be understood by paying attention to its body language.

This makes it easier to create a personality for the robot and also helps it establish bonds or relationships with people.

One limitation of this study is that the authors managed to get the robot to react with emotions only during long stretches of speech; in other words, the robot does not seem to react as successfully to short sentences.

The authors state that they will correct these limitations. They also note the need for a public evaluation: a performance in front of real people.

In this way, we would know whether the authors have succeeded in getting the robot to show emotions through both its verbal and nonverbal behavior.

Moreover, since they were aiming to emulate human behavior, positive results in front of people would mean that this objective had been achieved.
