AI is one of the most significant technological developments, growing in popularity and reaching into every application area. One of the most important of these applications is healthcare, since health is the most valuable human asset. Recently, AI has played a significant role in helping doctors discover diseases and improve human health. Its use in this setting depends on the appearance of symptoms on parts of the body: symptoms affect and are reflected in the movements and expressions of the body, which are manifested as body language. These body-language features can therefore be used to classify disease symptoms by detecting them with ML. In this section, we explain the importance of analyzing body language with AI. Body language carries features that AI can analyze to solve problems in many applications. For example, facial expressions can be analyzed to infer human feelings, which is useful in psychotherapy or for examining subjects' emotions in a study. Movements of the hand, shoulder, or leg can likewise yield valuable features in medicine, security, and other fields. Body language thus has many benefits and applications, and we suggest that it can also be used to detect infectious diseases such as COVID-19 using ML.
Now, it is feasible to employ this technology in healthcare systems. Pandemic and epidemic diseases are an intractable problem that severely harms human health, people's most valuable asset. An even greater worry is that new pandemics or epidemics will appear suddenly and turn deadly, as COVID-19 did, claiming nearly a million lives so far. This motivates us to develop AI technologies that help detect a disease's external symptoms by analyzing patients' body language. This section reviews general studies that demonstrate the importance of body-language processing in various fields.
Every computer user interacts with the device via mouse and keyboard. Researchers are now developing computer systems that interact and respond through body language such as hand gestures and movement. In [8], a comprehensive survey evaluated the published literature on the visual interpretation of hand gestures for interacting with computing devices, introducing more advanced methods of analyzing body language than mouse and keyboard input. The study in [9] considered the problem of recognition accuracy in robots and proposed a fusion system that identifies fall-movement types and abnormal directions with an accuracy of 99.37%. A facial coding system was developed in [10] to measure and analyze facial muscle movements and identify facial expressions. A database of 1100 images was created, and the system analyzed and classified facial creases and wrinkles to match them to muscle movements; performance improved to 92%. Combining facial features and movements with body movements is essential for analyzing individual expressions: three experiments were conducted to determine whether facial expressions and body language should be combined, and the conclusion was affirmative. Another study [11] applied deep learning to identify the emotions revealed in facial expressions, using purely convolutional neural networks to show that deep learning with these networks recognizes emotions successfully and significantly improves usability. A new model introduced in [12] detected body gestures and movements from a pair of digital video images, supplying a set of three-dimensional motion vectors.
The first study, by Hjortsjö (1970) [13], established the relationship between the contraction of the internal facial muscles and facial movements, developing a coding system that identified the minor units of facial muscle movement and drew coordinates defining facial expressions. Recognizing people's emotions has merited much attention, but detecting emotions from facial expressions and speech remains problematic for researchers. The work in [14] offered a comprehensive survey to facilitate further research in this field. It focused on identifying gender-specific characteristics, setting out an automatic framework for determining the physical manifestations of emotion, and identifying static and dynamic body-shape cues. It also examined recent studies on learning and emotion that identify gestures from photos or video, and discussed several methods that combine speech, body, and facial gestures to improve emotion recognition. The study concluded that knowledge of a person's feelings derived through such cues is still incomplete.
A coding system for classifying facial expressions by analyzing more than 1100 pictures was created in [10]. Three ways of classifying facial expressions were compared: analyzing image components in the gray field, measuring wrinkles, and a template for modeling facial movements. The coding system's accuracy for the three methods was 89%, 57%, and 85%, respectively, while combining the methods raised accuracy to 92%. Online learning is challenged by gauging students' participation in the learning process. In [15], an algorithm was introduced to assess student interactions and identify their problems. Two methods collected evidence of student participation: the first captured facial expressions with a camera, and the second captured hand-movement data through mouse movements. Two training groups were built, one combining facial data with mouse data and one without the mouse; the first group performed better, at 94.60% versus 91.51%. Work [14] addressed recognizing facial and speech gestures and provided a comprehensive survey of body language, including a framework for automatically identifying dynamic and static emotional body gestures that combines facial and speech cues to improve recognition of a person's emotions. Paper [16] defined facial expressions by matching them with body positions. It demonstrated that effects and expressions are clearer when the dominant emotions on the face match those highlighted in the body, although the model produces different results depending on whether the properties used are physical, dimensional, or latent. Another significant finding is that expressions of fear are recognized more readily when facial and body cues are paired than in isolation.
In [17], the authors stated that the medical advisor must exhibit engaging communication qualities that make the patient feel comfortable enough to make a correct decision. They advised doctors to know how to use facial expressions, the eyes, hand gestures, and other body expressions. A smile is the most robust expression a doctor can use to communicate with patients: the doctor's smile makes the patient feel comfortable, and a comfortable patient appears confident and answers the doctor's questions with clear responses, credibility, and confidence. Eye contact is likewise very important, as its absence may suggest that the doctor does not care about the patient. The research in [18] concludes that the doctor's appropriate nonverbal communication positively impacts the patient. Objective evidence has shown that the patient improves and recovers better and faster when the doctor uses a smile and direct eye contact than when the doctor does not. It was also concluded that patients who receive more attention, feeling, and participation from the doctor respond better to treatment, as the tone of voice, movement of the face and body, and eye gaze all affect the patient. Clint [19] reported on his first day on the job in the intensive care unit. He felt fear and anxiety that day, as the unit was comprehensive and informative, and asked himself, "Is it worth working in this unit?" He had a patient with her sister beside her. The patient glimpsed Clint's nervousness and anxiety but did not dare ask him, so she whispered to her sister that the nurse was nervous. Her sister then asked Clint, "You seem worried and anxious today; why? What is there to be so nervous about?"
Hoping to hide his nervousness and anxiety and restore confidence, Clint smiled and replied, "I am not nervous. However, sometimes we have to ask our patients ridiculous questions that make us tense." Yet he noticed from the patient's looks that he had not persuaded her or concealed his stress. Clint made it clear that patients are affected by a caregiver's body language and facial expressions and can read the caregiver's state through them. From here, Clint realized he was wrong: as his anxiety and stress transferred to his patient, her condition could worsen for that reason.
In one of Henry's articles [20], he wrote that treating a patient with appropriate behaviors and body language has a more significant impact than using drugs alone. The work [21] concluded that non-verbal language between a doctor and patient plays a vital role in treatment. The doctor can use non-verbal signals sent by the patient to collect information about the disease and support decisions on diagnosis and treatment. The research summarized that the non-verbal techniques the doctor uses toward the patient, for example eye gaze, closeness to the patient, and relaxed facial and hand gestures, affect how much information is obtained and help the patient recover. The research suggests that non-verbal cues have a positive effect on the patient and recommends training doctors to incorporate them as a significant way of dealing with patients and speeding up their treatment.
Different AI methods and techniques have been used to analyze patients' body language. We briefly discuss some studies conducted so far in this area. Focusing on facial recognition, a system was introduced in [22] to analyze facial muscles and thus identify different emotions. The proposed system automatically tracks faces in video and extracts geometric shapes for facial features. The study was conducted on eight patients with schizophrenia and collected dynamic information on facial muscle movements, showing that geometric measurements can be identified for individual faces and their exact differences determined for recognition purposes. Three methods were used in [23] to measure facial expressions, define emotions, and identify persons with mental illness. The study's proposed facial action coding system enabled the interpretation of emotional facial expressions and thus contributed to knowledge of therapeutic intervention for patients with mental illnesses.
Many people suffer from an imbalance in the nervous system that can paralyze movement and cause falls without prior warning. The study [24] aimed to improve the detection and identification rate of early warning signs using a platform (R). Wireless sensor devices were placed on the chest and waist, and the collected data were fed to an algorithm that extracted features and raised an alert when risk was detected. The results showed that a patient at risk engages in specific typical movements that indicate an imminent fall. The authors further suggested applying this algorithm to patients with seizures to warn of an imminent attack and alert the emergency services.
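The cited platform and algorithm are not described in detail; as a hedged illustration of the general idea, a pre-fall signature can be sketched as a free-fall dip in accelerometer magnitude followed closely by an impact spike. The thresholds and window below are invented for illustration and are not taken from [24]:

```python
import math

def magnitude(sample):
    """Euclidean norm of a 3-axis accelerometer reading (in g units)."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall_risk(samples, low_g=0.4, high_g=2.0, window=10):
    """Flag a free-fall phase (magnitude near 0 g) followed shortly by an
    impact spike (high g) -- a common pre-fall signature. Thresholds and
    window size are illustrative, not from the cited study."""
    free_fall_at = None
    for i, s in enumerate(samples):
        m = magnitude(s)
        if m < low_g:
            free_fall_at = i           # candidate free-fall onset
        elif free_fall_at is not None and m > high_g and i - free_fall_at <= window:
            return True                # free fall then impact: raise alert
    return False
```

A real system would run this over a streaming buffer and trigger the emergency alert the study describes; the point here is only the shape of the detection rule.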
In [25], a computational framework was designed to monitor the movements of older adults and signal organ failures and other sudden drops in vital body functions. The system monitored the patient's activity and determined its level using sensors placed on different body parts; experiments show that it identifies the correct locations in real time with an accuracy of 95.8%. Another data-analysis approach was presented in [26] for an intelligent home that uses sensors to monitor residents' movements and behaviors. The system helps detect behaviors and forecast diseases or injuries that residents, especially older people, may experience, and it is helpful for doctors providing remote care and monitoring their patients' progress. The target-object capture model proposed in [27] is based on a candidate region proposal network that detects the manipulator's grasp position by combining color and depth image information using deep learning; it achieved a 94.3% grasp-detection success rate on multiple target-detection datasets. Paper [28] deals with the elderly and their struggle to continue living independently without relying on the support of others. The project compared machine-learning algorithms used to monitor their body functions and movements; among the eight algorithms studied, the support vector machine achieved the highest accuracy, 95%, using the reference traits. Some jobs require prolonged sitting, which can result in long-term spinal injury and nervous disease. Earlier surveys helped design sitting position monitoring systems (SPMS) that assess the seated person's position using sensors attached to the chair, but the proposed methods required too many sensors. This problem was resolved in [29], which designed an SPMS that needs only four such sensors.
This improved system defined six different sitting positions through several machine-learning algorithms applied to average body-weight measurements. The positions were then analyzed and classified with the approach yielding the highest accuracy, which ranged from 97.20% to 97.94%. In most hospitals, medical staff face anxiety when treating patients with mental illness, regarding potential bodily harm, staff risks, and damage to hospital equipment. The study [30] devised a method to analyze the patient's movements and identify the risk of harmful behavior by extracting visual data from cameras installed in patients' rooms. The proposed method traced movement points, accumulated them, and extracted their properties, analyzing the movement points by spacing, position, and speed. The study concluded that the method could also be used to explore features for other purposes, such as analyzing the nature of a disease and determining its level of progression. In [31], wireless intelligent sensor applications and devices were designed to care for patient health, provide better monitoring, and facilitate diagnosis. Wireless sensors installed on the body periodically monitor the patient's health, update the information, and send it to the service center. The researchers investigated a multi-level decision system (MDS) to monitor patient behaviors and match them against stored historical data, allowing decision makers in medical centers to give treatment recommendations. The proposed system could also record new cases, store new disease data, and reduce the effort and time doctors spend examining patients. The results proved the MDS accurate and reliable in predicting and monitoring patients.
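As a rough sketch of how four chair-sensor weight readings might be mapped to sitting positions, a nearest-centroid classifier is shown below. The sensor layout, posture labels, and centroid values are hypothetical; the cited study's actual algorithms are not reproduced here:

```python
import math

# Hypothetical per-posture centroids: mean readings of four chair sensors
# (front-left, front-right, rear-left, rear-right) in kg. The values are
# invented for illustration only, not taken from the cited SPMS study.
CENTROIDS = {
    "upright":      (15, 15, 20, 20),
    "lean_left":    (22, 8, 25, 12),
    "lean_right":   (8, 22, 12, 25),
    "lean_forward": (25, 25, 10, 10),
    "lean_back":    (8, 8, 28, 28),
    "cross_legged": (18, 10, 24, 16),
}

def classify_posture(reading):
    """Assign a 4-sensor weight reading to the nearest posture centroid
    by Euclidean distance."""
    return min(CENTROIDS, key=lambda c: math.dist(reading, CENTROIDS[c]))
```

In practice the centroids would be learned from labeled recordings, and the studies above report that more elaborate classifiers reach 97% or better; the sketch only shows why four weight readings can suffice to separate postures.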
The study [32] applied the Short-Time Fourier Transform (STFT) to monitor the patient's movements and voice through sensors and microphones. The system transmitted sound and accelerometer data and analyzed them to identify the patient's condition, achieving high accuracy. Three experiments on recognizing full-body expressions were conducted in [33]. The first matched body expressions across all emotions, where fear was the most difficult emotion to express; the second focused on facial expressions, which were strongly influenced by physical expression and, as a result, ambiguous; the last examined tone of voice to identify emotional feelings related to the body. It was concluded that pooling the results of the three experiments is essential to reveal true body expression.
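For readers unfamiliar with the STFT used in [32], the idea is to slide a window over the signal and compute a spectrum for each window, so that changes in movement or voice show up as changes in the per-frame spectra. A minimal pure-Python sketch (naive DFT, with illustrative frame and hop sizes rather than the study's parameters) might look like:

```python
import cmath

def dft(frame):
    """Naive discrete Fourier transform of one frame."""
    n = len(frame)
    return [sum(frame[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def stft(signal, frame_len=8, hop=4):
    """Short-Time Fourier Transform: magnitude spectrum per frame.
    frame_len and hop are illustrative choices; real systems also apply
    a window function (e.g., Hann) before the DFT."""
    spectra = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        spectra.append([abs(c) for c in dft(frame)])
    return spectra
```

A production system would use an FFT library rather than this O(n^2) DFT, but the frame-hop-spectrum structure is the same.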
A valuable study at MIT [34] developed a system that detects pain in patients by analyzing brain-activity data from a wearable device that scans brain nerves, which helps diagnose and treat patients who have lost consciousness or the sense of touch. The researchers placed several fNIRS sensors on the patient's forehead to measure the activity of the frontal lobe and developed ML models to determine the levels of oxygenated hemoglobin related to pain. The results showed that pain was detected with an accuracy of 87%.
The study [35] considered the heartbeat a type of body language; checking a patient's heartbeat constitutes a crucial medical examination tool. The researchers proposed a one-dimensional (1D) convolutional neural network (CNN) model that classified the vibrational signals of regular and irregular heartbeats from an electrocardiogram. The model used a de-noising auto-encoder (DAE) algorithm, and the results showed that it classified the heart's sound signals with an accuracy of up to 99%.
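The cited model's DAE and 1D CNN are far more elaborate than can be shown here, but their two core operations, denoising the signal and extracting features with a one-dimensional convolution, can be sketched in miniature. The moving average below is a crude stand-in for the auto-encoder, not the method of [35]:

```python
def moving_average(signal, k=3):
    """Crude stand-in for the denoising step: smooth with a window of k
    samples (shrunk at the edges)."""
    half = k // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation), the core operation
    a 1D CNN layer applies to an ECG trace."""
    n, m = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(m))
            for i in range(n - m + 1)]
```

In the real model, many such kernels are learned from data and stacked with nonlinearities and pooling; the sketch only shows what "1D convolution over a heartbeat signal" means mechanically.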
We can conclude from this study that reading and understanding body language through AI will help automatically detect epidemic diseases. The sheer number of epidemic patients is a significant obstacle to detecting every infected person; the most prominent current example is COVID-19. Developed, middle-income, and developing countries alike have struggled to test for the disease because of the number of infected people and the speed of its spread; infections increased so quickly that detection could not keep up. We suggest conducting a study to determine the movements and gestures of the body associated with epidemic diseases such as COVID-19; an epidemic disease will plausibly produce unique and distinct movements in some body parts. Thermal cameras that detect high body temperature certainly play a significant role in flagging a patient with disease, but, first, they cannot determine which disease is present; second, a patient with an epidemic disease may not show a significantly raised temperature; and third, the fever may appear only after a delay, when the patient is already at a critical stage. We therefore focus in this study on the body language of some epidemics, especially COVID-19, which changed our lives for the worse. We have learned a harsh lesson from this deadly enemy: not to stand still. We must help our people, our countries, and the world defend against and attack this disease. Hence, we propose studying body language using AI, and in the upcoming studies on which we are currently working, we hope to collect and identify the gestures of body parts that characterize the epidemic.
Table 1 lists studies that have used ML to discover diseases and symptoms through gestures, hand movements, and facial expressions. The table shows that CNN algorithms are the most common and efficient methods of identifying disease symptoms through facial expressions and hand gestures. Some studies indicate that analyzing other body parts is also helpful in identifying certain diseases using other ML algorithms, such as SVM and LSTM. It appears that combining the proposed CNN algorithm with a newly proposed algorithm for determining facial expressions would yield high-quality results for detecting some epidemic diseases. It is essential first to study the symptoms that characterize an epidemic disease and how they are reflected in body expressions, and then to apply the ML algorithm that identifies these expressions most efficiently.
The studies in Table 1 are classified as follows:
This study aims to survey research using ML algorithms to identify body features, movements, and expressions. Each movement is affected by disease, and each disease has a distinct and different effect on the body: some body parts undergo changes that point to a specific disease. Thus, we propose that ML algorithms capture images of body movements and expressions, analyze them, and identify diseases. This study surveyed a selection of existing studies that use different ML algorithms to detect body movements and expressions. Since these studies do not address epidemics, this study seeks to document the use of ML algorithms in discovering epidemics such as COVID-19. Our survey concludes that the results achieved indicate the possibility of identifying body movements and expressions, and that ML, particularly convolutional neural networks, is the most proficient at determining body language.
From an epidemiological, diagnostic, and pharmacological standpoint, AI has yet to play a substantial part in the fight against coronavirus. Its application is limited by a shortage of data, outlier data, and an abundance of noise. It is vital to create unbiased time series data for AI training. While the expanding number of worldwide activities in this area is promising, more diagnostic testing is required, not just for supplying training data for AI models but also for better controlling the epidemic and lowering the cost of human lives and economic harm. Clearly, data are crucial in determining if AI can be used to combat future diseases and pandemics. As [91] previously stated, the risk is that public health reasons will override data privacy concerns. Long after the epidemic has passed, governments may choose to continue the unparalleled surveillance of their population. As a result, worries regarding data privacy are reasonable.
According to patient surveys, communication is one of the most crucial skills a physician should have. However, communication encompasses more than just what is spoken. From the time a patient first visits a physician, their nonverbal communication, or "body language", determines the course of therapy. Body language encompasses all nonverbal forms of communication, including posture, facial expression, and body movements. Being aware of such habits can give doctors more access to their patients. Patient involvement, compliance, and outcomes can all be influenced by effective nonverbal communication.
Pandemic and epidemic illnesses are a worldwide threat that might kill millions, and doctors have limited abilities to recognize and treat victims. Human and technological resources remain in short supply under epidemic and pandemic conditions. To improve the treatment process, and when the patient cannot travel to the treatment location, remote diagnosis is necessary and the patient's status should be examined automatically. Altered facial wrinkles, movements of the eyes and eyebrows, protrusion of the nose, changes in the lips, and certain motions of the hands, shoulders, chest, head, and other areas of the body are all characteristics of pandemic and epidemic illnesses. AI technology has shown promise in understanding these motions and cues in some cases. As a result, the idea arose of applying body language to identify epidemic diseases in patients early, treat them sooner, and assist doctors in recognizing them, given the speed with which such diseases spread and kill. It should be emphasized that COVID-19, which horrified the entire world and revolutionized everyday life, was the significant and crucial motivator for this study, after we reviewed body-language analysis research in healthcare and defined an automatic recognition framework using AI to recognize various body-language elements.
As researchers in information technology and computer science, we must contribute to discussing an automatic gesture recognition model that helps better identify the external symptoms of epidemic and pandemic diseases to help humanity.
First author’s research has been supported by Grant RMCG20-023-0023, Malaysia International Islamic University, and the second author’s work has been endorsed by the United Arab Emirates University Start-Up Grant 31T137.
This research was funded by Grant RMCG20-023-0023, Malaysia International Islamic University, and United Arab Emirates University Start-Up Grant 31T137.
Conceptualization, R.A. and S.T.; methodology, R.A.; software, R.A.; validation, R.A. and S.T.; formal analysis, R.A.; investigation, M.A.H.A.; resources, M.A.H.A.; data curation, R.A.; writing—original draft preparation, R.A.; writing—review and editing, S.T.; visualization, M.A.H.A.; supervision, R.A. and S.T.; project administration, R.A. and S.T.; funding acquisition, R.A. and S.T. All authors have read and agreed to the published version of the manuscript.
The authors declare no conflict of interest.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Microexpressions
Reviewed by Psychology Today Staff
Body language is a silent orchestra, as people constantly give clues to what they're thinking and feeling. Non-verbal messages, including body movements, facial expressions, vocal tone and volume, and other signals, are collectively known as body language.
Microexpressions (brief displays of emotion on the face), hand gestures, and posture all register in the human brain almost immediately, even when a person is not consciously aware of having perceived anything. For this reason, body language can strongly color how an individual is perceived, and how he or she, in turn, interprets others' motivation, mood, and openness. It's natural to mirror; beginning as early as infancy, a newborn moves its body to the rhythm of the voice it hears.
Body language is a vital form of communication, but most of it happens below the level of conscious awareness. When you wait for a blind date to arrive, for instance, you may nervously tap your foot without even realizing that you’re doing it. Similarly, when you show up to meet your date, you may not consciously perceive that your date appears closed-off, but your unconscious mind may pick up on the crossed arms or averted gaze. Luckily, with knowledge and a little practice, it is possible to exert some measure of control over your own body language and to become more skilled at reading others.
The face is the first place to look: arching eyebrows might indicate an invitation of sorts, and smiling is another sign that the person welcomes you. Is the person standing or sitting close to you? If so, there is interest. Plus, open arms are just that: open.
If a person repeatedly touches your arm, places a light hand on your shoulder, or knocks elbows with you, the person is attracted to you and is demonstrating this with increased touch. People interested in each other smile more, and their mouths may even be slightly open. Engaging in eye contact is another indication. A person who leans towards you or mirrors your body language is also demonstrating interest.
A common form of body language is mirroring another person’s gestures and mannerisms; mirroring also includes mimicking another person’s speech patterns and even attitudes. This is a method of building rapport with others. We learn through imitating others, and it is mostly an unconscious action.
When you want to persuade or influence a person, mirroring can be an effective way to build rapport. Salespeople who use this technique with prospective clients pay close attention to them: they listen, observe, and mimic, with positive results.
People who are attracted to one another indeed copy each other’s movements and mannerisms. In fact, many animals mirror as well. That is why cats circle each other, and why chimpanzees stare at each other before intercourse.
If you tilt your head while looking at a baby, the baby relaxes. Why is that? The same applies to couples in love: tilting the head exposes the neck and perhaps shows vulnerability. A person with a tilted head is perceived as more interested, attentive, and caring, and as having less of an agenda.
Eye blocking, or covering your eyes, expresses emotions such as frustration and worry. Sometimes the eyelids shut to show determination, and sometimes they flutter to show that you have screwed up and feel embarrassed.
When you’re stressed out, touching or stroking the neck signals a pacifying behavior. We all rub our necks at the back, the sides, and also under the chin. The fleshy area under the chin has nerve endings and stroking it lowers heart rate and calms us.
The hands reveal a lot about a person. When you feel confident, the space between your fingers grows, but that space lessens when you feel insecure. And while rubbing the hands conveys stress, steepling the fingers means that a person feels confident.
In many cultures, a light touch on the arm conveys harmony and trust. In one study, people in the UK, the US, France, and Puerto Rico were observed while sitting at a coffee shop. The British and the Americans hardly touched, while the French and the Puerto Ricans touched freely in togetherness.
Crossing your legs while standing can make others feel comfortable: it shows you are interested in what the other person has to say and signals, "Take your time." The standing crossed-legs posture helps convey that you are at ease with the other person.
Fidgety hands mean anxiousness or even boredom and keeping your arms akimbo may telegraph arrogance. Crossing the arms and legs is, no doubt, a closed position. Whereas sitting with open arms invites the other person in. If you are sitting and want to appear neutral, it’s best to hold your hands on your lap, just like the Queen of England.
Shake hands firmly while making eye contact, but do not squeeze the person's hand; your goal is to make someone feel comfortable, not to assert dominance. It is important to be sensitive to cultural norms: if you receive a weak handshake, it may be that the person comes from a background in which a gentle handshake is the norm.
Most people think that crossed arms are a sign of aggression or refusal to cooperate. In fact, crossed arms can signal many other things, including anxiety, self-restraint, and even interest, if the person crossing their arms is mirroring someone who is doing the same.
For the most part, yes. All primates demonstrate behaviors including the freeze response and various self-soothing behaviors, such as touching the neck or twirling the hair in humans. We know that many non-verbal behaviors are innate because even blind children engage in them. Still, some behaviors are mysteries.
In males, wide shoulders and narrow hips are associated with strength and vitality; this is reflected in everything from the form of Greek statues to padded shoulders in men's suit jackets. How one holds one's shoulders conveys dominance and relative status within a hierarchy.
Freezing in place, rocking back and forth, and contorting into a fetal position are all known as "reserved behaviors," as they are used only when a person experiences extreme stress. Facial expressions alone can signal this state, such as pursing or sucking in the lips, often seen when a person is upset or feels contrite.
As social animals, we evolved to display emotions, thoughts, and intentions, all of which are processed by the brain's limbic system. Because these reactions precede and at times even override conscious deliberation, body language is uniquely capable of revealing how a person feels, but only if another person is schooled in what these gestures indicate.
Nature Reviews Neuroscience, volume 7, pages 242–249 (2006).
People's faces show fear in many different circumstances. However, when people are terrified, as well as showing emotion, they run for cover. When we see a bodily expression of emotion, we immediately know what specific action is associated with a particular emotion, leaving little need for interpretation of the signal, as is the case for facial expressions. Research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are automatically perceived and understood, and their role in emotional communication and decision-making.
Sprengelmeyer, R. et al. Knowing no fear. Proc. Biol. Sci. 266 , 2451–2456 (1999).
de Gelder, B., Snyder, J., Greve, D., Gerard, G. & Hadjikhani, N. Fear fosters flight: a mechanism for fear contagion when perceiving emotion expressed by a whole body. Proc. Natl Acad. Sci. USA 101 , 16701–16706 (2004).
Dittrich, W. H., Troscianko, T., Lea, S. E. & Morgan, D. Perception of emotion from dynamic point-light displays represented in dance. Perception 25 , 727–738 (1996).
Atkinson, A. P., Dittrich, W. H., Gemmell, A. J. & Young, A. W. Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33 , 717–746 (2004).
Ekman, P. Differential communication of affect by head and body cues. J. Pers. Soc. Psychol. 2 , 726–735 (1965).
Panksepp, J. Affective Neuroscience: The Foundation of Human and Animal Emotions (Oxford Univ. Press, New York, 1998).
Bulwer, J. Chirologia, or the Natural Language of the Hand (Harper, London, 1644).
Bell, C. The Anatomy and Philosophy of Expression as Connected with the Fine Arts (George Bell and Sons, London, 1806).
Gratiolet, L. P. Mémoire sur les plis Cérébraux de l'homme et des Primates (A. Bertrand, Paris, 1854).
Duchenne de Boulogne, G. B. Mécanismes de la Physionomie Humaine, ou Analyse Electro-physiologique de l'expression des Passions (Baillière, Paris, 1862).
Darwin, C. The Expression of the Emotions in Man and Animals (John Murray, London, 1872).
Frijda, N. H. The Emotions (Cambridge Univ. Press, Cambridge, 1986).
Schmidt, K. L. & Cohn, J. F. Human facial expressions as adaptations: evolutionary questions in facial expression research. Am. J. Phys. Anthropol. 33 (Suppl.), 3–24 (2001).
Davidson, R. J. & Irwin, W. The functional neuroanatomy of emotion and affective style. Trends Cogn. Sci. 3 , 11–21 (1999).
Damasio, A. R. The Feeling of What Happens (Harcourt Brace, New York, 1999).
LeDoux, J. E. The Emotional Brain: The Mysterious Underpinnings of Emotional Life 384 (Simon and Schuster, New York, USA, 1996).
Zald, D. H. The human amygdala and the emotional evaluation of sensory stimuli. Brain Res. Brain Res. Rev. 41 , 88–123 (2003).
Phelps, E. A. & LeDoux, J. E. Contributions of the amygdala to emotion processing: from animal models to human behavior. Neuron 48 , 175–187 (2005).
Brothers, L. The neural basis of primate social communication. Motiv. Emot. 14 , 81–91 (1990).
Haxby, J. V., Hoffman, E. A. & Gobbini, M. I. The distributed human neural system for face perception. Trends Cogn. Sci. 4 , 223–233 (2000).
de Gelder, B., Frissen, I., Barton, J. & Hadjikhani, N. A modulatory role for facial expressions in prosopagnosia. Proc. Natl Acad. Sci. USA 100 , 13105–13110 (2003).
Rotshtein, P., Malach, R., Hadar, U., Graif, M. & Hendler, T. Feeling or features: different sensitivity to emotion in high-order visual cortex and amygdala. Neuron 32 , 747–757 (2001).
Adolphs, R. Neural systems for recognizing emotion. Curr. Opin. Neurobiol. 12 , 169–177 (2002).
Emery, N. J. & Amaral, D. G. in Cognitive Neuroscience of Emotion (eds Lane, R. D., Nadel, L. & Ahern, G.) 156–191 (Oxford Univ. Press, New York, 2000).
Graziano, M. S. & Cooke, D. F. Parieto-frontal interactions, personal space, and defensive behavior. Neuropsychologia 8 Nov 2005 (10.1016/j.neuropsychologia.2005.09.009).
Argyle, M. Bodily Communication 363 (Methuen, London, 1988).
de Meijer, M. The contribution of general features of body movement to the attribution of emotions. J. Nonverbal Behav. 13 , 247–268 (1989).
Reed, C. L., Stone, V. E., Bozova, S. & Tanaka, J. The body-inversion effect. Psychol. Sci. 14 , 302–308 (2003).
Stekelenburg, J. J. & de Gelder, B. The neural correlates of perceiving human bodies: an ERP study on the body-inversion effect. Neuroreport 15 , 777–780 (2004).
Perrett, D. I., Hietanen, J. K., Oram, M. W. & Benson, P. J. Organization and functions of cells responsive to faces in the temporal cortex. Phil. Trans. R. Soc. Lond. B 335 , 23–30 (1992).
Rizzolatti, G., Fadiga, L., Gallese, V. & Fogassi, L. Premotor cortex and the recognition of motor actions. Brain Res. Cogn. Brain Res. 3 , 131–141 (1996).
Downing, P. E., Jiang, Y., Shuman, M. & Kanwisher, N. A cortical area selective for visual processing of the human body. Science 293 , 2470–2473 (2001).
Bonda, E., Petrides, M., Ostry, D. & Evans, A. Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J. Neurosci. 16 , 3737–3744 (1996).
Hadjikhani, N. & de Gelder, B. Seeing fearful body expressions activates the fusiform cortex and amygdala. Curr. Biol. 13 , 2201–2205 (2003).
Peelen, M. V. & Downing, P. E. Selectivity for the human body in the fusiform gyrus. J. Neurophysiol. 93 , 603–608 (2005).
Johnson, M. H. Subcortical face processing. Nature Rev. Neurosci. 6 , 766–774 (2005).
Gliga, T. & Dehaene-Lambertz, G. Structural encoding of body and face in human infants and adults. J. Cogn. Neurosci. 17 , 1328–1340 (2005).
Meeren, H. K. M., Hadjikhani, N., Ahlfors, S. P., Hamalainen, M. S. & de Gelder, B. in Human Brain Mapping (Florence, Italy, 2006).
Meeren, H. K., van Heijnsbergen, C. C. & de Gelder, B. Rapid perceptual integration of facial expression and emotional body language. Proc. Natl Acad. Sci. USA 102 , 16518–16523 (2005).
Bentin, S., Sagiv, N., Mecklinger, A., Friederici, A. & von Cramon, Y. D. Priming visual face-processing mechanisms: electrophysiological evidence. Psychol. Sci. 13 , 190–193 (2002).
Damasio, A. R. et al. Subcortical and cortical brain activity during the feeling of self-generated emotions. Nature Neurosci. 3 , 1049–1056 (2000).
Singer, T. et al. Empathy for pain involves the affective but not sensory components of pain. Science 303 , 1157–1162 (2004).
Kilts, C. D., Egan, G., Gideon, D. A., Ely, T. D. & Hoffman, J. M. Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage 18 , 156–168 (2003).
Kourtzi, Z. & Kanwisher, N. Activation in human MT/MST by static images with implied motion. J. Cogn. Neurosci. 12 , 48–55 (2000).
Bertenthal, B. I., Proffitt, D. R. & Kramer, S. J. Perception of biomechanical motions by infants: implementation of various processing constraints. J. Exp. Psychol. Hum. Percept. Perform. 13 , 577–585 (1987).
Jellema, T. & Perrett, D. I. Cells in monkey STS responsive to articulated body motions and consequent static posture: a case of implied motion? Neuropsychologia 41 , 1728–1737 (2003).
Vallortigara, G., Regolin, L. & Marconato, F. Visually inexperienced chicks exhibit spontaneous preference for biological motion patterns. PLoS Biol. 3 , e208 (2005).
Heberlein, A. S. & Adolphs, R. Impaired spontaneous anthropomorphizing despite intact perception and social knowledge. Proc. Natl Acad. Sci. USA 101 , 7487–7491 (2004).
Anderson, A. K. & Phelps, E. A. Expression without recognition: contributions of the human amygdala to emotional communication. Psychol. Sci. 11 , 106–111 (2000).
Astafiev, S. V., Stanley, C. M., Shulman, G. L. & Corbetta, M. Extrastriate body area in human occipital cortex responds to the performance of motor actions. Nature Neurosci. 7 , 542–548 (2004).
Jeannerod, M. The Cognitive Neuroscience of Action (Blackwell, Oxford, 1997).
Gallese, V., Fadiga, L., Fogassi, L. & Rizzolatti, G. Action recognition in the premotor cortex. Brain 119 , 593–609 (1996).
Grèzes, J. et al. Does perception of biological motion rely on specific brain regions? Neuroimage 13 , 775–785 (2001).
Allison, T., Puce, A. & McCarthy, G. Social perception from visual cues: role of the STS region. Trends Cogn. Sci. 4 , 267–278 (2000).
Grèzes, J., Armony, J. L., Rowe, J. & Passingham, R. E. Activations related to 'mirror' and 'canonical' neurones in the human brain: an fMRI study. Neuroimage 18 , 928–937 (2003).
Fadiga, L., Fogassi, L., Gallese, V. & Rizzolatti, G. Visuomotor neurons: ambiguity of the discharge or 'motor' perception? Int. J. Psychophysiol. 35 , 165–177 (2000).
Rizzolatti, G. & Craighero, L. The mirror-neuron system. Annu. Rev. Neurosci. 27 , 169–192 (2004).
Gallese, V., Keysers, C. & Rizzolatti, G. A unifying view of the basis of social cognition. Trends Cogn. Sci. 8 , 396–403 (2004).
Wicker, B. et al. Both of us disgusted in my insula: the common neural basis of seeing and feeling disgust. Neuron 40 , 655–664 (2003).
Carr, L., Iacoboni, M., Dubeau, M. C., Mazziotta, J. C. & Lenzi, G. L. Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas. Proc. Natl Acad. Sci. USA 100 , 5497–5502 (2003).
Grosbras, M. H. & Paus, T. Brain networks involved in viewing angry hands or faces. Cereb. Cortex 12 Oct 2005 (10.1093/cercor/bhj050).
Grèzes, J., Pichon, S. & de Gelder, B. in Human Brain Mapping (Florence, Italy, 2006).
Tamietto, M., Latini, L., Weiskrantz, L., Guiliani, G. & de Gelder, B. Non-conscious recognition of faces and bodies in blindsight. 28th Cognitive Neuropsychology Workshop, Bressanone, Italy, 22–27 Jan 2006.
Pegna, A. J., Khateb, A., Lazeyras, F. & Seghier, M. L. Discriminating emotional faces without primary visual cortices involves the right amygdala. Nature Neurosci. 8 , 24–25 (2005).
de Gelder, B., Morris, J. S. & Dolan, R. J. Unconscious fear influences emotional awareness of faces and voices. Proc. Natl Acad. Sci. USA 102 , 18682–18687 (2005).
Anders, S. et al. Parietal somatosensory association cortex mediates affective blindsight. Nature Neurosci. 7 , 339–340 (2004).
Hamm, A. O. et al. Affective blindsight: intact fear conditioning to a visual cue in a cortically blind patient. Brain 126 , 267–275 (2003).
de Gelder, B., Vroomen, J., Pourtois, G. & Weiskrantz, L. Affective blindsight: are we blindly led by emotions? Response to Heywood and Kentridge (2000). Trends Cogn. Sci. 4 , 126–127 (2000).
Adolphs, R., Damasio, H., Tranel, D., Cooper, G. & Damasio, A. R. A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J. Neurosci. 20 , 2683–2690 (2000).
Dolan, R. J. Emotion, cognition and behavior. Science 298 , 1191–1194 (2002).
Magnée, M. J. C. M., de Gelder, B., Van Engeland, H. & Kemner, C. Facial EMG and affect processing in pervasive developmental disorder. Paper presented at the 4th International Meeting For Autism Research, Boston, Massachusetts, USA, 5–7 May 2005.
Bonelli, R. M., Kapfhammer, H. P., Pillay, S. S. & Yurglun-Todd, D. Basal ganglia volumetric studies in affective disorder: what did we learn in the last 15 years? J. Neural Transm. 113 , 255–268 (2006).
Van den Stock, J., de Gelder, B., De Diego Balaguer, R. & Bachoud-Lévi, A. -C. Huntington's disease impairs recognition of facial expression but also of body language. Paper presented at the 14th Conference of the European Society for Cognitive Psychology, Leiden, The Netherlands, 31 Aug–3 Sep 2005.
Morris, J. S., Ohman, A. & Dolan, R. J. Conscious and unconscious emotional learning in the human amygdala. Nature 393 , 467–470 (1998).
LeDoux, J. E. Emotion circuits in the brain. Annu. Rev. Neurosci. 23 , 155–184 (2000).
Schiller, P. H. & Koerner, F. Discharge characteristics of single units in superior colliculus of the alert rhesus monkey. J. Neurophysiol. 34 , 920–936 (1971).
Dean, P., Redgrave, P. & Westby, G. W. Event or emergency? Two response systems in the mammalian superior colliculus. Trends Neurosci. 12 , 137–147 (1989).
Sah, P., Faber, E. S., Lopez De Armentia, M. & Power, J. The amygdaloid complex: anatomy and physiology. Physiol. Rev. 83 , 803–834 (2003).
Davis, M. & Whalen, P. J. The amygdala: vigilance and emotion. Mol. Psychiatry 6 , 13–34 (2001).
Cardinal, R. N., Parkinson, J. A., Hall, J. & Everitt, B. J. Emotion and motivation: the role of the amygdala, ventral striatum, and prefrontal cortex. Neurosci. Biobehav. Rev. 26 , 321–352 (2002).
Bechara, A. Decision making, impulse control and loss of willpower to resist drugs: a neurocognitive perspective. Nature Neurosci. 8 , 1458–1463 (2005).
Everitt, B. J. & Robbins, T. W. Neural systems of reinforcement for drug addiction: from actions to habits to compulsion. Nature Neurosci. 8 , 1481–1489 (2005).
Delgado, M. R., Miller, M. M., Inati, S. & Phelps, E. A. An fMRI study of reward-related probability learning. Neuroimage 24 , 862–873 (2005).
Giese, M. A. & Poggio, T. Neural mechanisms for the recognition of biological movements. Nature Rev. Neurosci. 4 , 179–192 (2003).
Flash, T. & Hochner, B. Motor primitives in vertebrates and invertebrates. Curr. Opin. Neurobiol. 15 , 660–666 (2005).
Casile, A. & Giese, M. A. Critical features for the recognition of biological motion. J. Vis. 5 , 348–360 (2005).
Weiskrantz, L. Behavioral changes associated with ablation of the amygdaloid complex in monkeys. J. Comp. Physiol. Psychol. 49 , 381–391 (1956).
Amaral, D. G. & Price, J. L. Amygdalo-cortical projections in the monkey ( Macaca fascicularis ). J. Comp. Neurol. 230 , 465–496 (1984).
Amaral, D. G. & Insausti, R. Retrograde transport of D-[3H]-aspartate injected into the monkey amygdaloid complex. Exp. Brain Res. 88 , 375–388 (1992).
Ghashghaei, H. T. & Barbas, H. Pathways for emotion: interactions of prefrontal and anterior temporal pathways in the amygdala of the rhesus monkey. Neuroscience 115 , 1261–1279 (2002).
Brothers, L., Ring, B. & Kling, A. Response of neurons in the macaque amygdala to complex social stimuli. Behav. Brain Res. 41 , 199–213 (1990).
Adolphs, R. The neurobiology of social cognition. Curr. Opin. Neurobiol. 11 , 231–239 (2001).
Amaral, D. G. The amygdala, social behavior, and danger detection. Ann. NY Acad. Sci. 1000 , 337–347 (2003).
Bauman, M. D., Lavenex, P., Mason, W. A., Capitanio, J. P. & Amaral, D. G. The development of social behavior following neonatal amygdala lesions in rhesus monkeys. J. Cogn. Neurosci. 16 , 1388–1411 (2004).
Emery, N. J. & Clayton, N. S. The mentality of crows: convergent evolution of intelligence in corvids and apes. Science 306 , 1903–1907 (2004).
Adolphs, R., Tranel, D. & Damasio, A. R. The human amygdala in social judgment. Nature 393 , 470–474 (1998).
Hadland, K. A., Rushworth, M. F., Gaffan, D. & Passingham, R. E. The effect of cingulate lesions on social behaviour and emotion. Neuropsychologia 41 , 919–931 (2003).
Bachevalier, J. Brief report: medial temporal lobe and autism: a putative animal model in primates. J. Autism Dev. Disord. 26 , 217–220 (1996).
Amaral, D. G. & Corbett, B. A. The amygdala, autism and anxiety. Novartis Found. Symp. 251 , 177–187; discussion 187–197, 281–197 (2003).
Morris, J. S., Ohman, A. & Dolan, R. J. A subcortical pathway to the right amygdala mediating 'unseen' fear. Proc. Natl Acad. Sci. USA 96 , 1680–1685 (1999).
Ohman, A. The role of the amygdala in human fear: automatic detection of threat. Psychoneuroendocrinology 30 , 953–958 (2005).
Vuilleumier, P., Richardson, M. P., Armony, J. L., Driver, J. & Dolan, R. J. Distant influences of amygdala lesion on visual cortical activation during emotional face processing. Nature Neurosci. 7 , 1271–1278 (2004).
Rizzolatti, G., Fogassi, L. & Gallese, V. Neurophysiological mechanisms underlying the understanding and imitation of action. Nature Rev. Neurosci. 2 , 661–670 (2001).
Dapretto, M. et al. Understanding emotions in others: mirror neuron dysfunction in children with autism spectrum disorders. Nature Neurosci. 9 , 28–30 (2006).
Hadjikhani, N., Joseph, R. M., Snyder, J. & Tager-Flusberg, H. Anatomical differences in the mirror neuron system and social cognition network in autism. Cereb. Cortex 23 Nov 2005 (10.1093/cercor/bhj069).
Martin, J. H. Neuroanatomy: Text and Atlas 2nd edn (Appleton & Lange, Stamford, Connecticut, 1996).
Preparation of this manuscript was partly funded by a grant from The Human Frontier Science Program (HFSP) and by the MIND Foundation, the Martinos NMR-MGH Center, Harvard Medical School and Nederland Wetenschappelijk Onderzoek (NWO)-Dutch Science Foundation. I am very grateful to my collaborators in the joint studies reviewed here, to J. Van den Stock for assistance with the manuscript and to anonymous reviewers who provided valuable suggestions.
Authors and affiliations.
the Cognitive and Affective Neurosciences Laboratory, Tilburg University, 5000 LE Tilburg, The Netherlands
Beatrice de Gelder
Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Room 417, Building 36, First Street, Charlestown, 02129, Massachusetts, USA
Harvard Medical School, Charlestown
Competing interests.
The author declares no competing financial interests.
de Gelder, B. Towards the neurobiology of emotional body language. Nat Rev Neurosci 7 , 242–249 (2006). https://doi.org/10.1038/nrn1872
Issue date: 01 March 2006.
by Imed Bouchrika, PhD, Co-Founder and Chief Data Scientist
Many situations in our day-to-day lives entail the use of nonverbal communication. According to Phutela (2015), nonverbal communication, such as body language and facial expressions, greatly affects social environments and the communication process.
Nonverbal communication can be perceived in different forms. A person who waves their hands and gestures while speaking might be perceived as enthusiastic or passionate about the subject. Meanwhile, a posture that is curled inward may suggest anxiety or fear. Reading these cues enables the person perceiving them to respond appropriately and foster stronger relationships. As such, our research team has created a guide on body language. This article will discuss the different types of nonverbal cues for various body parts to help you perceive them in various situations.
Nonverbal communication comprises a major part of the communication process. According to Tipper et al. (2015), in the course of everyday life, people pick up information about others’ thoughts and feelings through body posture, mannerisms, and gestures. In the 1970s, psychology professor Albert Mehrabian also suggested that during communication, the total impact of a message can be largely attributed to nonverbal communication (Pease & Pease, 2006).
The importance of body language as a form of nonverbal communication is also evident in the number of studies exploring the science behind body language and its perception. The following sections discuss a number of these studies to provide a background on body language and its mechanisms.
Body language can refer to a wide range of movements and expressions that provide nonverbal cues to someone who is observing another person. On occasion, body language can reveal the truth of the matter. If someone says, “It’s fine," in response to a situation but frowns as they say it, then it is likely that they are still bothered or worried. Body language can also reinforce verbal statements: a fist made during a rousing speech can suggest determination, and a bowed head can suggest reverence or humility.
According to Atkinson et al. (2004), the movement of the body or its parts has a substantial impact on the way humans communicate. For instance, the face and body can display emotional cues that regulate social behavior (De Gelder, 2004). Slaughter et al. (2004) also explained that reading and processing signals based on the positioning of body parts allows people to detect others’ intentions, motivations, and internal states.
Tipper et al. (2015) further explained that although there is extensive research into the brain systems involved in the perception of body movement, hand gestures, eye movements, and facial expressions, there is little understanding of how the brain understands or reads body language as a whole.
Processing body posture information involves more than visual perception and requires the use of abstract abilities (Tipper et al., 2015). As such, reading body language means not just recognizing socially relevant visual information but also attributing meaning to that information. Most of the processing of nonverbal communication occurs below the level of conscious awareness (Body Language, n.d.). Barrett et al. (2007), for instance, concluded that observers can process emotional information without being aware of it.
There have also been studies that provide direct and indirect evidence for recognition or understanding of stimuli even in the absence of visual awareness, such as in blind patients reliably guessing the emotions conveyed by facial and bodily expressions presented in their blind field (de Gelder et al., 1999).
Various parts of the body can be used to send nonverbal signals in the communication process. Below are common types of body language and their interpretations.
Facial expressions form an integral part of body language, so much so that photographs of faces are the most common stimuli for studies of emotion perception (Atkinson, 2004). Atkinson further states that responses to facial expressions of emotion are highly consistent. In support of this finding, Ekman (2009) explains that certain facial expressions have a universal meaning.
For instance, a person typically smiles when happy, and their face conveys more energy in general. Meanwhile, the lack of a smile is usually taken as a sign of sadness. A person who is afraid will often have raised eyebrows and a taut brow, with a mouth that hangs slightly open (The Body Language of Fear, 2020).
According to a 2008 study by Todorov et al., slightly raised eyebrows and a small smile make up the most trustworthy facial expression.
According to Cherry (2019), taking note of eye movements during conversations is a natural and important part of communication. The scientific study of eye movement, eye behavior, gaze, and eye-related nonverbal communication is referred to as oculesics.
When a person looks directly into the other person’s eyes during a conversation, it indicates that they are paying attention and interested in what is being said. On the other hand, a person who frequently looks away and breaks eye contact during conversation may be distracted or uncomfortable. According to D’Agostino (2013), such behavior can also signify that the person is trying to conceal their true feelings or intentions.
Cherry (2019) also suggested that people blink more rapidly when they feel uncomfortable or distressed. On the other hand, infrequent blinking may indicate that a person is intentionally trying to control their eye movements (Marchak, 2013). On a more subtle note, pupil size can also convey certain emotions. For instance, Jiang et al. (2017) suggested that highly dilated pupils can indicate interest or even arousal.
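Cues like blink rate are exactly what automated body-language pipelines try to quantify. As a minimal, illustrative sketch (not drawn from any of the studies cited here), the widely used eye aspect ratio (EAR) drops sharply when the eye closes; the code assumes six 2D eye landmarks are already available from some facial-landmark detector:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """Eye aspect ratio (EAR) from six eye landmarks p1..p6,
    ordered with p1/p4 as the horizontal corners, p2/p3 on the
    upper lid, and p6/p5 on the lower lid."""
    p1, p2, p3, p4, p5, p6 = (np.asarray(p, dtype=float) for p in eye)
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

def is_blinking(eye, threshold=0.2):
    """A frame is a blink candidate when EAR falls below a tunable
    threshold; real pipelines require several consecutive low-EAR
    frames before counting a blink."""
    return eye_aspect_ratio(eye) < threshold

# Toy landmarks: open eye has lids far apart, closed eye nearly touching.
open_eye = [(0, 0), (1, 2), (2, 2), (3, 0), (2, -2), (1, -2)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]
```

Counting low-EAR frames per minute then gives the blink-rate feature that observations like Cherry's and Marchak's would correspond to in an automated system.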
In 2011, Cruz found that cultural differences are present in interpretations of eye behavior as body language. For instance, in the Anglo-Saxon culture, a lack of eye contact indicates a lack of confidence or truthfulness. In Latino culture, however, using direct, prolonged eye contact can be taken as a challenge or romantic interest.
Aside from smiling, people can also use their mouths to convey a number of emotions (Cherry, 2019). Pursed lips, for instance, can be an indicator of distaste or disapproval. Sometimes, people also bite their lips when they are worried or stressed. A downturned mouth can also be an indicator of sadness or disapproval.
Whole-body posture conveys affect-specific information (Atkinson et al., 2004). Characteristic body movements and postures indicate specific emotional states. This has long been recognized and exploited by actors, directors, and dramatists (Roth, 1990 cited in Atkinson et al., 2004).
According to research by Mondloch et al. (2013), body postures are recognized more easily and accurately when the expressed emotion is contrasted with a different or neutral emotion. For instance, a person who feels angry would show a dominant posture that suggests approach, whereas a fearful person would show an avoidant posture.
A number of studies have explored the implications of posture on body language. For instance, Vacharkulksemsuk (2016) explained that an open posture, with the trunk of the body kept open and exposed, indicates positive emotions of friendliness and willingness. Meanwhile, hiding the trunk of the body and keeping the arms and legs crossed can be an indicator of unfriendliness, anxiety, and hostility. These are cues that, for example, law enforcement officers are trained to recognize.
As another example, Cherry (2019) suggested that a person who sits up straight is focused and paying attention. Sitting with the upper body hunched forward, on the other hand, implies that the person is indifferent or bored.
The positioning of the arms and legs can also indicate emotions. For instance, crossed arms can indicate defensiveness or a desire for self-protection (Foley & Gentile, 2010). Standing with hands placed on the hips can be an expression of aggressiveness or indicate that a person feels ready and in control. Crossed legs can express a need for privacy or a feeling of being closed off.
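Posture cues like crossed arms are what keypoint-based systems detect automatically. The heuristic below is a hypothetical sketch, not taken from the cited work: it assumes (x, y) shoulder and wrist coordinates from some 2D pose estimator, and flags arms as crossed when each wrist has passed the body midline toward the opposite shoulder:

```python
def arms_crossed(left_shoulder, right_shoulder, left_wrist, right_wrist):
    """Illustrative heuristic: arms count as crossed when each wrist
    lies on the same side of the body midline as the *opposite*
    shoulder. Points are (x, y) tuples in image coordinates;
    'left'/'right' are the subject's own left and right."""
    mid = (left_shoulder[0] + right_shoulder[0]) / 2.0
    # A positive product means the two x-offsets share a sign,
    # i.e. the wrist sits on the same side as the opposite shoulder.
    left_over = (left_wrist[0] - mid) * (right_shoulder[0] - mid) > 0
    right_over = (right_wrist[0] - mid) * (left_shoulder[0] - mid) > 0
    return left_over and right_over

# Toy poses: subject's left shoulder at x=2, right shoulder at x=0.
crossed_pose = dict(left_shoulder=(2, 0), right_shoulder=(0, 0),
                    left_wrist=(0.5, 1), right_wrist=(1.5, 1))
open_pose = dict(left_shoulder=(2, 0), right_shoulder=(0, 0),
                 left_wrist=(2.5, 1), right_wrist=(-0.5, 1))
```

A production system would smooth such decisions over many frames and combine them with other cues rather than trusting a single geometric test.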
Gestures also make up a large part of nonverbal communication and, as such, have been extensively studied. According to Kurien (2010), gestures are movements made with body parts, including hands, arms, fingers, legs, and the head. For instance, crossed or folded arms can demonstrate insecurity and a lack of confidence.
Pease & Pease (2006) explain that certain gestures are considered to have a universal meaning. One such gesture is the shoulder shrug, which demonstrates a lack of understanding.
However, while some gestures have universal meanings, culture dictates the meaning of many others. For instance, while it is acceptable to point using one’s index finger in certain cultures, pointing is considered aggressive or offensive by people who share Hindu beliefs (Black, 2011).
In another example of cultural differences among hand gestures, the “thumbs up" gesture is acceptable in countries like Germany, France, South Africa, and the United States. However, the same gesture is insulting in Iran, Bangladesh, and Thailand, where it is the equivalent of showing the middle finger (Black, 2011).
Touch can also be used as a nonverbal means of communication. Haptics is the scientific study of touching and how it is used to communicate. For instance, meaning can be gleaned from the physical contact of handshakes, holding hands, and high fives (Haptics: The Use Of Touch In Communication, 2013). The meaning varies depending on the length of the touch and the location on the body where the touching takes place.
A 2006 study by Hertenstein et al. also found that people are able to accurately interpret distinct emotions from watching others communicate via touch.
According to Cherry (2019), the amount of physical space between individuals can also communicate information. Proxemics refers to the study of measurable distances between people as they interact with one another.
According to Edward T. Hall, who coined the term proxemics in 1966, there are four levels of social distance that can be observed in different social situations.
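Hall's four levels map naturally onto simple distance thresholds, which is how proxemics is typically operationalized in automated analysis. The cutoffs below follow his commonly cited figures (roughly 18 inches, 4 feet, and 12 feet); treat them as approximations, since the exact boundaries vary by source and by culture:

```python
# Approximate boundaries of Hall's proxemic zones, in metres.
PROXEMIC_ZONES = [
    (0.46, "intimate"),  # up to ~18 in
    (1.2, "personal"),   # up to ~4 ft
    (3.7, "social"),     # up to ~12 ft
]

def proxemic_zone(distance_m):
    """Classify an interpersonal distance (in metres) into one of
    Hall's four levels of social distance."""
    for limit, name in PROXEMIC_ZONES:
        if distance_m <= limit:
            return name
    return "public"  # beyond ~12 ft
```

For example, two people conversing about a metre apart would fall in the personal zone, while a lecturer several metres from the audience operates in the public zone.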
Although body language can be an involuntary phenomenon, this form of nonverbal communication is widely used today, especially by individuals pursuing careers in art. There have also been studies on how to read body language, as well as documentation on its application and use in different situations and environments.
For instance, according to Kellerman (1992), kinesic behavior is essential to second-language acquisition, particularly in achieving discourse and sociolinguistic competencies in that language and in performing linguistic tasks. Kellerman explains that a conscious ability to recognize and perform kinesic behavior is necessary for achieving fluency in a second language.
Kret & de Gelder (2013) also studied body language and its perception among violent offenders. The findings of the study indicate that violent offenders have difficulties processing congruences in emotions when aggressive stimuli are involved. Moreover, the study found that violent offenders have a possible bias towards aggressive body language.
Body language can also be a useful aid in a classroom management plan when used as nonlinguistic output for guiding students and paired with verbal methods. In 2014, Tai observed three ways in which body language affects teaching.
While the science of body language is widely accepted and body language can be useful, certain gestures convey negative emotions in a given context (Negative Body Language: Examples & Signs, 2016). These gestures are best avoided by people who want to be mindful of what their bodies are communicating to others.
1. What is the significance of body language in communication? Body language plays a crucial role in communication, often conveying more information than verbal cues. It can reveal true feelings, reinforce verbal messages, and help build stronger relationships by enabling better understanding of others' emotions and intentions.
2. How can facial expressions impact the perception of emotions? Facial expressions are key indicators of emotions. For example, a smile generally signifies happiness, while a frown indicates sadness. Consistent facial cues, like raised eyebrows for fear or slight smiles for trustworthiness, help observers accurately interpret emotional states.
3. Why is eye contact important in conversations? Eye contact signifies attention and interest. Direct eye contact suggests engagement, while frequent breaks in eye contact may indicate discomfort or distraction. Rapid blinking can signal distress, and pupil size changes can reflect emotional responses.
4. How does body posture influence communication? Body posture conveys specific emotions and attitudes. Open postures, such as an exposed trunk, indicate friendliness, whereas closed postures, like crossed arms, suggest defensiveness or hostility. Posture also affects how others perceive a person's confidence and engagement.
5. What role do gestures play in nonverbal communication? Gestures enhance communication by providing additional context and meaning. Some gestures have universal interpretations, such as shoulder shrugs indicating confusion, but cultural differences can affect their meanings. Understanding these nuances is important for effective communication.
6. How does touch communicate different messages? Touch, studied under haptics, varies in meaning based on duration and location. Handshakes, hugs, and high fives convey different levels of intimacy, approval, or camaraderie. Interpreting touch correctly depends on the context and relationship between individuals.
7. What is proxemics, and how does it affect interactions? Proxemics studies the use of physical space in communication. It identifies four social distance zones: public, social, personal, and intimate. These distances reflect the relationship between individuals and their comfort levels in different interactions.
8. How is body language used in professional and educational settings? In professional settings, body language aids in interpreting behaviors and intentions. In education, it helps create a conducive learning environment and aids in teaching by providing nonverbal cues that complement verbal instructions. It is also crucial in second-language acquisition for achieving fluency.
9. What are some common pitfalls of body language? Negative body language, such as checking the time or fidgeting, can convey disinterest, impatience, or lack of confidence. Being aware of these gestures and avoiding them helps maintain positive interactions and effective communication.
By Ramin Skibba/Undark
Posted on Oct 8, 2020 8:00 PM EDT
Ramin Skibba is an astrophysicist turned science writer and freelance journalist who is based in San Diego. This story originally featured on Undark.
Last week, tens of millions of people tuned into the first debate between President Donald J. Trump and former Vice President Joe Biden. Similar viewership is expected for the next two contests—assuming they go ahead following Trump’s COVID-19 diagnosis last week—as well as for Wednesday’s vice presidential debate in Salt Lake City. Along with listening to the candidates’ words, many viewers of the closely watched political spectacles will also pay attention to the debaters’ demeanor, posture, tics, and gestures.
Body language can exude confidence or awkwardness, charisma or anxiety. In recent years, it has also become the subject of a small cottage industry premised on the idea that nonverbal cues can reveal important truths about people in high-stakes situations. News outlets like The Washington Post and Politico interview consultants and bring them on as columnists to analyze speakers’ body language after debates and diplomatic meetings between world leaders. On YouTube, self-appointed experts claiming to read public figures’ expressions sometimes garner millions of views.
Some of this analysis explores how body language can influence audiences. Other times, pundits try to explain what public figures are thinking or feeling based on subtle cues. After Trump and Biden’s first debate, for example, one analyst told The Independent, a British newspaper, that when Biden looked down at his lectern as Trump spoke, it “could be interpreted as submission to the attack” or a sign of self-control.
This work has a more consequential side: Many police departments and federal agencies use body language analysis as a forensics technique, claiming that these tools can help assess people’s intentions or truthfulness. Body language consultants, an Intercept investigation reported in August, have trained federal and local “law enforcement across the country.”
Psychologists and other researchers agree that body language can convey certain emotional states. But many bold claims have not been backed by scientific evidence, such as the idea that a single gesture reliably indicates what a person thinks or desires: that maintaining eye contact for too long means a person is lying, that a smile without crinkles around the eyes isn’t a genuine one, or that a pointed finger with a closed hand is a display of dominance.
“Nonverbal communication in politics is extremely important because it creates impressions among the public, and this can influence whether people trust a politician,” says Vincent Denault, a communication researcher at the University of Montreal.
But when it comes to pundits commenting about body language in the media, “what you see is often more entertainment than science,” he says. “It can contribute to misinformation.”
Modern research on body language—often called nonverbal behavior—began in the 1960s and ’70s with studies that aimed to demonstrate the universality of facial expressions of emotion. That work was inspired, in part, by Charles Darwin’s neglected study from a century earlier, “The Expression of Emotions in Man and Animals,” according to David Matsumoto, a San Francisco State University psychologist and director of Humintell, a company that provides body language trainings and does research for companies and government agencies.
Since then, researchers have examined how parts of the brain seemingly react to particular facial expressions, and how infants begin to imitate facial and hand gestures. But scientists have also mapped the complexities and subtleties of body language, which can sometimes be challenging to decipher despite its ubiquity.
For researchers like Denault, the scope of nonverbal communication has expanded to include anything beyond a person’s spoken words. A speaker might make an impression by shrugging their shoulders, scratching their nose, tapping their foot, rolling their eyes, or wiping sweat off their face, as Richard Nixon famously did in one of his 1960 presidential election debates against John F. Kennedy. A person’s clothes, their Zoom background, and their tone, pauses, and “uhs” and “ums” while speaking all count as nonverbal cues that can shape a viewer’s perceptions.
While many experts caution that body language is complex and context-dependent, for years a small class of consultants and specialists have been applying body language research in myriad scenarios, including career coaching, work presentations, and airport screenings.
“I help people influence and persuade others around how trustworthy and credible their message is by helping them with their specific nonverbal communication,” says Mark Bowden, a body language consultant and author of the book Winning Body Language, a guide for corporate and political clients. He focuses on where a person faces their body and how much space they take up, as well as their gestures.
Some analysts also claim to be able to use those signals to interpret hidden motivations and emotions. For example, some news stories feature analysts explaining that the positioning of Donald Trump’s hands during speeches indicates that he believes in what he’s saying, or that when people touch their faces it’s a clear sign of nervousness.
But, Denault said, “associating ‘states of mind’ to specific gestures, or concluding that this gesture will have this effect on the public, without any nuance, is dubious.”
Still, analysts like Bowden and Joe Navarro, a former FBI agent and the author of What Every Body is Saying, a book about interpreting nonverbal behavior, have made careers in part out of those kinds of insights.
Navarro, who has analyzed politicians’ body language for Politico and written for CNBC about how to read the body language of someone wearing a protective mask during the COVID-19 pandemic, says that he has a particular method for assessing speakers like the presidential candidates. “I record it and then watch it with the sound off,” he said. “I look for behavior that stands out: these discomfort displays, the furrowing of the forehead and the glabella, the area between the eyes, or the pursing of the lips or the ventilating by pulling their shirt collar.” As an example, he argues that it’s easy to spot Donald Trump’s lip movements when he reacts to a question he apparently doesn’t like.
While the work of Navarro and other analysts can attract large audiences, many experts are unsure whether their methods are as reliable as claimed.
“Our facial expressions convey certain types of emotional states,” Matsumoto says. So do some motions, like a shrug. “But there’s a lot of noise, too,” he says. “People do all kinds of things with their bodies.” For example, a person’s raised eyebrow could express disbelief—but it might also signal discomfort or surprise. The same hand gesture could mean different things in different cultures.
Denault and Matsumoto are both skeptical of those making strong conclusions based on body language observations. Because of all the ambiguities, even perceptive observers can’t infer a person’s thoughts or intentions based on their nonverbal behavior alone, Denault argues.
Dawn Sweet, a University of Idaho communication researcher, agrees. “There’s not likely to be a single behavior diagnostic ever to be found” for someone lying or acting aggressively, she says.
Sweet and her fellow researchers often look at a person’s body language and spoken words together, since they’re usually communicating the same things. The researchers also examine the context of a person’s behavior and learn more about the speaker, since it matters if the behavior is typical for them or a deviation.
Sweet cites an earlier analysis of dozens of studies involving more than 1,300 estimates of 158 possible signs of deception. These studies focused on body language cues that people sometimes associate with lying, like fidgeting or avoiding eye contact. The studies found that cues like these have either no links or only weak links to lying. No one has a giveaway like Pinocchio and his nose.
For that reason, some researchers, like California State University, Fullerton psychologist Iris Blandón-Gitlin, simply avoid looking at such nonverbal cues altogether. “My research is focused mostly on understanding what people are saying,” she says. In general, she finds that lying takes effort, and liars tend to tell more simplistic stories, with fewer details.
Asked about these kinds of concerns, Navarro defends his methods. “Nonverbals are quicker to observe, and they’re authentic and very accurate,” he says. He points to the role of body language in understanding what a baby is feeling before it’s able to talk, and in whether one feels safe in the presence of potentially threatening behavior. People even pick mates based on nonverbal cues, he says. But he agrees that some kinds of behavior can be more reliably interpreted than others and that nonverbal behavior is not effective for conclusively detecting deception.
Despite these expert reservations, body language analysis has also been used in criminal cases, with police, federal agents, and prosecutors using the techniques to try to determine whether a suspect is telling the truth, or whether someone convicted of a crime feels remorse.
But, like many other kinds of forensic science, body language analysis has been shown to be unreliable. The technique could unjustly sway judges and jurors in trials, says Denault, who describes some of these judgments as pseudoscience. Unsupported claims about body language, he says, may seem to offer simple solutions to the complex challenge of evaluating testimony, but evidence-based research doesn’t really provide easy answers.
That said, if security and justice professionals and other officials focus on vetted findings that have scientific consensus, Denault argues that research on nonverbal behavior could still benefit them, for example, by helping police officers behave in a way that puts suspects at ease and helps build rapport.
Whether assessing the behavior of a politician or a suspect, Sweet cautions that people easily jump to conclusions that merely confirm their preconceptions. A person might look uncomfortable, nervous, or fearful at a given moment, but observers rarely know why. An observer might think they’re noticing a telling gesture that reveals information about what another person is thinking, when they’re really just finding a reason to justify an initial belief that the person is lying or aggressive.
Matsumoto warns people not to trust every media analyst they see or read who invokes body language. “There’s a lot of great information a person can get from nonverbals,” he says. “But you have to be careful.”