Systematic Reviews Open Access
Copyright ©The Author(s) 2015. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Psychiatr. Sep 22, 2015; 5(3): 342-351
Published online Sep 22, 2015. doi: 10.5498/wjp.v5.i3.342
Influence of gender in the recognition of basic facial expressions: A critical literature review
Larissa Forni-Santos, Flávia L Osório, Department of Neuroscience and Behavior, Ribeirão Preto Medical School - São Paulo University, Ribeirão Preto, São Paulo 14051-140, Brazil
Flávia L Osório, National Institute of Technology and Translational Medicine, Porto Alegre 90035-903, Brazil
Author contributions: Forni-Santos L and Osório FL contributed to the conception and design of the study and to the acquisition, analysis, and interpretation of data; drafted the article and critically revised it for important intellectual content; and approved the final version of the article to be published.
Supported by FAPESP - Fundação de Amparo à Pesquisa do Estado de São Paulo, No. 2012/02260-7.
Conflict-of-interest statement: The authors declare there is no conflict of interest related to this article.
Data sharing statement: Not applicable.
Open-Access: This article is an open-access article which was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
Correspondence to: Flávia L Osório, PhD, Department of Neuroscience and Behavior, Ribeirão Preto Medical School - São Paulo University, Avenida Bandeirantes 3900, 3 andar, Monte Alegre, Ribeirão Preto, São Paulo 14051-140, Brazil. flaliosorio@ig.com.br
Telephone: +55-16-36022530 Fax: +55-16-36022703
Received: February 23, 2015
Peer-review started: February 25, 2015
First decision: May 14, 2015
Revised: May 26, 2015
Accepted: August 4, 2015
Article in press: August 7, 2015
Published online: September 22, 2015

Abstract

AIM: To conduct a systematic literature review about the influence of gender on the recognition of facial expressions of six basic emotions.

METHODS: We performed a systematic search with the terms (face OR facial) AND (processing OR recognition OR perception) AND (emotional OR emotion) AND (gender OR sex) in the PubMed, PsycINFO, LILACS, and SciELO electronic databases for articles assessing outcomes related to response accuracy, response latency, and emotional intensity. Article selection followed the parameters set by the Cochrane Collaboration. The reference lists of the articles found through the database search were checked for additional references of interest.

RESULTS: With respect to accuracy, women tend to perform better than men when all emotions are considered as a set. Regarding specific emotions, there seem to be no gender-related differences in the recognition of happiness, whereas results are quite heterogeneous for the remaining emotions, especially sadness, anger, and disgust. Fewer articles dealt with the parameters of response latency and emotional intensity, which hinders the generalization of their findings, especially given their methodological differences.

CONCLUSION: The analysis of the studies conducted to date does not allow for definite conclusions concerning the role of the observer’s gender in the recognition of facial emotion, mostly because of the absence of standardized methods of investigation.

Key Words: Facial, Face, Perception, Recognition, Sex, Expression

Core tip: In this systematic review, we found that results on the influence of the observer’s gender on the recognition of basic facial expressions of emotion, examined in terms of accuracy, latency, and emotional intensity, are inconclusive, despite a small tendency for women to perform better than men in general emotion recognition. This can be partly explained by the wide variation in the methods used in the studies. We highlight the need for standardized procedures in facial emotion recognition tasks; otherwise, inconsistencies in the final results of these studies will persist.



INTRODUCTION

The recognition of facial expressions has been a focus of research since the 19th century, when, in 1872, Darwin[1] published his book “The Expression of the Emotions in Man and Animals”.

Emotion recognition is central to successful social interactions, since it is fundamental for correctly identifying signs related to the emotional states of one’s counterparts. Such signs consist mostly of non-verbal behavior, including gestures and facial expressions[2], the latter regarded as the main form of emotional communication[3].

In 1971, Ekman and Friesen[4] proposed that facial displays of happiness, sadness, disgust, anger, surprise, and fear are universal, as they appear across many cultural backgrounds, and they were thus named “basic expressions”.

Facial emotion recognition is a complex task, involving several elaborate processes from a neurobiological point of view. The brain regions that play a role in facial emotion recognition must first execute a perceptual process, identifying the geometric configuration and features of the observed face so that different stimuli can be discriminated on the basis of their appearance[5].

In addition, the emotional meaning of a given face must be attributed through the identification of the specific signs of each emotion. The occipitotemporal cortex, the amygdala, and the orbitofrontal cortex take part in this process[5].

It has also been established that the ability to distinguish and interpret facial displays is dependent on individual experiences and learning, although there is no consensus in regard to the mechanisms implicated in the perception and categorization of facial stimuli; that is, about whether these two processes are biologically determined or acquired through experience[6].

The expansion of research assessing facial recognition abilities has provided evidence suggesting that the observer’s characteristics affect the final results of facial emotion recognition tasks. Among these characteristics, gender stands out, with relevance in clinical settings and especially in research, as it is used as a reference variable.

A number of studies investigated the influence of gender on the accuracy and response latency in facial emotion recognition tasks, as well as differences in the neurobiological functioning of men and women during the performance of such tasks[7]. Results have been inconsistent to date, however, given the diversity of findings. Furthermore, there are no systematic reviews available dealing with evidence produced in this field.

Our objective was to conduct a systematic review of the indexed literature on the influence of gender on the recognition of facial expressions of the six basic emotions, based on the outcome variables accuracy, response latency, and emotional intensity, in addition to critically examining the methodology used in the studies.

MATERIALS AND METHODS

We performed a systematic search with the terms (face OR facial) AND (processing OR recognition OR perception) AND (emotional OR emotion) AND (gender OR sex) in the PubMed, PsycINFO, LILACS, and SciELO electronic databases for articles assessing outcomes related to response accuracy, response latency, and emotional intensity. Article selection followed the parameters set by the Cochrane Collaboration. The reference lists of the articles found through the database search were checked for additional references of interest. The number of articles found through the database and hand searches, the inclusion and exclusion criteria, and the total number of references included in the review are shown in Figure 1.
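
A Boolean query of this kind can also be scripted against the PubMed programmatic interface, which helps make the search reproducible. The snippet below is a minimal sketch assuming the Biopython package; the query string mirrors the terms above, while the e-mail address and the retmax limit are illustrative placeholders rather than values used in this review.

from Bio import Entrez

# NCBI asks users of its E-utilities to identify themselves; placeholder address
Entrez.email = "researcher@example.org"

query = ("(face OR facial) AND (processing OR recognition OR perception) "
         "AND (emotional OR emotion) AND (gender OR sex)")

# Query PubMed and retrieve up to 500 record identifiers (PMIDs)
handle = Entrez.esearch(db="pubmed", term=query, retmax=500)
record = Entrez.read(handle)
handle.close()

print("Records found:", record["Count"])
print("First PMIDs:", record["IdList"][:10])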

Figure 1 Flowchart describing the inclusion and exclusion of references found. FER: Facial emotion recognition.
RESULTS
Sampling and methodological aspects

As seen in Figure 1, 32 articles were included in the review following a thorough analysis by two psychologists with research experience in the field. The decade with the most published articles was the 2000s (31.25%, n = 10). Around 37% of the studies (n = 12) were performed in the United States. The main characteristics of the samples and methodologies used in the studies are described in Table 1.

Table 1 Main sampling and methodological aspects of studies included in this review.
Ref. | Country | Year | Sample type | N (/) | Age (mean) | Stimuli/color/presentation | Emotions | Stimuli presentation time | Outcome variable
Gitter et al[8] | United States | 1972 | U | 160 (NA/NA) | NA | DS/BW/static | H, F, D, A, S, SD | NA | Ac
Zuckerman et al[9] | United States | 1975 | U | 101 (64/37) | NA | DS/color/static | H, F, D, A, S, SD | 2 s | Ac
Fujita et al[10] | United States | 1980 | U | 24 (12/12) | NA | DS/BW/static | H, D, S, SD | 10 s | Ac
Kirouac et al[11] | Canada | 1983 | U | 34 (16/18) | NA | Ekman-Friesen/BW/static | H, F, D, A, S, SD | 10 s | Ac
Babchuk et al[12] | United States | 1985 | U | 40 (20/20) | 25.65 | Dr Carroll Izard/BW/static | H, F, D, A, SD | No limit | Ac, L
Kirouac et al[13] | Canada | 1985 | GP | 300 (150/150) | 26.20 | Ekman-Friesen/BW/static | H, F, D, A, S, SD | 10 s | Ac
Mandal et al[14] | India | 1985 | U | 150 (75/75) | 26.50 | Ekman-Friesen/BW/static | H, F, D, A, S, SD | 0.25 s, 0.5 s, 1 s | Ac
Wagner et al[15] | England | 1986 | U | 53 (15/38) | 21.50 | DS/BW/static | H, F, D, A, S, SD | 5 s | Ac
Nowicki et al[16] | United States | 1987 | U | 107 (49/58) | NA | Ekman-Friesen/BW/static | H, F, D, A, S, SD | 0.06 s | Ac
Rotter et al[17] | United States | 1988 | U | (1) 679 (241/438); (2) 399 (162/237) | (1) 20.00; (2) NA | DS/color/static | F, D, A, SD | NA | Ac
Mufson et al[18] | United States | 1991 | U | 275 (105/170) | NA | Ekman-Friesen/BW/static | H, F, D, A, S, SD | 0.05 s | Ac
Erwin et al[19] | United States | 1992 | GP | (1) 39 (24/15); (2) 20 (10/10) | 31.80 | DS/BW/static | H, SD | 7 s | Ac
Duhaney et al[20] | United States | 1993 | U | 30 (15/15) | NA | Ekman-Friesen/BW/static | H, F, D, A, S, SD | 10 s | Ac, I
Hess et al[21] | Canada | 1997 | U | 24 (12/12) | 18.97 | Matsumoto-Ekman/BW/dynamic | H, D, A, SD | 5 s | Ac
Thayer et al[22] | Norway | 2000 | U | 44 (16/28) | 23 | Ekman-Friesen/BW/static | H, D, F, A, S, SD | 6 s | Ac
Oyuela-Vargas et al[23] | Colombia | 2003 | U | 60 (30/30) | 21.5 | Ekman-Friesen/BW/static | H, A, SD | 2 s | Ac, L
Grimshaw et al[24] | United States and Canada | 2004 | U | 73 (36/37) | NA | Ekman-Friesen/BW/static | H, F, A, SD | 0.05 s | Ac, L
Hall et al[25] | United States | 2004 | U | (1) 96 (69/27); (2) 363 (126/237) | NA | Matsumoto-Ekman/BW/static and dynamic | H, F, D, A, S, SD | 10 s/0.20 s (maximum) | Ac
Rahman et al[26] | England | 2004 | GP | 240 (120/120) | 29 | Ekman-Friesen/BW/static | H, SD | No limit | Ac, L
Palermo et al[27] | Australia | 2004 | GP | 24 (12/12) | 24.5 | Ekman-Friesen; Gur et al[51]; NimStim; Watson; Mazurski-Bond/BW/static | H, F, D, A, S, SD | No limit | Ac, L
Montagne et al[28] | The Netherlands | 2005 | U | 68 (28/40) | 22.45 | DS/BW/dynamic | H, F, D, A, S, SD | NA | Ac, I
Biele et al[29] | Poland | 2006 | U | 38 (14/24) | 22 | MSFDE/BW/dynamic and static | H, A | 1.5 s (static and dynamic) | I
Hampson et al[30] | Canada | 2006 | U | 62 (31/31) | 20.77 | Ekman-Friesen/BW/dynamic | H, F, D, A, SD | No limit | Ac, L
Williams et al[31] | Australia | 2009 | GP | 728 (329/339) | 20-91 | Gur et al[51]/color/static | H, F, D, A, S, SD | 2 s | Ac, L
Collignon et al[32] | Canada | 2010 | U | 46 (23/23) | 24.8 | DS/color/dynamic | F, D | 0.5 s | Ac
Hoffmann et al[33] | Germany | 2010 | U | (1) 133 (58/75); (2) 186 (70/116) | 22 | Matsumoto-Ekman/color/static | H, F, D, A, S, SD | 0.3 s | Ac, I
Scherer et al[34] | Switzerland | 2011 | (1) GP; (2) U | (1) 7158 (5358/1800); (2) 72 (9/63) | NA | (1) Ekman-Friesen/BW/static; (2) DS/BW/dynamic | (1) H, F, D, A, SD; (2) H, D, SD | (1) 3 s; (2) NA | Ac
Donges et al[35] | Germany | 2012 | GP | 81 (28/53) | 25.25 | Facial Emotion Discrimination Test/BW/static | H, SD | 0.033 s | Ac, L
Weisenbach et al[36] | United States | 2012 | GP | 138 (75/63) | 32.91 | Ekman-Friesen/BW/static | H, F, A, SD | 0.3 s | Ac
Pinto et al[37] | Brazil | 2013 | U | 120 (60/60) | NA | Ekman-Friesen/BW/dynamic | H, F, D, A, S, SD | 0.5 s | Ac
Wang[38] | China | 2013 | U | 93 (48/45) | 18.81 | Ekman-Friesen/BW/static | H, A | 2 s | Ac
Kessels et al[39] | Norway, Australia, Ireland and Germany | 2013 | GP | 210 (85/125) | 18-75 | ERT/BW/dynamic | H, F, D, A, S, SD | NA | Ac
U: University students; GP: General population; DS: Stimulus set developed by the study’s authors; BW: Black and white; NA: Not available; H: Happiness; SD: Sadness; F: Fear; D: Disgust; A: Anger; S: Surprise; Ac: Accuracy; L: Response latency; I: Emotional intensity.

As one of the inclusion criteria for this review was the enrollment of non-clinical samples, around 75% (n = 24) of the studies involved samples of university students. Samples included a variable number of subjects (mean: 331, median: 93) with a mean age of 24 years. The homogeneity of sociodemographic variables between groups of men and women was ensured in 56.25% of the studies (n = 18)[12,13,19-24,26-28,30-33,35,36,39].

Most articles (56.25%; n = 18) provided no information concerning inclusion and exclusion criteria[8-11,14-18,20,21,23,25,27,33,34,36,37]. Among the inclusion/exclusion criteria described in the articles, the most common were presence/absence of psychiatric (24%) and neurological (21%) disorders, and use of psychotropic and/or illicit drugs (17%).

With respect to methodological aspects, Table 1 shows that 24 of the 32 studies included in the review used standardized stimulus sets, the most frequent of which was the series by Ekman and Friesen[40] (66%, n = 16). Another eight studies (25%) used their own sets of stimuli. Black and white stimuli (n = 27) were more common than colored stimuli (n = 5). Only one study[31] used a standardized procedure; the remaining investigations adapted procedures according to their objectives, resulting in large diversity, with methodological details not always available in the articles, which hampered interpretation.

The study by Williams et al[31] was also the only investigation that used the Internet for data collection, whereas the remaining studies used face-to-face stimuli presentations.

The number of stimuli displayed in the tasks used across the studies ranged from six to 336 (mean: 171, median: 58); stimuli were mostly presented in random order (59.37%, n = 19)[8,9,12,17,20-23,25-30,32,33,35,38] and statically (81.25%, n = 26).

Only eight studies (25%) used morphing techniques in the composition of their stimuli[10,20,28,29,32,33,37,39]; such techniques allow pictures to be manipulated so that facial emotions can be displayed at different intensities.
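
Purely as an illustration of the idea of intensity-graded stimuli, the sketch below approximates the effect by pixel-wise blending between a neutral photograph and a full-intensity expression of the same actor. It assumes the Pillow library and hypothetical image files, and it deliberately omits the landmark-based geometric warping that dedicated morphing software performs.

from PIL import Image

# Hypothetical input files: same actor and pose, neutral vs. full-intensity expression
# (Image.blend requires both images to have the same size and mode)
neutral = Image.open("actor01_neutral.png").convert("RGB")
happy = Image.open("actor01_happy_100.png").convert("RGB")

# Generate stimuli at 25%, 50%, 75%, and 100% emotional intensity by linear blending
for intensity in (0.25, 0.50, 0.75, 1.00):
    morphed = Image.blend(neutral, happy, alpha=intensity)
    morphed.save(f"actor01_happy_{int(intensity * 100):03d}.png")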

The number of actors photographed to compose the stimuli sets varied from two to 10 and all sets included male and female models. Most actors were Caucasian (71.87%, n = 23)[8,11,13,14,16-27,30,33,34,37-39] and adults (78.12%, n = 25)[9-11,13-27,29,30,33,34,36-38].

With respect to the emotions studied, the most frequent were happiness, assessed in 93.75% of the articles reviewed (n = 30), and sadness, assessed in 90.62% of the studies (n = 29). Surprise was the least frequently assessed emotion (56.25%, n = 18), and on average studies included displays of five facial emotions.

As also shown in Table 1, the stimulus presentation time was reported in 28 studies (87.50%), whereas response latency was measured in only 11 (34.37%). In most studies, the presentation time was set in advance and was not under the subjects’ control.

Summarizing the outcome variables analyzed in this review in accordance with our inclusion criteria, response accuracy was assessed in 31 studies, response latency in eight studies, and gender differences in relation to the intensity of displayed emotions in four studies.

Accuracy

The accuracy of emotional judgments has been studied in terms of general emotional recognition and of the recognition of specific emotions. Table 2 presents the main findings related to the accuracy of emotional judgments.

Table 2 Main results related to the variables accuracy, response latency, and emotional intensity.
Variable investigated | Emotion | F > M | F = M | F < M
Accuracy (n = 31) | Total score (n = 26) | n = 16[8-11,14,16-18,22,25,28,32-34,38,39] | n = 10[10,15,20,23-27,31,33] | -
| Happiness (n = 24) | n = 4[21,25,33,35] | n = 19[8,12,14-16,18-20,23-28,30,31,33,36,39] | n = 1[37]
| Sadness (n = 26) | n = 12[8,14,16,17,25,28,31-34,37,39] | n = 11[12,15,18,20,23,24,26,27,30,35,36] | n = 3[19,21,33]
| Anger (n = 21) | n = 6[8,12,25,29,30,33] | n = 12[16,18,20,21,23,24,27,28,31,33,36-38] | n = 3[14,15,17]
| Disgust (n = 19) | n = 6[17,18,25,30,33,34] | n = 12[8,12,14-16,20,27,28,31,33,37,39] | n = 1[21]
| Fear (n = 20) | n = 10[8,12,17,18,25,31-33,36,39] | n = 10[14-16,20,24,27,28,30,33,37] | -
| Surprise (n = 16) | n = 6[8,12,16,25,28,33] | n = 10[14-16,18,20,27,31,33,37,39] | -
Response latency (n = 8) | Total score (n = 8) | - | n = 5[14,23,24,27,35] | n = 3[12,26,31]
| Happiness (n = 1) | - | n = 1[23] | n = 1[26]
| Sadness (n = 2) | - | - | n = 2[26,30]
| Anger (n = 2) | - | n = 1[23] | n = 1[30]
| Disgust (n = 1) | - | - | n = 1[30]
| Fear (n = 1) | - | - | n = 1[30]
Emotional intensity (n = 4) | Total score (n = 2) | - | n = 2[20,33] | -
| Anger (n = 2) | n = 2[28,29] | - | -
| Disgust (n = 1) | n = 1[28] | - | -
F: Female observers; M: Male observers.

As seen in Table 2, 26 studies assessed accuracy for the full set of emotions displayed, and the stimulus sets varied across studies (Table 1). Of these, 16 (around two-thirds) reported that women performed better than men in the correct identification of emotions, at a minimum significance level of P ≤ 0.01. In the remaining studies, no such differences were found.
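
The studies tallied here used a variety of statistical analyses; purely to illustrate the kind of comparison being counted, the sketch below contrasts overall recognition accuracy between two groups of observers with an independent-samples t-test. The accuracy vectors are invented for the example and do not come from any of the reviewed studies.

import numpy as np
from scipy import stats

# Hypothetical proportion-correct scores over the full stimulus set, one value per participant
accuracy_women = np.array([0.82, 0.79, 0.88, 0.75, 0.84, 0.90, 0.77, 0.81])
accuracy_men = np.array([0.74, 0.80, 0.71, 0.78, 0.69, 0.83, 0.72, 0.76])

# Independent-samples t-test with Welch's correction for unequal variances
t_stat, p_value = stats.ttest_ind(accuracy_women, accuracy_men, equal_var=False)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")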

These studies did not share common methodological designs, whether we consider this group as a whole (n = 26) or the groups of articles that found or failed to find gender-related differences. Thus, there seems to be no direct influence of methodological variables on the final results obtained.

With respect to specific emotions, results are rather heterogeneous. The only emotion for which a marked pattern was found was happiness, regardless of the methodology used: in 79.16% of the studies (n = 19), men and women did not differ significantly in their accuracy in recognizing happiness.

In regard to the recognition of sadness, women tended to be more accurate than men, as approximately half of the articles reviewed described a significant difference in favor of women (46.15%, n = 12). Men and women tended to perform similarly in the recognition of surprise, anger, and disgust (surprise: 62.50%, n = 10; anger: 57.14%, n = 12; disgust: 63.15%, n = 12). As for fear, half of the studies found significant differences in favor of women (50%, n = 10), whereas accuracy was the same for both genders in the other half (50%, n = 10).

It should be noted that some studies, although few, described greater accuracy for men in the recognition of happiness (4.34%, n = 1), anger (14.28%, n = 3), sadness (11.53%, n = 3), and disgust (4.34%, n = 1).

Considering the findings and the methodological aspects of the studies, we can infer that, when specific emotions are examined, the studies that found no statistically significant differences between men and women shared some methodological characteristics. Common aspects were: (1) use of the Ekman and Friesen[40] stimulus series, with black and white pictures presented statically; (2) sample matching procedures for the number of subjects included in each group; (3) organization of samples so as to ensure the homogeneity of sociodemographic characteristics across groups of men and women; and (4) strict selection criteria.

Response latency

Only eight studies investigated the subjects’ response time for the recognition of facial emotions. Their results are presented in Table 2.

All eight studies assessed their full sets of stimuli, with heterogeneous results regardless of the emotions represented in each set. It is of note that this group of studies was more methodologically homogeneous than the studies that assessed accuracy, regardless of their results.

Specific emotions were assessed in a very limited number of studies (n = 3), which makes it impossible to draw even partial conclusions.

Emotional intensity

A small number of studies (n = 4) investigated the effects of emotional intensity on the recognition of facial expressions by men and women and, as Table 2 shows, their results remain speculative, especially given the variety of study designs.

DISCUSSION

Several investigations have attempted to elucidate whether genders differ in the recognition of facial emotion and, if so, why. Although hypotheses have been raised, there is no consensus on a definitive answer to this question.

One major line of thought invokes evolutionary differences, together with the cultural attribution of roles to men and women according to gender[41]. For instance, men would be more prone to recognize anger because boys are encouraged to manifest aggressive behavior[42].

Technological advances have enabled the investigation of differences in neurobiological processing during the recognition of facial expressions. In a literature review, Fusar-Poli et al[7] concluded that men and women tend to present activation in distinct brain areas during emotion recognition, with men displaying greater activation in the right medial frontal and hippocampal gyri, left fusiform gyrus, and amygdala, whereas women would show greater activation in the right subcallosal gyrus.

In accordance with the objective of this review, we included studies with the specific aim of assessing gender differences in respect to the accuracy, response latency, and intensity of emotion in the recognition of facial expressions of at least one of the six basic emotions in adult, non-clinical samples. Therefore, we did not include articles focusing on other sociodemographic or cultural aspects, neurobiological components, hormonal issues, influence of psychoactive substances or studies involving clinical samples and/or children and adolescents, even when they provided indirect data on the influence of gender in emotion recognition.

With respect to accuracy, the studies reviewed show that women tend to be more accurate than men in the recognition of emotions in general. However, we observed that the studies that failed to find differences between men and women were more rigorous and homogeneous in terms of the methodology adopted and sample selection.

For specific emotions, in turn, no common pattern was observed, except for facial displays of happiness, as most studies found no difference between groups regardless of their design. One possible explanation for this refers to the peculiarities of the recognition of happiness, considered to be the most easily recognized emotion and for which accuracy levels often present a ceiling effect for both men and women[19].

Based on the data described above, it can be inferred that the results found in this review do not allow for conclusive statements, since findings were quite heterogeneous concerning all the outcome variables examined, with the additional problem of the small number of articles that assessed response latency and emotional intensity. It should be noted, however, that there is a small tendency for women to be more accurate in the general recognition of facial emotions, but no conclusions can be reached in respect to specific emotions.

Theoretically, hypotheses to explain possible differences between genders in terms of accuracy in emotional recognition are based on cultural and evolutionary aspects. Historically, women have been in charge of child care, especially during the pre-verbal stage of development, and would thus be required to develop abilities to recognize emotional displays and potential threat to their offspring. Accordingly, women would be more stimulated to recognize different emotions, assigning increased importance to the recognition of mental states in others with the purpose of facilitating communication, strengthening affective bonds, and protecting their social group[41].

As for men, greater importance would be attributed to the recognition of aggressive stimuli, as these could be indicative of threat posed by competitors in the same social environment and would therefore have great adaptive value, ensuring the maintenance of leadership within the group[41]. This could explain the results of studies in which men showed an increased capacity to recognize anger.

With the development of research techniques that involve the recognition of facial emotion, additional hypotheses were raised to explain gender differences based on the particularities of brain functioning. The results of a literature review[7] suggest that maturation processes play a significant role in the way men and women recognize emotion, mostly because of the action of male and female sex hormones, which affect even the activation of certain brain regions during research procedures.

As described above, the existing literature indicates the possibility that there are differences in facial emotion recognition between men and women; however, this result was not evident in our review. One hypothesis to explain this divergence is the variety of methodologies used in the studies, and as long as a common method is not adopted, it will remain difficult to conclude for or against such a difference. Below is a description of the main methodological aspects that can interfere with the results of studies in the field.

As mentioned earlier, the literature indicates that certain characteristics of the observer, outlined below, can influence performance in facial emotion recognition tasks. Thus, the selection and composition of samples may affect the final results of studies if these characteristics are not taken into account.

Of the 32 articles reviewed here, around half provided no information concerning their inclusion/exclusion criteria, which would be important to lend greater reliability to their findings, since it is known, for example, that the presence of mental[43] and neurological disorders[44] and cognitive deficits[45] can directly affect facial recognition.

Another factor that should be considered is the ethnicity of viewers and actors, since the recognition of facial emotion is facilitated when observer and actor belong to the same ethnic group, as specificities exist within ethnic groups despite emotions being universal[46].

In the studies reviewed here, there was a predominance of Caucasians among both observers and actors. However, ideal tasks should include stimuli with actors of different ethnicities, which would confer greater ecological validity to the studies. The composition of the sample should also be carefully considered, with the inclusion of subjects from different ethnic groups or the establishment of bias control measures.

The same applies to the age of respondents, which is known to affect the recognition of facial expressions[47]. In many of the studies included in this review, this information was absent from the sample descriptions. Among the articles that provided this information, most samples consisted of young adults. As with ethnicity, the inclusion of subjects from different age ranges strengthens results, as the experimental situation gets closer to real life. When age variation is impossible, this variable must be controlled for.

The predominance of Caucasian ethnicity and adult age in the pictures used as stimuli is due to the fact that the most commonly used stimulus series was that of Ekman and Friesen[40], renowned and validated throughout the world despite its limitations, such as the exclusive use of Caucasian adult actors in black and white pictures.

Although most of the tasks in the studies used static stimuli, the remaining characteristics of the procedures employed were quite heterogeneous, including the stimuli presentation time, number of images used, and emotions represented in the stimuli sets.

A relevant example of methodological variation refers to the time of stimulus presentation, since the longer a subject is allowed to make a judgment, the greater the accuracy tends to be[14]. The use of standardized procedures would avoid this bias across studies, allowing more reliable comparisons of findings.

In an attempt to increase the ecological validity of facial emotion recognition tasks, recent investigations have used colored images and dynamic stimuli presentation. This trend is mostly a result of technological developments that allow the manipulation of photographs in order to make them look closer to real-life social situations[48].

Another possibility that is beginning to be explored is the administration of tasks over the Internet, which allows the enrollment of larger samples with greater sociocultural diversity. It should be noted that these procedures must be validated so that the results obtained through these means are indeed reliable and contribute to research on facial emotion recognition.

We conclude that findings related to the influence of the observers’ gender on the recognition of basic facial expressions of emotion as examined in respect to accuracy, latency, and emotional intensity are inconclusive, despite a small tendency for women to perform better than men in general emotion recognition.

This can be partly explained by the wide variation in the methods used in the studies, especially in a field where a number of variables are known to affect performance in the tasks, including age, gender, and ethnicity of respondents and actors depicted in stimuli sets; time of exposure to the stimuli; presentation mode (static or dynamic); and stimuli colors[49,50].

This overview highlights the need for standardized procedures in facial emotion recognition tasks that take into account the influence of variables whose effects have already been described in the literature. For many researchers in this area, the proposition of an ideal procedure is illusory. We believe, however, that variables such as the form, presentation time, and intensity of the stimulus can be standardized, and that sociodemographic and clinical characteristics of the sample (such as age and intellectual capacity) must be controlled. Otherwise, inconsistencies in the final results of studies will continue to exist.

COMMENTS
Background

Emotion recognition is central for successful social interactions, since it is fundamental for one to be able to correctly identify signs related to the emotional states of his counterparts. It has also been established that the ability to distinguish and interpret facial displays is dependent on individual experiences and learning, although there is no consensus in regard to the mechanisms implicated in the perception and categorization of facial stimuli; that is, about whether these two processes are biologically determined or acquired through experience.

Research frontiers

The expansion of research assessing facial recognition abilities has provided evidence suggesting that the observer’s characteristics affect the final results of facial emotion recognition tasks. Among these characteristics, gender stands out, with relevance in clinical settings and especially in research, as it is used as a reference variable.

Innovations and breakthroughs

A number of studies have investigated the influence of gender on accuracy and response latency in facial emotion recognition tasks. Results have been inconsistent to date, however, given the diversity of findings. Furthermore, there are no systematic reviews available dealing with evidence produced in this field. The authors conducted a systematic review of the indexed literature on the influence of gender on the recognition of facial expressions of the six basic emotions, based on the outcome variables accuracy, response latency, and emotional intensity, in addition to critically examining the methodology used in the studies.

Applications

The findings related to the influence of the observers’ gender on the recognition of basic facial expressions of emotion, examined with respect to accuracy, latency, and emotional intensity, are inconclusive, despite a small tendency for women to perform better than men in general emotion recognition. This can be partly explained by the wide variation in the methods used in the studies. There is a need for standardized procedures in facial emotion recognition tasks; otherwise, inconsistencies in the final results of studies will continue to exist.

Terminology

Stimuli in facial expression recognition: the material through which subjects access facial recognition tasks, namely photographs of actors expressing various emotions; Facial expression recognition task: the way in which these stimuli are presented to subjects.

Peer-review

This systematic review deals with a relevant issue such as the influence of gender in the recognition of facial expressions. Authors’ main findings are the lack of homogeneity among the studies and their serious methodological flaws. Data are therefore inconclusive and this fact strongly suggests the need for further studies with improved and standardized procedures. The manuscript is easy to read, and results and comments are well exposed throughout the paper.

Footnotes

P- Reviewer: Eduardo JA, Jun Y S- Editor: Ji FF L- Editor: A E- Editor: Wu HL

References
1. Darwin C. The expression of the emotions in man and animals. Chicago: University of Chicago Press; 1965 (Original work published 1872).
2. Lambrecht L, Kreifelts B, Wildgruber D. Age-related decrease in recognition of emotional facial and prosodic expressions. Emotion. 2012;12:529-539.
3. Mehrabian A. Communication without words. Psychol Today. 1968;2:53-56.
4. Ekman P, Friesen WV. Constants across cultures in the face and emotion. J Pers Soc Psychol. 1971;17:124-129.
5. Adolphs R. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev. 2002;1:21-62.
6. Pollak SD, Kistler DJ. Early experience is associated with the development of categorical representations for facial expressions of emotion. Proc Natl Acad Sci USA. 2002;99:9072-9076.
7. Fusar-Poli P, Placentino A, Carletti F, Landi P, Allen P, Surguladze S, Benedetti F, Abbamonte M, Gasparotti R, Barale F. Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. J Psychiatry Neurosci. 2009;34:418-432.
8. Gitter AG, Black H, Mostofsky D. Race and sex in the perception of emotion. J Soc Issues. 1972;28:63-78.
9. Zuckerman M, Lipets MS, Koivumaki JH, Rosenthal R. Encoding and decoding nonverbal cues of emotion. J Pers Soc Psychol. 1975;32:1068-1076.
10. Fujita BN, Harper RG, Wiens AN. Encoding-decoding of nonverbal emotional messages: sex differences in spontaneous and enacted expressions. J Nonverbal Behav. 1980;4:131-145.
11. Kirouac G, Doré FY. Accuracy and latency of judgment of facial expressions of emotions. Percept Mot Skills. 1983;57:683-686.
12. Babchuk WA, Hames RB, Thompson RA. Sex differences in the recognition of infant facial expressions of emotion: the primary caretaker hypothesis. Ethol Sociobiol. 1985;6:89-101.
13. Kirouac G, Dore FY. Accuracy of the judgment of facial expression of emotions as a function of sex and level of education. J Nonverbal Behav. 1985;9:3-7.
14. Mandal MK, Palchoudhury S. Perceptual skill in decoding facial affect. Percept Mot Skills. 1985;60:96-98.
15. Wagner HL, MacDonald CJ, Manstead AS. Communication of individual emotions by spontaneous facial expressions. J Pers Soc Psychol. 1985;50:737-743.
16. Nowicki S, Hartigan M. Accuracy of facial affect recognition as a function of locus of control orientation and anticipated interpersonal interaction. J Soc Psychol. 1988;128:363-372.
17. Rotter NG, Rotter GS. Sex differences in the encoding and decoding of negative facial emotions. J Nonverbal Behav. 1988;12:139-148.
18. Mufson L, Nowicki S. Factors affecting the accuracy of facial affect recognition. J Soc Psychol. 1991;131:815-822.
19. Erwin RJ, Gur RC, Gur RE, Skolnick B, Mawhinney-Hee M, Smailis J. Facial emotion discrimination: I. Task construction and behavioral findings in normal subjects. Psychiatry Res. 1992;42:231-240.
20. Duhaney A, McKelvie SJ. Gender differences in accuracy of identification and rated intensity of facial expressions. Percept Mot Skills. 1993;76:716-718.
21. Hess U, Blairy S, Kleck RE. The intensity of emotional facial expressions and decoding accuracy. J Nonverbal Behav. 1997;21:241-257.
22. Thayer JF, Johnsen BH. Sex differences in judgement of facial affect: a multivariate analysis of recognition errors. Scand J Psychol. 2000;41:243-246.
23. Oyuela-Vargas R, Pardo-Vélez CF. Gender differences in the recognition of facial expressions of emotion. Universitas Psychologica. 2003;2:151-168.
24. Grimshaw GM, Bulman-Fleming MB, Ngo C. A signal-detection analysis of sex differences in the perception of emotional faces. Brain Cogn. 2004;54:248-250.
25. Hall JA, Matsumoto D. Gender differences in judgments of multiple emotions from facial expressions. Emotion. 2004;4:201-206.
26. Rahman Q, Wilson GD, Abrahams S. Sex, sexual orientation, and identification of positive and negative facial affect. Brain Cogn. 2004;54:179-185.
27. Palermo R, Coltheart M. Photographs of facial expression: accuracy, response times, and ratings of intensity. Behav Res Methods Instrum Comput. 2004;36:634-638.
28. Montagne B, Kessels RP, Frigerio E, de Haan EH, Perrett DI. Sex differences in the perception of affective facial expressions: do men really lack emotional sensitivity? Cogn Process. 2005;6:136-141.
29. Biele C, Grabowska A. Sex differences in perception of emotion intensity in dynamic and static facial expressions. Exp Brain Res. 2006;171:1-6.
30. Hampson E, Van Anders SM, Mullin LI. A female advantage in the recognition of emotional facial expressions: test of an evolutionary hypothesis. Evol Hum Behav. 2006;27:401-416.
31. Williams LM, Mathersul D, Palmer DM, Gur RC, Gur RE, Gordon E. Explicit identification and implicit recognition of facial emotions: I. Age effects in males and females across 10 decades. J Clin Exp Neuropsychol. 2009;31:257-277.
32. Collignon O, Girard S, Gosselin F, Saint-Amour D, Lepore F, Lassonde M. Women process multisensory emotion expressions more efficiently than men. Neuropsychologia. 2010;48:220-225.
33. Hoffmann H, Kessler H, Eppel T, Rukavina S, Traue HC. Expression intensity, gender and facial emotion recognition: women recognize only subtle facial emotions better than men. Acta Psychol (Amst). 2010;135:278-283.
34. Scherer KR, Scherer U. Assessing the ability to recognize facial and vocal expressions of emotion: construction and validation of the Emotion Recognition Index. J Nonverbal Behav. 2011;35:305-326.
35. Donges US, Kersting A, Suslow T. Women’s greater ability to perceive happy facial emotion automatically: gender differences in affective priming. PLoS One. 2012;7:e41745.
36. Weisenbach SL, Rapport LJ, Briceno EM, Haase BD, Vederman AC, Bieliauskas LA, Welsh RC, Starkman MN, McInnis MG, Zubieta JK. Reduced emotion processing efficiency in healthy males relative to females. Soc Cogn Affect Neurosci. 2014;9:316-325.
37. Pinto BMC, Dutra NB, Filgueiras A, Juruena MFP, Stingel AN. Gender differences among undergraduates in the recognition of emotional facial expressions. Av Psicol Latinoam. 2013;31:1.
38. Wang B. Gender difference in recognition memory for neutral and emotional faces. Memory. 2013;21:991-1003.
39. Kessels RP, Montagne B, Hendriks AW, Perrett DI, de Haan EH. Assessment of perception of morphed facial expressions using the Emotion Recognition Task: normative data from healthy participants aged 8-75. J Neuropsychol. 2014;8:75-93.
40. Ekman P, Friesen WV. Pictures of facial affect. Palo Alto: Consulting Psychologists Press; 1976.
41. Kret ME, De Gelder B. A review on sex differences in processing emotional signals. Neuropsychologia. 2012;50:1211-1221.
42. Eron LD, Huesmann LR. The relation of prosocial behavior to the development of aggression and psychopathy. Aggress Behav. 1984;10:201-211.
43. Bourke C, Douglas K, Porter R. Processing of facial emotion expression in major depression: a review. Aust N Z J Psychiatry. 2010;44:681-696.
44. Meletti S, Benuzzi F, Cantalupo G, Rubboli G, Tassinari CA, Nichelli P. Facial emotion recognition impairment in chronic temporal lobe epilepsy. Epilepsia. 2009;50:1547-1559.
45. Hetzroni O, Oren B. Effects of intelligence level and place of residence on the ability of individuals with mental retardation to identify facial expressions. Res Dev Disabil. 2002;23:369-378.
46. Elfenbein HA, Ambady N. On the universality and cultural specificity of emotion recognition: a meta-analysis. Psychol Bull. 2002;128:203-235.
47. Ruffman T, Henry JD, Livingstone V, Phillips LH. A meta-analytic review of emotion recognition and aging: implications for neuropsychological models of aging. Neurosci Biobehav Rev. 2008;32:863-881.
48. Alves TR. Recognition of static and dynamic facial expressions: a study review. Estudos de Psicologia. 2013;18:125-130.
49. Horstmann G, Ansorge U. Visual search for facial expressions of emotions: a comparison of dynamic and static faces. Emotion. 2009;9:29-38.
50. Tanaka JW, Kiefer M, Bukach CM. A holistic account of the own-race effect in face recognition: evidence from a cross-cultural study. Cognition. 2004;93:B1-B9.
51. Gur RC, Sara R, Hagendoorn M, Marom O, Hughett P, Macy L, Turner T, Bajcsy R, Posner A, Gur RE. A method for obtaining 3-dimensional facial expressions and its standardization for use in neurocognitive studies. J Neurosci Methods. 2002;115:137-143.