Journal articles on the topic 'Emotional face'

Consult the top 50 journal articles for your research on the topic 'Emotional face.'

1

Föcker, Julia, and Brigitte Röder. "Event-Related Potentials Reveal Evidence for Late Integration of Emotional Prosody and Facial Expression in Dynamic Stimuli: An ERP Study." Multisensory Research 32, no. 6 (2019): 473–97. http://dx.doi.org/10.1163/22134808-20191332.

Abstract:
The aim of the present study was to test whether multisensory interactions of emotional signals are modulated by intermodal attention and emotional valence. Faces, voices and bimodal emotionally congruent or incongruent face–voice pairs were randomly presented. The EEG was recorded while participants were instructed to detect sad emotional expressions in either faces or voices while ignoring all stimuli with another emotional expression and sad stimuli of the task-irrelevant modality. Participants processed congruent sad face–voice pairs more efficiently than sad stimuli paired with an incongruent emotion, and performance was higher in congruent bimodal than in unimodal trials, irrespective of which modality was task-relevant. Event-related potentials (ERPs) to congruent emotional face–voice pairs started to differ from ERPs to incongruent emotional face–voice pairs at 180 ms after stimulus onset: irrespective of which modality was task-relevant, ERPs revealed a more pronounced positivity (180 ms post-stimulus) to emotionally congruent trials compared to emotionally incongruent trials if the angry emotion was presented in the attended modality. A larger negativity to incongruent compared to congruent trials was observed in the time range of 400–550 ms (N400) for all emotions (happy, neutral, angry), irrespective of whether faces or voices were task-relevant. These results suggest an automatic interaction of emotion-related information.
2

Peterson, Johnathan Caleb, Carly Jacobs, John Hibbing, and Kevin Smith. "In your face." Politics and the Life Sciences 37, no. 1 (2018): 53–67. http://dx.doi.org/10.1017/pls.2017.13.

Abstract:
Research suggests that people can accurately predict the political affiliations of others using only information extracted from the face. It is less clear from this research, however, what particular facial physiological processes or features communicate such information. Using a model of emotion developed in psychology that treats emotional expressivity as an individual-level trait, this article provides a theoretical account of why emotional expressivity may provide reliable signals of political orientation, and it tests the theory in four empirical studies. We find statistically significant liberal/conservative differences in self-reported emotional expressivity, in facial emotional expressivity measured physiologically, in the perceived emotional expressivity and ideology of political elites, and in an experiment that finds that more emotionally expressive faces are perceived as more liberal.
3

Zhao, Zikang, Yujia Zhang, Tianjun Wu, Hao Guo, and Yao Li. "Emotionally Controllable Talking Face Generation from an Arbitrary Emotional Portrait." Applied Sciences 12, no. 24 (December 14, 2022): 12852. http://dx.doi.org/10.3390/app122412852.

Abstract:
With the continuous development of cross-modality generation, audio-driven talking face generation has made substantial advances in terms of speech content and mouth shape, but existing research on talking face emotion generation is still relatively unsophisticated. In this work, we present Emotionally Controllable Talking Face Generation from an Arbitrary Emotional Portrait to synthesize lip-sync and an emotionally controllable high-quality talking face. Specifically, we take a facial reenactment perspective, using facial landmarks as an intermediate representation driving the expression generation of talking faces through the landmark features of an arbitrary emotional portrait. Meanwhile, decoupled design ideas are used to divide the model into three sub-networks to improve emotion control. They are the lip-sync landmark animation generation network, the emotional landmark animation generation network, and the landmark-to-animation translation network. The two landmark animation generation networks are responsible for generating content-related lip area landmarks and facial expression landmarks to correct the landmark sequences of the target portrait. Following this, the corrected landmark sequences and the target portrait are fed into the translation network to generate an emotionally controllable talking face. Our method controls the expressions of talking faces by driving the emotional portrait images while ensuring the generation of animated lip-sync, and can handle new audio and portraits not seen during training. A multi-perspective user study and extensive quantitative and qualitative evaluations demonstrate the superiority of the system in terms of visual emotion representation and video authenticity.
4

Schaarschmidt, Nadine, and Thomas Koehler. "Experiencing Emotions in Video-Mediated Psychological Counselling Versus to Face-to-Face Settings." Societies 11, no. 1 (March 11, 2021): 20. http://dx.doi.org/10.3390/soc11010020.

Abstract:
How does using video technology influence the emotional experience of communication in psychological counselling? In this paper, the experience of emotion, an essential factor in the communication between counsellor and client, is systematically compared for face-to-face and video formats. It is suggested that the research methodology for studying computer-mediated forms of communication links lab and (virtual) reality in an ideal way. Based on a sample of 27 cases, significant differences and their observed effect sizes are presented. The aim of this study is to investigate the emotional experience in direct and mediated interaction and thus to contribute to the systematic search for evidence as to whether and how the emotional experience in psychological counselling interviews changes during video-mediated transmission. The results suggest, among other findings, that negative emotions are more intense in the video format and positive emotions are intensified in the face-to-face format.
5

Homorogan, C., R. Adam, R. Barboianu, Z. Popovici, C. Bredicean, and M. Ienciu. "Emotional Face Recognition in Bipolar Disorder." European Psychiatry 41, S1 (April 2017): S117. http://dx.doi.org/10.1016/j.eurpsy.2017.01.1904.

Abstract:
Introduction: Emotional face recognition is significant for social communication and is impaired in mood disorders such as bipolar disorder; individuals with bipolar disorder lack the ability to perceive facial expressions. Objectives: To analyse the capacity for emotional face recognition in subjects diagnosed with bipolar disorder. Aims: To establish a correlation between emotion recognition ability and the evolution of bipolar disease. Methods: A sample of 24 subjects diagnosed with bipolar disorder (according to ICD-10 criteria), hospitalised in the Psychiatry Clinic of Timisoara and monitored in the outpatient clinic, was analysed in this trial. Subjects entered the trial based on inclusion/exclusion criteria. The analysed parameters were socio-demographic characteristics (age, gender, education level), the number of relapses, the predominance of manic or depressive episodes, and the ability to identify emotions (Reading the Mind in the Eyes Test). Results: Most of the subjects (79.16%) had a low ability to identify emotions, 20.83% had a normal capacity to recognise emotions, and none had a high emotion recognition capacity. Positive emotions (love, joy, surprise) were recognised more easily, by 75% of the subjects, than negative ones (anger, sadness, fear). There was no evident difference in emotional face recognition between individuals with a predominance of manic episodes and those with mostly depressive episodes, or by number of relapses. Conclusions: Individuals with bipolar disorder have difficulties identifying facial emotions, but with no obvious correlation between the analysed parameters. Disclosure of interest: The authors have not supplied their declaration of competing interest.
6

Malsert, Jennifer, Khanh Tran, Tu Anh Thi Tran, Tho Ha-Vinh, Edouard Gentaz, and Russia Ha-Vinh Leuchter. "Cross-Cultural and Environmental Influences on Facial Emotional Discrimination Sensitivity in 9-Year-Old Children from Swiss and Vietnamese Schools." Swiss Journal of Psychology 79, no. 3-4 (December 2020): 89–99. http://dx.doi.org/10.1024/1421-0185/a000240.

Abstract:
The Other Race Effect (ORE), i.e., recognition facilitation for own-race faces, is a well-established phenomenon with broad evidence in adults and infants. Nevertheless, the ORE in older children is poorly understood, and even less so for emotional face processing. This research sampled 87 9-year-old children from Vietnamese and Swiss schools. In two separate studies, we evaluated the children’s ability to perceive the disappearance of emotions in Asian and Caucasian faces in an offset task. The first study evaluated an “emotional ORE” in Vietnamese-Asian, Swiss-Caucasian, and Swiss-Multicultural children. Offset times showed an emotional ORE in Vietnamese-Asian children living in an ethnically homogeneous environment, whereas the mixed ethnicities around Swiss children seem to have balanced performance between face types. The second study compared socioemotionally trained versus untrained Vietnamese-Asian children. Vietnamese children showed a strong emotional ORE and tended to increase their sensitivity to emotion offset after training. Moreover, an effect of emotion consistent with previous observations in adults could suggest a cultural sensitivity to signs of disapproval. Taken together, the results suggest that 9-year-old children can present an emotional ORE, but that a heterogeneous environment or emotional training can strengthen face-processing abilities without reducing skills for their own group.
7

Kim, Eunye. "The Effects of Classes on the Application of Emotional Regulation Task in Online Classes." Korean Society of Culture and Convergence 45, no. 2 (February 28, 2023): 175–91. http://dx.doi.org/10.33645/cnc.2023.02.45.02.175.

Abstract:
The purpose of this study is to examine the effects on learning flow, self-determination motivation, academic emotional regulation, and self-directed learning ability, which are known to affect learners' positive academic achievement, after applying emotion regulation strategies in non-face-to-face classes. Emotion regulation strategies were applied with 64 college students in a non-face-to-face class for 10 weeks. A paired t-test was then conducted to compare learning flow, self-determination motivation, academic emotional regulation, and self-directed learning ability before and after. As a result, learning flow, academic emotional regulation, and self-directed learning ability significantly increased after the application of emotion regulation tasks in non-face-to-face classes. This study shows that if learners focus more on positive emotions in non-face-to-face classes, it is possible to intervene in the learning and psychological aspects of students.
8

Isomura, Tomoko, and Tamami Nakano. "Automatic facial mimicry in response to dynamic emotional stimuli in five-month-old infants." Proceedings of the Royal Society B: Biological Sciences 283, no. 1844 (December 14, 2016): 20161948. http://dx.doi.org/10.1098/rspb.2016.1948.

Abstract:
Human adults automatically mimic others' emotional expressions, which is believed to contribute to sharing emotions with others. Although this behaviour appears fundamental to social reciprocity, little is known about its developmental process. Therefore, we examined whether infants show automatic facial mimicry in response to others' emotional expressions. Facial electromyographic activity over the corrugator supercilii (brow) and zygomaticus major (cheek) of four- to five-month-old infants was measured while they viewed dynamic clips presenting audiovisual, visual and auditory emotions. The audiovisual bimodal emotion stimuli were a display of a laughing/crying facial expression with an emotionally congruent vocalization, whereas the visual/auditory unimodal emotion stimuli displayed those emotional faces/vocalizations paired with a neutral vocalization/face, respectively. Increased activation of the corrugator supercilii muscle in response to audiovisual cries and the zygomaticus major in response to audiovisual laughter were observed between 500 and 1000 ms after stimulus onset, which clearly suggests rapid facial mimicry. By contrast, both visual and auditory unimodal emotion stimuli did not activate the infants' corresponding muscles. These results revealed that automatic facial mimicry is present as early as five months of age, when multimodal emotional information is present.
9

Curby, Kim M., Kareem J. Johnson, and Alyssa Tyson. "Face to face with emotion: Holistic face processing is modulated by emotional state." Cognition & Emotion 26, no. 1 (January 2012): 93–102. http://dx.doi.org/10.1080/02699931.2011.555752.

10

Evers, Kris, Inneke Kerkhof, Jean Steyaert, Ilse Noens, and Johan Wagemans. "No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces." Autism Research and Treatment 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/345878.

Abstract:
Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.
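For readers who want to see how such hybrid stimuli can be composed, the sketch below joins the upper half of an emotional face with the lower half of a neutral face. It is a minimal Python/OpenCV illustration under the assumption of pre-aligned, equally sized grayscale photographs; the file names are hypothetical, and this is not the authors' stimulus-preparation code.

```python
import cv2
import numpy as np

def make_hybrid(emotional_path: str, neutral_path: str, emotional_on_top: bool = True) -> np.ndarray:
    """Compose a hybrid face: one half emotional, the other neutral."""
    emo = cv2.imread(emotional_path, cv2.IMREAD_GRAYSCALE)
    neu = cv2.imread(neutral_path, cv2.IMREAD_GRAYSCALE)
    if emo is None or neu is None or emo.shape != neu.shape:
        raise ValueError("both images must load and share the same shape")
    mid = emo.shape[0] // 2  # split at the vertical midpoint of the image
    top, bottom = (emo, neu) if emotional_on_top else (neu, emo)
    return np.vstack([top[:mid], bottom[mid:]])

# Hypothetical file names, for illustration only.
hybrid = make_hybrid("happy.png", "neutral.png", emotional_on_top=True)
cv2.imwrite("hybrid_top_happy.png", hybrid)
```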
11

Izen, Sarah C., Hannah E. Lapp, Daniel A. Harris, Richard G. Hunter, and Vivian M. Ciaramitaro. "Seeing a Face in a Crowd of Emotional Voices: Changes in Perception and Cortisol in Response to Emotional Information across the Senses." Brain Sciences 9, no. 8 (July 25, 2019): 176. http://dx.doi.org/10.3390/brainsci9080176.

Abstract:
One source of information we glean from everyday experience, which guides social interaction, is assessing the emotional state of others. Emotional state can be expressed through several modalities: body posture or movements, body odor, touch, facial expression, or the intonation in a voice. Much research has examined emotional processing within one sensory modality or the transfer of emotional processing from one modality to another. Yet, less is known regarding interactions across different modalities when perceiving emotions, despite our common experience of seeing emotion in a face while hearing the corresponding emotion in a voice. Our study examined if visual and auditory emotions of matched valence (congruent) conferred stronger perceptual and physiological effects compared to visual and auditory emotions of unmatched valence (incongruent). We quantified how exposure to emotional faces and/or voices altered perception using psychophysics and how it altered a physiological proxy for stress or arousal using salivary cortisol. While we found no significant advantage of congruent over incongruent emotions, we found that changes in cortisol were associated with perceptual changes. Following exposure to negative emotional content, larger decreases in cortisol, indicative of less stress, correlated with more positive perceptual after-effects, indicative of stronger biases to see neutral faces as happier.
12

Schindler, Sebastian, Maximilian Bruchmann, Anna-Lena Steinweg, Robert Moeck, and Thomas Straube. "Attentional conditions differentially affect early, intermediate and late neural responses to fearful and neutral faces." Social Cognitive and Affective Neuroscience 15, no. 7 (July 2020): 765–74. http://dx.doi.org/10.1093/scan/nsaa098.

Abstract:
The processing of fearful facial expressions is prioritized by the human brain. This priority is maintained across various information processing stages as evident in early, intermediate and late components of event-related potentials (ERPs). However, emotional modulations are inconsistently reported for these different processing stages. In this pre-registered study, we investigated how feature-based attention differentially affects ERPs to fearful and neutral faces in 40 participants. The tasks required the participants to discriminate either the orientation of lines overlaid onto the face, the sex of the face or the face’s emotional expression, increasing attention to emotion-related features. We found main effects of emotion for the N170, early posterior negativity (EPN) and late positive potential (LPP). While N170 emotional modulations were task-independent, interactions of emotion and task were observed for the EPN and LPP. While EPN emotion effects were found in the sex and emotion tasks, the LPP emotion effect was mainly driven by the emotion task. This study shows that early responses to fearful faces are task-independent (N170) and likely based on low-level and configural information while during later processing stages, attention to the face (EPN) or—more specifically—to the face’s emotional expression (LPP) is crucial for reliable amplified processing of emotional faces.
13

Mareeswari V. "Face Emotion Recognition based Recommendation System." ACS Journal for Science and Engineering 2, no. 1 (March 1, 2022): 73–80. http://dx.doi.org/10.34293/acsjse.v2i1.29.

Abstract:
Face recognition technology has gotten a lot of press because of its wide range of applications and market potential. It is used in a variety of fields, including surveillance systems, digital video editing, and other technical advancements. In the fields of tourism, music, video, and film, these systems have overcome the burden of irrelevant knowledge by taking into account user desires and emotional states. Advice systems, emotion recognition, and machine learning are proposed as thematic categories in the analysis. Our vision is to develop a method for recommending new content that is based on the emotional reactions of the viewers. Music is a form of art that is thought to have a stronger connection to a person's emotions. It has the unique potential to boost one's mood, and video streaming services are becoming more prevalent in people's lives, necessitating the development of better video recommendation systems that respond to their users in a customised manner. Furthermore, many users will believe that travel would be a method to help them cope with their ongoing emotions. Our project aims to create a smart travel recommendation system based on the user's emotional state. This project focuses on developing an efficient music, video, movie, and tourism recommendation system that uses Facial Recognition techniques to assess the emotion of users. The system's overall concept is to identify facial expression and provide music, video, and movie recommendations based on the user's mood.
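The last step of such a system, mapping a recognized emotion label to content, can be as simple as a dictionary lookup. The fragment below is a minimal sketch of that step only; the catalog entries and label set are placeholders, not the system described in the article, and the upstream facial-expression classifier is assumed to exist.

```python
from typing import Dict, List

# Placeholder mood-to-content catalog; a real system would query music,
# video, movie, and travel catalogs instead of hard-coded lists.
CATALOG: Dict[str, Dict[str, List[str]]] = {
    "happy":   {"music": ["upbeat pop mix"],  "travel": ["city break"]},
    "sad":     {"music": ["soft acoustic"],   "travel": ["quiet seaside"]},
    "angry":   {"music": ["calm ambient"],    "travel": ["nature retreat"]},
    "neutral": {"music": ["discover weekly"], "travel": ["weekend getaway"]},
}

def recommend(emotion: str, domain: str) -> List[str]:
    """Return content suggestions for a facial emotion detected upstream."""
    return CATALOG.get(emotion, CATALOG["neutral"]).get(domain, [])

print(recommend("sad", "music"))  # ['soft acoustic']
```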
14

With, Stéphane, and Susanne Kaiser. "Sequential Patterning of Facial Actions in the Production and Perception of Emotional Expressions." Swiss Journal of Psychology 70, no. 4 (December 2011): 241–52. http://dx.doi.org/10.1024/1421-0185/a000062.

Abstract:
Despite the fact that the facial expressions of emotions are naturally dynamic social signals, their communicative value has typically been studied using static photographs. In this paper, we focus on the perception of emotions from naturally occurring, dynamic facial displays produced in social interactions. In describing their impressions of 200 video records of spontaneous emotional expressions produced during a face-to-face emotional sharing task, observers were found to agree on five emotional factors: enjoyment, hostility, embarrassment, surprise, and sadness. FACS coding and sequential analysis using a pattern detection algorithm showed that recordings rated high on one emotional factor were characterized by unique sequences of facial actions coordinated with eye and/or gaze actions. Our results suggest that the dynamic unfolding of facial displays and their combination with additional nonverbal signals may play an important and still under-investigated role in emotion perception in face-to-face interactions.
15

Żochowska, Anna, Maria M. Nowicka, Michał J. Wójcik, and Anna Nowicka. "Self-face and emotional faces—are they alike?" Social Cognitive and Affective Neuroscience 16, no. 6 (February 8, 2021): 593–607. http://dx.doi.org/10.1093/scan/nsab020.

Abstract:
The image of one’s own face is a particularly distinctive feature of the self. The self-face differs from other faces not only in respect of its familiarity but also in respect of its subjective emotional significance and saliency. The current study aimed at elucidating similarities/dissimilarities between processing of one’s own face and emotional faces: happy faces (based on the self-positive bias) and fearful faces (because of their high perceptual saliency, a feature shared with the self-face). Electroencephalogram data were collected in a group of 30 participants who performed a simple detection task. Event-related potential analyses indicated significantly increased P3 and late positive potential amplitudes to the self-face in comparison to all other faces: fearful, happy and neutral. Permutation tests confirmed the differences between the self-face and all three types of other faces for numerous electrode sites and in broad time windows. Representational similarity analysis, in turn, revealed distinct processing of the self-face and did not provide any evidence in favour of similarities between the self-face and emotional (either negative or positive) faces. These findings strongly suggest that self-face processing does not resemble that of emotional faces, thus implying that prioritized self-referential processing is driven by the subjective relevance of one’s own face.
16

Levitan, Carmel A., Isabelle Rusk, Danielle Jonas-Delson, Hanyun Lou, Lennon Kuzniar, Gray Davidson, and Aleksandra Sherman. "Mask wearing affects emotion perception." i-Perception 13, no. 3 (May 2022): 20416695221107391. http://dx.doi.org/10.1177/20416695221107391.

Abstract:
To reduce the spread of COVID-19, mask wearing has become ubiquitous in much of the world. We studied the extent to which masks impair emotion recognition and dampen the perceived intensity of facial expressions by naturalistically inducing positive, neutral, and negative emotions in individuals while they were masked and unmasked. Two groups of online participants rated the emotional intensity of each presented image. One group rated full faces (N=104); the other (N=102) rated cropped images where only the upper face was visible. We found that masks impaired the recognition of and rated intensity of positive emotions. This happened even when the faces were cropped and the lower part of the face was not visible. Masks may thus reduce positive emotion and/or expressivity of positive emotion. However, perception of negativity was unaffected by masking, perhaps because unlike positive emotions like happiness which are signaled more in the mouth, negative emotions like anger rely more on the upper face.
17

Iqbal, Muhammad, Bhakti Yudho Suprapto, Hera Hikmarika, Hermawati Hermawati, and Suci Dwijayanti. "Design of Real-Time Face Recognition and Emotion Recognition on Humanoid Robot Using Deep Learning." Jurnal Ecotipe (Electronic, Control, Telecommunication, Information, and Power Engineering) 9, no. 2 (October 6, 2022): 149–58. http://dx.doi.org/10.33019/jurnalecotipe.v9i2.3044.

Abstract:
A robot is capable of mimicking human beings, including recognizing their faces and emotions. However, previous studies of humanoid robots have not been implemented in real-time systems, and face recognition and emotion recognition have been treated as separate problems. For real-time application on a humanoid robot, this study therefore proposed a combination of face recognition and emotion recognition. Face and emotion recognition systems were developed concurrently using convolutional neural network architectures. The proposed architecture was compared to the well-known AlexNet architecture to determine which would be better suited for implementation on a humanoid robot. Primary data from 30 respondents were used for face recognition. Meanwhile, emotion data were collected from the same respondents and combined with secondary data from a 2500-person dataset. The emotions were surprise, anger, neutral, smile, and sadness. The experiment was carried out in real time on a humanoid robot using the two architectures. Using the AlexNet model, the accuracy of face and emotion recognition was 87% and 70%, respectively. Meanwhile, the proposed architecture achieved accuracy rates of 95% for face recognition and 75% for emotion recognition. Thus, the proposed method performs better at recognizing faces and emotions and can be implemented on a humanoid robot.
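As a starting point for experimentation, a compact convolutional classifier of the general kind compared against AlexNet here can be sketched in Keras. The input resolution, layer sizes, and five emotion classes below are assumptions for illustration; they are not the architecture reported in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_emotion_cnn(input_shape=(48, 48, 1), n_classes=5) -> tf.keras.Model:
    """Small CNN for emotion classes such as surprise, anger, neutral,
    smile, and sadness; all hyperparameters are illustrative."""
    model = tf.keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_emotion_cnn()
model.summary()
```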
18

Meriyati, Meriyati. "Peran Orang Tua dalam Mengembangkan Kecerdasan Emosional Anak." KONSELI : Jurnal Bimbingan dan Konseling (E-Journal) 1, no. 1 (February 6, 2018): 29–34. http://dx.doi.org/10.24042/kons.v1i1.311.

Abstract:
One effort that can be made is to prepare both cognitive intelligence and emotional intelligence. Emotional intelligence is aimed at the effort to recognize, understand and express emotions in the right proportion, and to manage emotions so that they are controlled and can be used to solve the various problems faced later. Through IQ people can face the real world, but individuals need emotions in order to understand and confront themselves and ultimately be able to face others; it is therefore important to build children's emotional intelligence in preparation for the future. By having emotional intelligence, a person will better understand and appreciate feelings in themselves and others and can apply this effectively in everyday life.
19

Barat, Elodie, Sylvia Wirth, and Jean-René Duhamel. "Face cells in orbitofrontal cortex represent social categories." Proceedings of the National Academy of Sciences 115, no. 47 (November 5, 2018): E11158—E11167. http://dx.doi.org/10.1073/pnas.1806165115.

Abstract:
Perceiving social and emotional information from faces is a critical primate skill. For this purpose, primates evolved dedicated cortical architecture, especially in occipitotemporal areas, utilizing face-selective cells. Face-selective neurons are also present in the orbitofrontal cortex (OFC) but are less well understood; they are the object of this study. We examined 179 face-selective cells in the lateral sulcus of the OFC by characterizing their responses to a rich set of photographs of conspecific faces varying in age, gender, and facial expression. Principal component analysis and unsupervised cluster analysis of stimulus space both revealed that face cells encode face dimensions for social categories and emotions. Categories represented strongly were facial expressions (grin and threat versus lip smack), juvenile, and female monkeys. Cluster analyses of a control population of nearby cells lacking face selectivity did not categorize face stimuli in a meaningful way, suggesting that only face-selective cells directly support face categorization in OFC. Time course analyses of face cell activity from stimulus onset showed that faces were discriminated from nonfaces early, followed by within-face categorization for social and emotion content (i.e., young and facial expression). Face cells revealed no response to acoustic stimuli such as vocalizations and were poorly modulated by vocalizations added to faces. Neuronal responses remained stable when paired with positive or negative reinforcement, implying that face cells encode social information but not learned reward value associated with faces. Overall, our results shed light on a substantial role of the OFC in the characterization of facial information bearing on social and emotional behavior.
20

Surcinelli, Paola, Bruno Baldaro, Antonio Balsamo, Roberto Bolzani, Monia Gennari, and Nicolino C. F. Rossi. "Emotion Recognition and Expression in Young Obese Participants: Preliminary Study." Perceptual and Motor Skills 105, no. 2 (October 2007): 477–82. http://dx.doi.org/10.2466/pms.105.2.477-482.

Abstract:
This study of the presence of alexithymic characteristics in obese adolescents and preadolescents tested the hypothesis of whether they showed impaired recognition and expression of emotion. The sample included 30 obese young participants and a control group of 30 participants of normal weight for their ages. Stimuli, 42 faces representing seven emotional expressions, were shown to participants who identified the emotion expressed in the face. The Level of Emotional Awareness Scale was adapted for children to evaluate their ability to describe their emotions. Young obese participants had significantly lower scores than control participants, but no differences were found in recognition of emotion. The lack of words to describe emotions might suggest a greater prevalence of alexithymic characteristics in the obese participants, but the hypothesis of a general deficit in the processing of emotional experiences was not supported.
21

Ma, Xueling, and Entao Zhang. "The influence of social power on neural responses to emotional conflict." PeerJ 9 (April 12, 2021): e11267. http://dx.doi.org/10.7717/peerj.11267.

Abstract:
Background: Major power theories assume that social power can play an important role in an individual’s goal-related behaviors. However, the specific psychological mechanisms through which this occurs remain unclear. Some studies suggest that having power enhances individuals’ goal-related behaviors; by contrast, other studies suggest that low-power individuals show greater performance in goal-directed tasks. We were particularly interested in how social power changes individuals’ goal-related behaviors during an emotional face-word Stroop task. Method: Social power was primed by asking participants to recall a past situation in which they were in a position of power (high-power individuals) or a situation in which they were lacking power (low-power individuals). Afterward, participants completed an emotional face-word Stroop task. In the task, words representing specific emotions were written in a prominent red color across a face, and these words and facial expressions were either congruent or incongruent. The participant’s task was to judge the emotion of the face while ignoring the red emotional words. Results: Our behavioral data showed that individuals displayed faster reaction times and better accuracy in congruent conditions, and slower reaction times for fearful faces and worse accuracy for happy faces in both incongruent and congruent conditions. The event-related potential analyses showed that, compared with low-power individuals, high-power individuals showed greater P1 amplitudes when faced with emotional stimuli (both incongruent and congruent conditions), indicating that power affects individuals’ attention in the early sensory processing of emotional stimuli. For the N170 component, low-power individuals showed more negative amplitudes when facing emotional stimuli, indicating that low-power individuals paid more attention to the structural information of emotional stimuli. For the N450 component, incongruent conditions elicited more negative amplitudes than congruent conditions for both high- and low-power individuals. More importantly, fearful faces provoked enhanced P1 amplitudes in incongruent conditions relative to congruent conditions only for low-power individuals, while happy faces elicited larger P1 amplitudes in congruent conditions than in incongruent conditions only for high-power individuals. These findings suggest that during the initial stage of stimulus processing, low-power individuals are more sensitive to negative stimuli than high-power individuals. Conclusion: These findings provide electrophysiological evidence that the differences in emotional conflict processing between high- and low-power individuals lie mainly in the early processing stages of emotional information. Furthermore, evidence from the P1 and N170 showed that there was also a redistribution of attentional resources in low-power individuals.
22

Rich, Brendan A., Mary E. Grimley, Mariana Schmajuk, Karina S. Blair, R. J. R. Blair, and Ellen Leibenluft. "Face emotion labeling deficits in children with bipolar disorder and severe mood dysregulation." Development and Psychopathology 20, no. 2 (2008): 529–46. http://dx.doi.org/10.1017/s0954579408000266.

Abstract:
Children with narrow phenotype bipolar disorder (NP-BD; i.e., history of at least one hypomanic or manic episode with euphoric mood) are deficient when labeling face emotions. It is unknown if this deficit is specific to particular emotions, or if it extends to children with severe mood dysregulation (SMD; i.e., chronic irritability and hyperarousal without episodes of mania). Thirty-nine NP-BD, 31 SMD, and 36 control subjects completed the emotional expression multimorph task, which presents gradations of facial emotions from 100% neutrality to 100% emotional expression (happiness, surprise, fear, sadness, anger, and disgust). Groups were compared in terms of intensity of emotion required before identification occurred and accuracy. Both NP-BD and SMD youth required significantly more morphs than controls to label correctly disgusted, surprised, fearful, and happy faces. Impaired face labeling correlated with deficient social reciprocity skills in NP-BD youth and dysfunctional family relationships in SMD youth. Compared to controls, patients with NP-BD or SMD require significantly more intense facial emotion before they are able to label the emotion correctly. These deficits are associated with psychosocial impairments. Understanding the neural circuitry associated with face-labeling deficits has the potential to clarify the pathophysiology of these disorders.
23

Farkhod, Akhmedov, Akmalbek Bobomirzaevich Abdusalomov, Mukhriddin Mukhiddinov, and Young-Im Cho. "Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces." Sensors 22, no. 22 (November 11, 2022): 8704. http://dx.doi.org/10.3390/s22228704.

Abstract:
Owing to the availability of a wide range of emotion recognition applications in our lives, such as for mental status calculation, the demand for high-performance emotion recognition approaches remains uncertain. Nevertheless, the wearing of facial masks has been indispensable during the COVID-19 pandemic. In this study, we propose a graph-based emotion recognition method that adopts landmarks on the upper part of the face. Based on the proposed approach, several pre-processing steps were applied, after which facial expression features were extracted from facial key points. The main steps of emotion recognition on masked faces include face detection using a Haar cascade, landmark implementation through a MediaPipe face mesh model, and model training on seven emotional classes. The FER-2013 dataset was used for model training. An emotion detection model was developed for non-masked faces, and landmarks were then applied to the upper part of the face. After faces were detected and landmark locations extracted, we captured the coordinates of emotional-class landmarks and exported them to a comma-separated values (CSV) file. Model weights were then transferred to the emotional classes. Finally, a landmark-based emotion recognition model for the upper facial parts was tested both on images and in real time using a web camera application. The results showed that the proposed model achieved an overall accuracy of 91.2% for seven emotional classes in the case of an image application; image-based emotion detection with the proposed model showed relatively higher accuracy than real-time emotion detection.
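The landmark-extraction step can be approximated with off-the-shelf tools. The sketch below uses MediaPipe's face mesh (which performs its own face detection, whereas the paper pairs a Haar cascade with the mesh) to collect upper-face landmark coordinates and append them to a CSV row. The landmark indices, emotion label, and file name are illustrative assumptions, not the paper's exact configuration.

```python
import csv
import cv2
import mediapipe as mp

# Assumed eye/brow landmark indices in the 468-point face mesh;
# the paper's own upper-face selection may differ.
UPPER_FACE_IDS = [33, 133, 159, 145, 362, 263, 386, 374, 70, 300]

face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1)

def upper_face_landmarks(image_path: str):
    """Return normalized (x, y) coordinates of the upper-face landmarks."""
    img = cv2.imread(image_path)
    if img is None:
        return None
    res = face_mesh.process(cv2.cvtColor(img, cv2.COLOR_BGR2RGB))
    if not res.multi_face_landmarks:
        return None
    lms = res.multi_face_landmarks[0].landmark
    return [(lms[i].x, lms[i].y) for i in UPPER_FACE_IDS]

# One CSV row per image: class label followed by flattened coordinates.
with open("landmarks.csv", "w", newline="") as f:
    writer = csv.writer(f)
    coords = upper_face_landmarks("masked_face.jpg")  # hypothetical file
    if coords is not None:
        writer.writerow(["happy"] + [v for xy in coords for v in xy])
```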
24

Dove, Brienna, and Teddi S. Deka. "Mask-Wearing and Emotional Intensity Perceptions." Psi Chi Journal of Psychological Research 28, no. 1 (2023): 38–45. http://dx.doi.org/10.24839/2325-7342.jn28.1.38.

Abstract:
The widespread use of sanitary face masks due to the COVID-19 pandemic renewed interest in facial emotion perception while wearing masks. Usage of face masks during this time was considered a nonpharmaceutical intervention to mitigate the spread of the virus. We examined whether wearing face masks affects the intensity of emotion perception and judgments of approachability while also considering the sex and age of the rater. Emerging adults (ages 18–25, M = 19.5, 16 men, 44 women, n = 60) and adults (ages 26–65, M = 44.8, 10 men, 31 women, n = 41) viewed photos of faces of a young adult man, a middle-aged man, a young adult woman, and a middle-aged woman, masked and unmasked, with happy, sad, neutral, and angry expressions. ANOVAs with repeated measures on the face mask and no face mask conditions showed significant reductions in emotion intensity for happy (p < .001, η2 = .22) and sad faces (p < .001, η2 = .35), no differences for angry faces (p = .16, η2 = .02), and the opposite (increased intensity) for neutral faces (p = .03, η2 = .02). Unmasked happy faces were rated as more approachable than masked happy faces. Unmasked angry faces were rated as less approachable than masked angry faces, but only by emerging adults. No differences appeared for sad emotions. Neutral faces again showed an unexpected pattern, with masks increasing approachability.
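A repeated-measures ANOVA of this kind can be reproduced in Python with statsmodels. The snippet below is a minimal sketch using made-up ratings in long format (one row per participant per mask condition); it is not the authors' analysis script, and it reports F and p rather than η2.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Made-up ratings: each participant rates the same faces masked and unmasked.
df = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "mask":    ["masked", "unmasked"] * 4,
    "rating":  [3.1, 4.5, 2.8, 4.2, 3.4, 4.8, 3.0, 4.1],
})

# One within-subject factor (mask condition); prints the F-test table.
res = AnovaRM(df, depvar="rating", subject="subject", within=["mask"]).fit()
print(res)
```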
25

Kalyta, Oleg. "Information Technology of Facial Emotion Recognition for Visual Safety Surveillance." Computer Systems and Information Technologies, no. 1 (April 14, 2022): 54–61. http://dx.doi.org/10.31891/csit-2022-1-7.

Abstract:
Emotional expressions serve a crucial role in interpersonal communication between people and in social life. Information safety systems for visual surveillance that aim to recognize human emotional facial states are therefore highly relevant today. This study addresses the problem of identifying the main facial criteria of emotional expression so that emotions can be recognized without specialized equipment, for example, with low-resolution security surveillance cameras. We propose information technology to define the areas of the face that reproduce its emotional look. The input data for the proposed information technology is a set of videos with detected faces on which the primary emotional states are reproduced. First, the face images are normalized so they can be compared on a common basis; this is done by centering the face area and normalizing the distance between the eyes. Based on an analysis of how point features move across the set of input images, informative points are then selected, i.e., those points whose movement during an emotional expression is most significant. At the final stage, the areas of the face (with different bias thresholds) whose changes form the visual perception of emotions are determined, and a set of possible states is formed for each selected region. In conclusion, the behavior of point features under specific emotions is explored experimentally, and qualitative indicators for these emotions are highlighted. According to the study results, a software product can be created based on qualitative criteria for assessing the main areas of the face to determine the mimic expression of emotions.
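The point-feature analysis described in this abstract can be prototyped with classical tracking. The example below uses OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow to accumulate per-point displacement over a clip and keep the most mobile points; the clip name, detector parameters, and percentile threshold are assumptions, not values from the paper.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("expression_clip.mp4")  # hypothetical input clip
ok, prev = cap.read()
if not ok:
    raise RuntimeError("could not read the first frame")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Detect trackable point features on the first (face) frame.
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=5)
total_motion = np.zeros(len(pts))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    step = np.linalg.norm((nxt - pts).reshape(-1, 2), axis=1)
    total_motion += np.where(status.ravel() == 1, step, 0.0)  # skip lost points
    prev_gray, pts = gray, nxt
cap.release()

# Keep the points that moved most during the expression (assumed threshold).
informative = total_motion > np.percentile(total_motion, 75)
print(f"{int(informative.sum())} informative points out of {len(pts)}")
```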
26

Pomarol-Clotet, E., F. Hynes, C. Ashwin, E. T. Bullmore, P. J. McKenna, and K. R. Laws. "Facial emotion processing in schizophrenia: a non-specific neuropsychological deficit?" Psychological Medicine 40, no. 6 (September 24, 2009): 911–19. http://dx.doi.org/10.1017/s0033291709991309.

Abstract:
Background: Identification of facial emotions has been found to be impaired in schizophrenia, but there are uncertainties about the neuropsychological specificity of the finding. Method: Twenty-two patients with schizophrenia and 20 healthy controls were given tests requiring identification of facial emotion, judgement of the intensity of emotional expressions without identification, familiar face recognition and the Benton Facial Recognition Test (BFRT). The schizophrenia patients were selected to be relatively intellectually preserved. Results: The patients with schizophrenia showed no deficit in identifying facial emotion, although they were slower than the controls. They were, however, impaired on judging the intensity of emotional expression without identification. They showed impairment in recognizing familiar faces but not on the BFRT. Conclusions: When steps are taken to reduce the effects of general intellectual impairment, there is no deficit in identifying facial emotions in schizophrenia. There may, however, be a deficit in judging emotional intensity. The impairment found in naming familiar faces is consistent with other evidence of semantic memory impairment in the disorder.
27

Liu, Shiguang, Huixin Wang, and Min Pei. "Facial-expression-aware Emotional Color Transfer Based on Convolutional Neural Network." ACM Transactions on Multimedia Computing, Communications, and Applications 18, no. 1 (January 31, 2022): 1–19. http://dx.doi.org/10.1145/3464382.

Abstract:
Emotional color transfer aims to change the evoked emotion of a source image to that of a target image by adjusting color distribution. Most existing emotional color transfer methods consider only the low-level visual features of an image and ignore facial expression features when the image contains a human face, which can cause incorrect emotion evaluation for the given image. In addition, previous emotional color transfer methods may easily create ambiguity between the emotion of the resulting image and that of the target image. For example, if the background of the target image is dark while the facial expression is happiness, then previous methods would directly transfer dark color to the source image, neglecting the facial emotion in the image. To solve this problem, we propose a new facial-expression-aware emotional color transfer framework. Given a target image with facial expression features, we first predict the facial emotion label of the image through an emotion classification network. Then, facial emotion labels are matched with pre-trained emotional color transfer models. Finally, we use the matched emotion model to transfer the color of the target image to the source image. Since none of the existing emotion image databases focus on images that contain both a face and a background, we built an emotion database for our new emotional color transfer framework, called the “Face-Emotion database.” Experiments demonstrate that our method can successfully capture and transfer facial emotions, outperforming state-of-the-art methods.
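As a baseline for what "adjusting color distribution" means in practice, the classic global transfer of Reinhard et al. matches per-channel statistics in Lab space. The function below is a deliberately simple sketch of that operation, not the facial-expression-aware CNN framework proposed in the paper; the file names are hypothetical.

```python
import cv2
import numpy as np

def lab_stats_transfer(source_bgr: np.ndarray, target_bgr: np.ndarray) -> np.ndarray:
    """Shift the source image's Lab mean/std toward the target image's."""
    src = cv2.cvtColor(source_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    tgt = cv2.cvtColor(target_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    s_mean, s_std = src.mean(axis=(0, 1)), src.std(axis=(0, 1)) + 1e-6
    t_mean, t_std = tgt.mean(axis=(0, 1)), tgt.std(axis=(0, 1))
    out = (src - s_mean) / s_std * t_std + t_mean
    return cv2.cvtColor(np.clip(out, 0, 255).astype(np.uint8), cv2.COLOR_LAB2BGR)

# Hypothetical file names, for illustration only.
result = lab_stats_transfer(cv2.imread("source.jpg"), cv2.imread("target.jpg"))
cv2.imwrite("recolored.jpg", result)
```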
28

Lapate, Regina C., Jason Samaha, Bas Rokers, Hamdi Hamzah, Bradley R. Postle, and Richard J. Davidson. "Inhibition of Lateral Prefrontal Cortex Produces Emotionally Biased First Impressions: A Transcranial Magnetic Stimulation and Electroencephalography Study." Psychological Science 28, no. 7 (June 14, 2017): 942–53. http://dx.doi.org/10.1177/0956797617699837.

Abstract:
Optimal functioning in everyday life requires the ability to override reflexive emotional responses and prevent affective spillover to situations or people unrelated to the source of emotion. In the current study, we investigated whether the lateral prefrontal cortex (lPFC) causally regulates the influence of emotional information on subsequent judgments. We disrupted left lPFC function using transcranial magnetic stimulation (TMS) and recorded electroencephalography (EEG) before and after. Subjects evaluated the likeability of novel neutral faces after a brief exposure to a happy or fearful face. We found that lPFC inhibition biased evaluations of novel faces according to the previously processed emotional expression. Greater frontal EEG alpha power, reflecting increased inhibition by TMS, predicted increased behavioral bias. TMS-induced affective misattribution was long-lasting: Emotionally biased first impressions formed during lPFC inhibition were still detectable outside of the laboratory 3 days later. These findings indicate that lPFC serves an important emotion-regulation function by preventing incidental emotional encoding from automatically biasing subsequent appraisals.
29

Keillor, Jocelyn M., Anna M. Barrett, Gregory P. Crucian, Sarah Kortenkamp, and Kenneth M. Heilman. "Emotional experience and perception in the absence of facial feedback." Journal of the International Neuropsychological Society 8, no. 1 (January 2002): 130–35. http://dx.doi.org/10.1017/s1355617701020136.

Abstract:
The facial feedback hypothesis suggests that facial expressions are either necessary or sufficient to produce emotional experience. Researchers have noted that the ideal test of the necessity aspect of this hypothesis would be an evaluation of emotional experience in a patient suffering from a bilateral facial paralysis; however, this condition is rare and no such report has been documented. We examined the role of facial expressions in the determination of emotion by studying a patient (F.P.) suffering from a bilateral facial paralysis. Despite her inability to convey emotions through facial expressions, F.P. reported normal emotional experience. When F.P. viewed emotionally evocative slides her reactions were not dampened relative to the normative sample. F.P. retained her ability to detect, discriminate, and image emotional expressions. These findings are not consistent with theories stating that feedback from an active face is necessary to experience emotion, or to process emotional facial expressions. (JINS, 2002, 8, 130–135.)
30

Pioggia, Giovanni, Arti Ahluwalia, Federico Carpi, Andrea Marchetti, Marcello Ferro, Walter Rocchia, and Danilo De Rossi. "FACE: Facial Automaton for Conveying Emotions." Applied Bionics and Biomechanics 1, no. 2 (2004): 91–100. http://dx.doi.org/10.1155/2004/153078.

Abstract:
The human face is the main organ of expression, capable of transmitting emotions that are almost instantly recognised by fellow beings. In this paper, we describe the development of a lifelike facial display based on the principles of biomimetic engineering. A number of paradigms that can be used for developing believable emotional displays, borrowing from elements of anthropomorphic mechanics and control, and materials science, are outlined. These are used to lay down the technological and philosophical premises necessary to construct a man-machine interface for expressing emotions through a biomimetic mechanical head. Applications in therapy to enhance social skills and understanding emotion in people with autism are discussed.
31

Mandal, Manas K., and Nalini Ambady. "Laterality of Facial Expressions of Emotion: Universal and Culture-Specific Influences." Behavioural Neurology 15, no. 1-2 (2004): 23–34. http://dx.doi.org/10.1155/2004/786529.

Abstract:
Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.
32

Kidron, Yael, and Ron Kuzar. "My face is paling against my will." Pragmatics and Cognition 10, no. 1-2 (July 11, 2002): 129–57. http://dx.doi.org/10.1075/pc.10.1-2.07kid.

Abstract:
Various syntactical forms may be used for presenting an emotional event. The choice of a grammatical form may be related to cultural, social and personal attitudes towards the nature of emotions. One of the cases in which the consistency of choices is evident is the description of bodily changes during an emotional event. In one possible syntactic style, the human experiencer is in the center of attention when a somatic change takes place, or the experiencer actively produces the vocal or facial communicative act. In a different syntactic style, the focus is on a body part or a physical sensation, which arises spontaneously and independently of the person’s will. Examples of translations from English into Hebrew and from Hebrew into English exemplify the syntactical alternatives. An empirical study is presented that links syntactic scripts to different emotion scenes.
33

Anderson, Ian M., Clare Shippen, Gabriella Juhasz, Diana Chase, Emma Thomas, Darragh Downey, Zoltan G. Toth, Kathryn Lloyd-Williams, Rebecca Elliott, and J. F. William Deakin. "State-dependent alteration in face emotion recognition in depression." British Journal of Psychiatry 198, no. 4 (April 2011): 302–8. http://dx.doi.org/10.1192/bjp.bp.110.078139.

Abstract:
Background: Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. Aims: To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. Method: The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. Results: In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Conclusions: Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.
34

Tian, Wenqiang. "Personalized Emotion Recognition and Emotion Prediction System Based on Cloud Computing." Mathematical Problems in Engineering 2021 (May 26, 2021): 1–10. http://dx.doi.org/10.1155/2021/9948733.

Abstract:
The continuous improvement of cloud computing technology and the rapid expansion of its applications have much to do with promoting economic development and improving people’s quality of life. Emotions play an important role in all aspects of human life, and it is difficult to remove the influence of inner emotions from people’s behavior and reasoning. This article studies a personalized emotion recognition and emotion prediction system based on cloud computing and proposes a method of intelligently identifying users’ emotional states through its use. First, an emotion induction experiment is designed to induce the testers’ positive, neutral, and negative basic emotional states and to collect cloud data and EEG under the different emotional states. Then, the cloud data is processed and analyzed to extract emotional features. After that, a facial emotion prediction system is constructed based on a cloud computing data model, consisting of face detection and facial emotion recognition. The system uses the SVM algorithm for face detection, uses a temporal feature algorithm for facial emotion analysis, and finally uses a machine learning classification method to classify emotions, realizing the goal of identifying the user’s emotional state through cloud computing technology. Experimental data show that the EEG signal emotion recognition method based on time-domain features performs best, has better generalization ability, and improves on traditional methods by 6.3%. The experimental results show that the personalized emotion recognition method based on cloud computing is more effective than traditional methods.
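The classification stage at the end of this pipeline is conventional supervised learning. The sketch below trains an RBF-kernel SVM on synthetic vectors standing in for time-domain features, using scikit-learn; nothing here reproduces the paper's actual data or cloud components.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic placeholder features; real input would be time-domain features
# extracted from the collected cloud/EEG data.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 16))        # 300 samples, 16 features each
y = rng.integers(0, 3, size=300)      # 0=negative, 1=neutral, 2=positive

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```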
APA, Harvard, Vancouver, ISO, and other styles
35

Buluk, Katarzyna, and Celina Timoszyk-Tomczak. "„Co wyraża twarz?” – rozpoznawanie ekspresji emocjonalnej twarzy przez osoby głuche i słyszące." Psychologia Rozwojowa 25, no. 4 (2020): 101–10. http://dx.doi.org/10.4467/20843879pr.20.030.13438.

Full text
Abstract:
„What does the Face Express?” – Recognition of Emotional Facial Expressions in Deaf and Hearing People. An analysis of the emotional functioning of deaf people is important for understanding their activities in different areas of life. Emotional functioning is related to emotional intelligence, which involves emotion perception and recognition as well as emotional expressiveness. The aim of the study was to compare the ability to recognize facial emotional expression among deaf and hearing people. The present study was conducted on 80 individuals (40 deaf people and 40 hearing people). The Emotional Intelligence Scale – Faces (Matczak, Piekarska, Studniarek, 2005) and a set of photographs used by Paul Ekman in his study of basic emotions were used for data collection. The results obtained show that deaf people differ from hearing people in recognizing facial expressions. The analysis was conducted in terms of differences in the recognition of basic and complex emotions. The study included variables such as the moment of hearing loss (congenital or acquired deafness) and upbringing with deaf or hearing parents.
APA, Harvard, Vancouver, ISO, and other styles
36

Bonfiglio, Natale Salvatore, Roberta Renati, and Gabriella Bottini. "Decoding Emotion in Drug Abusers: Evidence for Face and Body Emotion Recognition and for Disgust Emotion." European Journal of Investigation in Health, Psychology and Education 12, no. 9 (September 17, 2022): 1427–40. http://dx.doi.org/10.3390/ejihpe12090099.

Full text
Abstract:
Background: Different drugs damage the frontal cortices, particularly the prefrontal areas involved in both emotional and cognitive functions, with the consequence that people with substance abuse show deficits in decoding emotions. The present study aimed to explore cognitive impairments in drug abusers through facial, body and disgust emotion recognition, expanding the investigation of emotion processing by measuring both accuracy and response speed. Methods: We enrolled 13 patients addicted to cocaine and 12 addicted to alcohol, all attending treatment services in Italy, and compared them with 33 matched controls. Facial emotion and body posture recognition tasks, a disgust rating task and the Barratt Impulsiveness Scale were included in the experimental assessment. Results: We found that emotional processes are differently influenced by cocaine and alcohol, suggesting that these substances impact diverse cerebral systems. Conclusions: Drug abusers seem to be less accurate in the elaboration of facial, body and disgust emotions. Considering that the participants were not cognitively impaired, our data support the hypothesis that emotional impairments emerge independently of damage to cognitive functions.
APA, Harvard, Vancouver, ISO, and other styles
37

Balconi, Michela, and Ylenia Canavesio. "Empathy, Approach Attitude, and rTMS on Left DLPFC Affect Emotional Face Recognition and Facial Feedback (EMG)." Journal of Psychophysiology 30, no. 1 (January 2016): 17–28. http://dx.doi.org/10.1027/0269-8803/a000150.

Full text
Abstract:
Abstract. Empathic trait (Balanced Emotional Empathy Scale [BEES]) and emotional attitude (Behavior Activation System [BAS]) were hypothesized to modulate emotional face recognition, based on the contribution of the left dorsolateral prefrontal cortex (DLPFC). High-empathic-trait (high-BEES) participants were compared with low-empathic-trait (low-BEES) participants on detection performance (Accuracy Index; Response Times [RTs]) and facial activity (electromyography, EMG, i.e., zygomatic and corrugator muscle activity). Moreover, the implication of the left DLPFC was tested by using low-frequency rTMS (repetitive Transcranial Magnetic Stimulation) to induce a decreased response to facial expressions of emotion while subjects (N = 46) were required to empathize with the emotional stimuli. EMG and behavioral responses were modulated by BEES and BAS, with decreased performance and reduced facial responsiveness to happiness for high-BEES and high-BAS participants in the case of TMS over the left DLPFC. Second, an emotion-specific effect was found: the DLPFC effect was observed for the positive emotion (happiness) more than for the negative emotions (anger and fear), with decreased performance (lower Accuracy Index [AI] and higher RTs) and decreased zygomatic muscle activity. Finally, BEES and BAS were directly correlated, and the latter was predictive (regression analysis) of the behavioral and EMG modulation induced by TMS. These results suggest a significant effect of the empathic trait and emotional attitude components at both the EMG and the behavioral level in emotional face recognition, a mechanism that appears to be supported and regulated by the DLPFC. The lateralization (left) effect is discussed in light of the valence model of emotions.
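The final regression claim (BAS scores predicting TMS-induced modulation) is an ordinary single-predictor regression. As a generic illustration with synthetic numbers (the variable names and values are invented, not the study's data):

    # Generic illustration: regress a TMS-induced change score on a trait score.
    # All numbers are synthetic; only the sample size (N = 46) matches the study.
    import numpy as np

    rng = np.random.default_rng(1)
    bas = rng.normal(50, 10, size=46)                      # hypothetical BAS scores
    modulation = 0.04 * bas + rng.normal(0, 1, size=46)    # hypothetical change scores
    slope, intercept = np.polyfit(bas, modulation, deg=1)  # least-squares fit
    r = np.corrcoef(bas, modulation)[0, 1]
    print(f"slope = {slope:.3f}, r = {r:.2f}")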
APA, Harvard, Vancouver, ISO, and other styles
38

Wegrzyn, Martin, Maria Vogt, Berna Kireclioglu, Julia Schneider, and Johanna Kissler. "Mapping the emotional face. How individual face parts contribute to successful emotion recognition." PLOS ONE 12, no. 5 (May 11, 2017): e0177239. http://dx.doi.org/10.1371/journal.pone.0177239.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

KIRITA, Takahiro. "Searching an infant or adult face in emotional faces." Proceedings of the Annual Convention of the Japanese Psychological Association 74 (September 20, 2010): 2AM126. http://dx.doi.org/10.4992/pacjpa.74.0_2am126.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Riberto, Martina, Deborah Talmi, and Gorana Pobric. "Symmetry in Emotional and Visual Similarity between Neutral and Negative Faces." Symmetry 13, no. 11 (November 4, 2021): 2091. http://dx.doi.org/10.3390/sym13112091.

Full text
Abstract:
Is Mr. Hyde more similar to his alter ego Dr. Jekyll, because of their physical identity, or to Jack the Ripper, because both evoke fear and loathing? The relative weight of emotional and visual dimensions in similarity judgements is still unclear. We expected an asymmetric effect of these dimensions on similarity perception, such that faces that express the same or a similar feeling would be judged as more similar than different emotional expressions of the same person. We selected 10 male faces with different expressions. Each face posed one neutral expression and one emotional expression (five disgust, five fear). We paired these expressions, resulting in 190 pairs varying in emotional expression, physical identity, or both. Twenty healthy participants rated the similarity of the paired faces on a 7-point scale. We report a symmetric effect of emotional expression and identity on similarity judgements, suggesting that people may perceive Mr. Hyde to be just as similar to Dr. Jekyll (identity) as to Jack the Ripper (emotion). We also observed that an emotional mismatch decreased perceived similarity, suggesting that emotions play a prominent role in similarity judgements. From an evolutionary perspective, poor discrimination between emotional stimuli might endanger the individual.
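The stimulus count follows from simple combinatorics: 10 identities × 2 expressions give 20 photographs, and every unordered pair of distinct photographs gives C(20, 2) = 190 pairs. A quick check (the labels are mine, not the authors'):

    # Verify the 190 pairs: 10 identities, each photographed twice.
    from itertools import combinations, product

    photos = [f"id{i}_{expr}"
              for i, expr in product(range(10), ["neutral", "emotional"])]
    pairs = list(combinations(photos, 2))
    print(len(photos), len(pairs))  # 20 190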
APA, Harvard, Vancouver, ISO, and other styles
41

Conte, Stefania, Viola Brenna, Paola Ricciardelli, and Chiara Turati. "The nature and emotional valence of a prime influences the processing of emotional faces in adults and children." International Journal of Behavioral Development 42, no. 6 (March 7, 2018): 554–62. http://dx.doi.org/10.1177/0165025418761815.

Full text
Abstract:
A large body of research has investigated both the emotional elaboration of facial stimuli in adults and the development of children’s recognition of emotional expressions. Yet, it is still not clear whether children’s ability to recognize an emotional face may be modulated by prior exposure to a different face, and whether an emotional expression may exert an effect on the processing of subsequently encountered facial emotional expressions. In three experiments using an affective priming task, we tested adults’ and 7- and 5-year-old children’s recognition of happy and angry target faces preceded by neutral faces or objects (Experiment 1) and by happy or angry faces (Experiments 2A and 2B). Results showed a standard prime effect for neutral faces (Experiment 1) in all participants, and for happy faces in children (Experiment 2A) and adults (Experiment 2B). In contrast, angry faces elicited negative priming effects in all participants (Experiment 2A). Overall, our findings show that both prior exposure to a face per se and the emotional valence of the prime face have an impact on subsequent processing of facial emotional information. Implications for emotional processing are discussed.
APA, Harvard, Vancouver, ISO, and other styles
42

Carbon, Claus-Christian, and Martin Serrano. "The Impact of Face Masks on the Emotional Reading Abilities of Children—A Lesson From a Joint School–University Project." i-Perception 12, no. 4 (July 2021): 204166952110382. http://dx.doi.org/10.1177/20416695211038265.

Full text
Abstract:
Wearing face masks has become usual practice during acute infection events, introducing the problem of misinterpreting the emotions of others. Empirical evidence about face masks mainly relies on adult data, neglecting, for example, school kids, who depend firmly on effective nonverbal communication. Here we offer insights from a joint school–university project. Data indicate that the emotional reading of 9- to 10-year-old pupils (N = 57) was impaired to a similar degree as that of adults at the overall performance level, but that their selective performance on specific emotions was quite different. Kids showed extreme problems in reading the emotion disgust, strong effects for fear and sadness, only mild effects for happiness, and even better performance for the emotional states anger and neutral when faces were masked. The project not only gained relevant data about children’s perception but also made clear how fruitful seriously conducted school projects can be in encouraging interest in and commitment to Science, Technology, Engineering, and Mathematics (STEM) topics.
APA, Harvard, Vancouver, ISO, and other styles
43

Gallmeier, Charles P. "Putting on the Game Face: The Staging of Emotions in Professional Hockey." Sociology of Sport Journal 4, no. 4 (December 1987): 347–62. http://dx.doi.org/10.1123/ssj.4.4.347.

Full text
Abstract:
Dramaturgical analysis is used in a participant observation study of the emotional performances of professional hockey players before, during, and after professional games. The structure for the staging of these emotional displays is briefly described, but much more attention is placed on understanding the social processes involved in the mental or emotional preparation the players undergo in getting psyched up and “putting on the game face.” The staging of emotions is seen to evolve from expectation for emotional experience, to diffuse emotional readiness, and finally to quite specific emotional displays. The staging of emotions is shown to be directed by socialization agents (i.e., coach, trainer, teammates) who evoke rapidly shifting emotional expressions for each game day situation.
APA, Harvard, Vancouver, ISO, and other styles
44

Krokoszinski, Lars, and Daniela Hosser. "Emotion regulation during deception: an EEG study of imprisoned fraudsters." Journal of Criminal Psychology 6, no. 2 (May 3, 2016): 76–88. http://dx.doi.org/10.1108/jcp-02-2016-0005.

Full text
Abstract:
Purpose – The social interaction between a deceiver and the deceived opponent is a determining factor in deception and one that involves emotions. Hence, besides a great amount of cognitive control, a successful lie also requires the regulation of emotions, especially when deceiving somebody face-to-face. The purpose of this paper is to investigate emotion regulation processes in an interpersonal lying experiment and to examine whether fraudsters have well-functioning emotion regulation strategies or show a lack of emotional processing when deceiving face-to-face. Design/methodology/approach – Imprisoned fraudsters (n=11), imprisoned violent offenders (n=10) and non-offenders (n=11) spontaneously deceived an interrogator in a face-to-face situation while the deceivers’ EEG was recorded. Findings – The results showed that a decrease of alpha activity in the left dorsolateral prefrontal cortex (dlPFC) predicted a higher frequency of deceptive responses as well as less guilt about deceiving the interrogator. These findings suggest a pivotal role of the left dlPFC in emotion regulation during deception for fraudsters, violent offenders and non-offenders. Unlike violent offenders, fraudsters did not show differences in alpha activity of the dlPFC between truthful and deceptive responses, suggesting that fraudsters are better at regulating emotion while deceiving their opponents. Originality/value – This study emphasizes the recruitment of emotion regulation processes during deception. The results give a first insight into the emotional processes underlying deception in fraudsters.
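The key EEG measure here is alpha-band activity. As a generic sketch of how alpha power is commonly quantified for one channel (a Welch power spectrum integrated over roughly 8–13 Hz; the sampling rate and signal below are assumptions, not the authors' pipeline):

    # Generic alpha-band (~8-13 Hz) power estimate for a single EEG channel.
    # The signal is synthetic (10 Hz sine plus noise), not data from the study.
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid

    fs = 250.0                                 # assumed sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)               # 10 s of signal
    rng = np.random.default_rng(0)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    band = (freqs >= 8) & (freqs <= 13)
    alpha_power = trapezoid(psd[band], freqs[band])  # integrate PSD over the band
    print(f"alpha-band power: {alpha_power:.3f}")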
APA, Harvard, Vancouver, ISO, and other styles
45

Kelly, Karen J., and Janet Metcalfe. "Metacognition of emotional face recognition." Emotion 11, no. 4 (2011): 896–906. http://dx.doi.org/10.1037/a0023746.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Morris, Richard W., Cynthia Shannon Weickert, and Carmel M. Loughland. "Emotional face processing in schizophrenia." Current Opinion in Psychiatry 22, no. 2 (March 2009): 140–46. http://dx.doi.org/10.1097/yco.0b013e328324f895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Moody, Eric J., Catherine L. Reed, Tara Van Bommel, Betsy App, and Daniel N. McIntosh. "Emotional Mimicry Beyond the Face?" Social Psychological and Personality Science 9, no. 7 (August 25, 2017): 844–52. http://dx.doi.org/10.1177/1948550617726832.

Full text
Abstract:
Emotional mimicry—quick and spontaneous matching of another’s expressions—is a well-documented phenomenon that is associated with numerous social outcomes. Although the mechanisms underlying mimicry are not fully understood, there is growing awareness that it is more than a one-to-one motor matching of others’ expressions and may be the result of neural simulation. If true, it is possible that mimicry could extend to other parts of the body, even in the absence of visual information from that body part. Indeed, we found that passively viewing anger and fear expressions, without accompanying information from the body, voice or other channels, produced both facial mimicry and corresponding responses in arm muscles that make a fist or a defensive posture. This suggests that observers simulated observed expressions and that activity may have spilled over to other areas to create a body response.
APA, Harvard, Vancouver, ISO, and other styles
48

Åsberg Johnels, Jakob, Daniel Hovey, Nicole Zürcher, Loyse Hippolyte, Eric Lemonnier, Christopher Gillberg, and Nouchine Hadjikhani. "Autism and emotional face-viewing." Autism Research 10, no. 5 (November 28, 2016): 901–10. http://dx.doi.org/10.1002/aur.1730.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Skandrani-Marzouki, Inès, and Yousri Marzouki. "Subliminal Emotional Priming and Decision Making in a Simulated Hiring Situation." Swiss Journal of Psychology 69, no. 4 (January 2010): 213–19. http://dx.doi.org/10.1024/1421-0185/a000025.

Full text
Abstract:
The present study examines the unconscious influence of emotional information on decision making in a simulated hiring situation. We used a subliminal masked priming paradigm with varying faces as primes, which were presented for a duration of 50 ms and had two levels of emotion: positive emotion (happiness) and negative emotion (anger). These primes were followed by emotionally neutral target faces. Primes were congruent (same faces) or incongruent (different faces). Prime Emotion (positive vs. negative) was crossed with Prime Repetition (repeat vs. unrelated) in a 2 × 2 factorial design. Each participant was tested in all four of the experimental conditions, each of which had 5 different trials. The participants were asked to indicate as rapidly as possible whether they were “favorable” or “unfavorable” toward the selection of the candidate (target face). Two dependent measures were analyzed: number of target faces chosen (i.e., number of “favorable” responses to target faces) and reaction time (RT). Results revealed a strong effect of emotional priming. Participants tended to choose more target faces preceded by positive prime faces than by negative prime faces. Moreover, they reacted faster when presented with target faces preceded by negative primes. Despite its exploratory nature, this study provides further evidence for the role of emotional processing in modulating decision processes and extends the experimental manipulation of subliminal emotion to the case of the masked repetition priming technique.
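The 2 × 2 design crossing Prime Emotion with Prime Repetition, at five trials per cell, yields 20 trials per participant. A small sketch of such a trial list (condition labels are illustrative, not the authors'):

    # Build the 2 x 2 trial list: Prime Emotion x Prime Repetition, 5 trials/cell.
    from itertools import product
    import random

    conditions = list(product(["positive", "negative"],      # prime emotion
                              ["repeated", "unrelated"]))    # prime repetition
    trials = [{"prime_emotion": e, "prime_repetition": r, "rep": i + 1}
              for e, r in conditions for i in range(5)]
    random.Random(0).shuffle(trials)   # randomize presentation order
    print(len(trials))                 # 20 trials per participant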
APA, Harvard, Vancouver, ISO, and other styles
50

Sanscartier, Shayne, Jessica A. Maxwell, and Penelope Lockwood. "No effect of attachment avoidance on visual disengagement from a romantic partner’s face." Journal of Social and Personal Relationships 37, no. 7 (April 27, 2020): 2166–83. http://dx.doi.org/10.1177/0265407520919991.

Full text
Abstract:
Attachment avoidance (discomfort with closeness and intimacy) has been inconsistently linked to visual disengagement from emotional faces, with some studies finding faster disengagement from specific emotional faces and others finding no effect. Although most studies use strangers’ faces as stimuli, attachment effects are likely to be most pronounced in the context of attachment relationships. The present study (N = 92) combined ecologically valid stimuli (i.e., pictures of the romantic partner’s face) with eye-tracking methods to test more precisely whether highly avoidant individuals are faster at disengaging from emotional faces. Unexpectedly, attachment avoidance had no effect on saccadic reaction time, regardless of face type or emotion. Instead, all participants took longer to disengage from romantic partner faces than from strangers’ faces, although this effect should be replicated in future work. Our results suggest that romantic attachments capture visual attention at an oculomotor level, regardless of one’s personal attachment orientations.
APA, Harvard, Vancouver, ISO, and other styles