To see the other types of publications on this topic, follow the link: Emotional Expression.

Journal articles on the topic 'Emotional Expression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Emotional Expression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Abdellaoui, Benyoussef, Aniss Moumen, Younes El Bouzekri El Idrissi, and Ahmed Remaida. "The emotional state through visual expression, auditory expression and physiological representation." SHS Web of Conferences 119 (2021): 05008. http://dx.doi.org/10.1051/shsconf/202111905008.

Full text
Abstract:
As emotional content reflects human behaviour, automatic emotion recognition is a topic of growing interest. When an emotional message is communicated, physiological signals and facial expressions offer several advantages: they can help us better understand a person's personality and psychopathology and shape human communication and human-machine interaction. In this article, we present some notions about identifying the emotional state through visual expression, auditory expression and physiological representation, as well as the techniques used to measure emotions.
APA, Harvard, Vancouver, ISO, and other styles
2

Milenkovic, Ana. "The conceptualisation of primary emotions in the Serbian language (The case of verbs expressing joy, sadness, fear and anger)." Juznoslovenski filolog 77, no. 1 (2021): 163–85. http://dx.doi.org/10.2298/jfi2101163m.

Full text
Abstract:
The paper analyses the conceptual mechanisms underlying the development of secondary emotional meanings of 'non-emotional' verbs (in relation to their primary meaning). Being abstract, psychological entities, emotions are formalised and expressed by linguistic means using emotional lexis. Emotional verbs represent a type of this lexis: they denote emotions, emotional relationships and processes, emotional expression and an emotional situation as a whole. The research material consists of 92 verbs which are classified according to two criteria: a. the semantic role of the experiencer, i.e. whether the verbs denote experiencing or provoking an emotion (emotionally-active and emotionally-passive verbs) and b. the criterion of the primary emotion, i.e. whether the verbs belong to the emotional domain of joy, sorrow, fear or anger. The analysis showed that emotions are conceptualised by specific emotional metaphors, based on the pleasure/discomfort distinction. The primary metaphor MAN IS THE CONTAINER FOR EMOTIONS and the general metonymic rule PHYSIOLOGICAL MANIFESTATIONS OF EMOTIONS ARE THE EMOTION ITSELF represent general mechanisms for the conceptualisation of secondary emotional meanings of verbs. It has also been shown that a certain type of a verb's primary meaning potentially develops a certain secondary emotional meaning; in other words, each primary emotion has an intrinsic source domain which concretises its abstract meanings.
APA, Harvard, Vancouver, ISO, and other styles
3

Föcker, Julia, and Brigitte Röder. "Event-Related Potentials Reveal Evidence for Late Integration of Emotional Prosody and Facial Expression in Dynamic Stimuli: An ERP Study." Multisensory Research 32, no. 6 (2019): 473–97. http://dx.doi.org/10.1163/22134808-20191332.

Full text
Abstract:
The aim of the present study was to test whether multisensory interactions of emotional signals are modulated by intermodal attention and emotional valence. Faces, voices and bimodal emotionally congruent or incongruent face–voice pairs were randomly presented. The EEG was recorded while participants were instructed to detect sad emotional expressions in either faces or voices while ignoring all stimuli with another emotional expression and sad stimuli of the task-irrelevant modality. Participants processed congruent sad face–voice pairs more efficiently than sad stimuli paired with an incongruent emotion, and performance was higher in congruent bimodal compared to unimodal trials, irrespective of which modality was task-relevant. Event-related potentials (ERPs) to congruent emotional face–voice pairs started to differ from ERPs to incongruent emotional face–voice pairs at 180 ms after stimulus onset: irrespective of which modality was task-relevant, ERPs revealed a more pronounced positivity (180 ms post-stimulus) to emotionally congruent trials compared to emotionally incongruent trials if the angry emotion was presented in the attended modality. A larger negativity to incongruent compared to congruent trials was observed in the time range of 400–550 ms (N400) for all emotions (happy, neutral, angry), irrespective of whether faces or voices were task-relevant. These results suggest an automatic interaction of emotion-related information.
APA, Harvard, Vancouver, ISO, and other styles
4

Sato, Wataru, and Sakiko Yoshikawa. "Anti-expressions: Artificial control stimuli for the visual properties of emotional facial expressions." Social Behavior and Personality: an international journal 37, no. 4 (May 1, 2009): 491–501. http://dx.doi.org/10.2224/sbp.2009.37.4.491.

Full text
Abstract:
Perceptual/cognitive processing of emotional facial expressions is more effective than that of neutral facial expressions. To investigate whether this effectiveness can be attributed to the expression of emotion or to the visual properties of the facial expressions, we used computer morphing to develop a form of control stimuli. These "anti-expressions" changed the features in emotional facial expressions in the opposite direction from neutral expressions by amounts equivalent to the differences between emotional and neutral expressions. To examine if anti-expressions are usable as emotionally neutral faces, 35 participants were asked to categorize and rate the valence and arousal dimensions of six basic emotions for normal and anti-expressions. The results indicate that anti-expressions were assessed as neutral for anger, disgust, fear, and happiness, and that they can be used as control stimuli for the visual properties of emotional facial expressions.
APA, Harvard, Vancouver, ISO, and other styles
5

Malighetti, Clelia, Simona Sciara, Alice Chirico, and Giuseppe Riva. "Emotional Expression of #body on Instagram." Social Media + Society 6, no. 2 (April 2020): 205630512092477. http://dx.doi.org/10.1177/2056305120924771.

Full text
Abstract:
Our aim was to explore emotions in Instagram images marked with hashtags referring to body image–related components using an artificial intelligence–based discrete emotional analysis. A total of 500 Instagram photos marked by specific hashtags related to body image components were analyzed, and the specific discrete emotions expressed in each picture were detected using the Emotion application programming interface (API) from Microsoft Azure Cognitive Services. Results showed that happiness and neutrality were the most intense and recognizable emotions expressed in all images. Happiness intensity was significantly higher in images with #bodyimage and #bodyconfidence, and higher levels of neutral emotion were found in images tagged with #body, #bodyfitness, and #thininspirational. This study integrated a discrete emotional model with the conventional dimensional one and offered a higher degree of granularity in the analysis of the emotion–body link on Instagram through artificial intelligence technology. Future research should deepen the use of discrete emotions on Instagram and the role of neutrality in body image representation.
APA, Harvard, Vancouver, ISO, and other styles
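As a rough illustration of the per-image emotion scoring described in the abstract above, the Python sketch below sends an image to an Azure Cognitive Services endpoint and reads back per-face emotion scores. The endpoint URL, query parameter and response layout follow the classic Face API detect call and are assumptions; the study itself used Azure's Emotion API (since retired), whose exact interface may have differed, and the file name and key are placeholders.

```python
# Hedged sketch of scoring discrete emotions in an image via an Azure
# Cognitive Services REST call. Endpoint, parameters and response layout
# follow the classic Face API detect call and are assumptions, not the
# study's exact (retired) Emotion API interface.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/face/v1.0/detect"
KEY = "<subscription-key>"  # placeholder

def detect_emotions(image_bytes):
    """Send an image and return per-face emotion scores (happiness, neutral, ...)."""
    response = requests.post(
        ENDPOINT,
        params={"returnFaceAttributes": "emotion"},
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/octet-stream"},
        data=image_bytes,
    )
    response.raise_for_status()
    return [face["faceAttributes"]["emotion"] for face in response.json()]

# Example: report the most intense emotion for each detected face.
with open("instagram_photo.jpg", "rb") as f:  # hypothetical file name
    for scores in detect_emotions(f.read()):
        print(max(scores, key=scores.get), scores)
```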
6

Trinh Van, Loan, Thuy Dao Thi Le, Thanh Le Xuan, and Eric Castelli. "Emotional Speech Recognition Using Deep Neural Networks." Sensors 22, no. 4 (February 12, 2022): 1414. http://dx.doi.org/10.3390/s22041414.

Full text
Abstract:
The expression of emotions in human communication plays a very important role in the information conveyed to the partner. The forms of human emotional expression are very rich: body language, facial expressions, eye contact, laughter, and tone of voice. The world's languages differ, but even without understanding a language, people can still understand part of the message that the other partner wants to convey through such emotional expressions. Among the forms of human emotional expression, the expression of emotions through the voice is perhaps the most studied. This article presents our research on speech emotion recognition using deep neural networks such as CNN, CRNN, and GRU. We used the Interactive Emotional Dyadic Motion Capture (IEMOCAP) corpus for the study with four emotions: anger, happiness, sadness, and neutrality. The feature parameters used for recognition include the Mel spectral coefficients and other parameters related to the spectrum and the intensity of the speech signal. Data augmentation was performed by changing the voice and adding white noise. The results show that the GRU model gave the highest average recognition accuracy of 97.47%. This result is superior to existing studies on speech emotion recognition with the IEMOCAP corpus.
APA, Harvard, Vancouver, ISO, and other styles
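For readers who want a concrete starting point, the sketch below shows one way to pair log-Mel features with a GRU classifier for the four emotions named in the abstract above. It assumes librosa and PyTorch and is only illustrative; it is not the authors' IEMOCAP pipeline, and the file name, layer sizes and other hyperparameters are placeholders.

```python
# Minimal sketch of a GRU-based speech emotion classifier, assuming librosa
# for log-Mel features and PyTorch for the model. Illustrative only; not the
# authors' exact IEMOCAP pipeline.
import librosa
import torch
import torch.nn as nn

def mel_features(wav_path, sr=16000, n_mels=64):
    """Load a waveform and return a (time, n_mels) log-Mel spectrogram tensor."""
    y, sr = librosa.load(wav_path, sr=sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    log_mel = librosa.power_to_db(mel)                    # (n_mels, time)
    return torch.tensor(log_mel.T, dtype=torch.float32)   # (time, n_mels)

class GRUEmotionClassifier(nn.Module):
    """GRU over frame-level features; a linear layer over the last hidden
    state predicts one of four emotions."""
    def __init__(self, n_mels=64, hidden=128, n_classes=4):
        super().__init__()
        self.gru = nn.GRU(n_mels, hidden, num_layers=2, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):          # x: (batch, time, n_mels)
        _, h = self.gru(x)         # h: (layers, batch, hidden)
        return self.fc(h[-1])      # logits: (batch, n_classes)

# Example: classify a single utterance (weights untrained here).
model = GRUEmotionClassifier()
feats = mel_features("utterance.wav").unsqueeze(0)  # add batch dimension
logits = model(feats)
emotions = ["anger", "happiness", "sadness", "neutrality"]
print(emotions[int(logits.argmax(dim=1))])
```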
7

Bloom, Lois. "Language Development and Emotional Expression." Pediatrics 102, Supplement_E1 (November 1, 1998): 1272–77. http://dx.doi.org/10.1542/peds.102.se1.1272.

Full text
Abstract:
The relation of language and emotion in development is most often thought about in terms of how language describes emotional experiences with words that name different feelings. However, children typically do not begin to use these words until language development is well underway, at approximately 2 years of age. Given the relatively small number of words for naming feelings and emotions, and the redundancy between emotion words and the expressions they name, understanding how emotion and language are related in early development requires looking beyond just acquisition of specific emotion words.
APA, Harvard, Vancouver, ISO, and other styles
8

Anderson, Adam K., and Elizabeth A. Phelps. "Expression Without Recognition: Contributions of the Human Amygdala to Emotional Communication." Psychological Science 11, no. 2 (March 2000): 106–11. http://dx.doi.org/10.1111/1467-9280.00224.

Full text
Abstract:
A growing body of evidence from humans and other animals suggests the amygdala may be a critical neural substrate for emotional processing. In particular, recent studies have shown that damage to the human amygdala impairs the normal appraisal of social signals of emotion, primarily those of fear. However, effective social communication depends on both the ability to receive (emotional appraisal) and the ability to send (emotional expression) signals of emotional state. Although the role of the amygdala in the appraisal of emotion is well established, its importance for the production of emotional expressions is unknown. We report a case study of a patient with bilateral amygdaloid damage who, despite a severe deficit in interpreting facial expressions of emotion including fear, exhibits an intact ability to express this and other basic emotions. This dissociation suggests that a single neural module does not support all aspects of the social communication of emotional state.
APA, Harvard, Vancouver, ISO, and other styles
9

Zecca, M., T. Chaminade, M. A. Umilta, K. Itoh, M. Saito, N. Endo, Y. Mizoguchi, et al. "2A1-O10 Emotional Expression Humanoid Robot WE-4RII: Evaluation of the perception of facial emotional expressions by using fMRI." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2007 (2007): _2A1-O10_1-_2A1-O10_4. http://dx.doi.org/10.1299/jsmermd.2007._2a1-o10_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Khrystenko, Oksana. "Implicit displays of emotional vulnerability: A cross-cultural analysis of “unacceptable” embarrassment-related emotions in the communication within male groups." Open Linguistics 8, no. 1 (January 1, 2022): 209–31. http://dx.doi.org/10.1515/opli-2022-0189.

Full text
Abstract:
One of the previously widespread sociolinguistic theories about gender differences concerned differences in the expression of emotion. Women's language was stereotypically associated with emotional expressivity, whereas male language was connected to a lack of affect and a display of toughness (cf. Eckert, Penelope, and Sally McConnell-Ginet 1992). With regard to gender differences in the expression of emotion, in this article I provide a brief overview of the existing research findings pertaining to males' expressions of emotion, followed by an examination of embarrassment-related expressions of emotions and the role of paralinguistic cues in this respect. To accomplish this, I adopt a contrastive focus based on an analysis of male talk in Ukraine and Austria that will enable the identification of likely differences and similarities in expressing emotional vulnerability.
APA, Harvard, Vancouver, ISO, and other styles
11

Jin, Xiaoyun. "Analysis of Emotional Color Representation in Oil Painting Based on Deep Learning Model Evaluation." Wireless Communications and Mobile Computing 2022 (July 14, 2022): 1–11. http://dx.doi.org/10.1155/2022/6238930.

Full text
Abstract:
When an artist creates an oil painting, it is rich in emotion, and color is the main way to express that emotion. In the art field, color is an objective phenomenon whose unique richness and brilliance people can genuinely feel, and different people will have different emotions when facing the same picture. Color itself is not emotional, but people have different psychological feelings when looking at different colors; this is the power of color. In the same way, color is a carrier for painters to convey emotions in oil paintings, and the study of emotional color expression in oil painting helps people understand the emotion an oil painting conveys. Color has an important role and an irreplaceable position in oil painting creation, where color expression is combined with emotional expression: the creator conveys experience and feeling in the form of color. In this paper, we look at the relationship between color and emotion, the emotional expression of color, and the expression of emotion in painting. The creators of oil paintings convey their feelings and experiences in the form of color, which plays a unique and important role in portraying the image of their works, expressing emotions, and creating an atmosphere; we analyze and reveal how paint can be used to express emotions in oil painting.
APA, Harvard, Vancouver, ISO, and other styles
12

Yagi, Satoshi, Yoshihiro Nakata, Yutaka Nakamura, and Hiroshi Ishiguro. "Can an android’s posture and movement discriminate against the ambiguous emotion perceived from its facial expressions?" PLOS ONE 16, no. 8 (August 10, 2021): e0254905. http://dx.doi.org/10.1371/journal.pone.0254905.

Full text
Abstract:
Expressing emotions through various modalities is a crucial function not only for humans but also for robots. The mapping method from facial expressions to the basic emotions is widely used in research on robot emotional expressions. This method claims that there are specific facial muscle activation patterns for each emotional expression and that people can perceive these emotions by reading these patterns. However, recent research on human behavior reveals that some emotional expressions, such as the emotion "intense", are difficult to judge as positive or negative by looking at the facial expression alone. Nevertheless, it has not been investigated whether robots can also express ambiguous facial expressions with no clear valence and whether the addition of body expressions can make the facial valence clearer to humans. This paper shows that an ambiguous facial expression of an android can be perceived more clearly by viewers when body postures and movements are added. We conducted three experiments and online surveys among North American residents with 94, 114 and 114 participants, respectively. In Experiment 1, by calculating the entropy, we found that the facial expression "intense" was difficult to judge as positive or negative when participants were only shown the facial expression. In Experiments 2 and 3, using ANOVA, we confirmed that participants were better at judging the facial valence when they were shown the whole body of the android, even though the facial expression was the same as in Experiment 1. These results suggest that facial and body expressions by robots should be designed jointly to achieve better communication with humans. In order to achieve smoother cooperative human-robot interaction, such as education by robots, emotion expressions conveyed through a combination of both the face and the body of the robot are necessary to convey the robot's intentions or desires to humans.
APA, Harvard, Vancouver, ISO, and other styles
13

Isomura, Tomoko, and Tamami Nakano. "Automatic facial mimicry in response to dynamic emotional stimuli in five-month-old infants." Proceedings of the Royal Society B: Biological Sciences 283, no. 1844 (December 14, 2016): 20161948. http://dx.doi.org/10.1098/rspb.2016.1948.

Full text
Abstract:
Human adults automatically mimic others' emotional expressions, which is believed to contribute to sharing emotions with others. Although this behaviour appears fundamental to social reciprocity, little is known about its developmental process. Therefore, we examined whether infants show automatic facial mimicry in response to others' emotional expressions. Facial electromyographic activity over the corrugator supercilii (brow) and zygomaticus major (cheek) of four- to five-month-old infants was measured while they viewed dynamic clips presenting audiovisual, visual and auditory emotions. The audiovisual bimodal emotion stimuli were a display of a laughing/crying facial expression with an emotionally congruent vocalization, whereas the visual/auditory unimodal emotion stimuli displayed those emotional faces/vocalizations paired with a neutral vocalization/face, respectively. Increased activation of the corrugator supercilii muscle in response to audiovisual cries and the zygomaticus major in response to audiovisual laughter were observed between 500 and 1000 ms after stimulus onset, which clearly suggests rapid facial mimicry. By contrast, both visual and auditory unimodal emotion stimuli did not activate the infants' corresponding muscles. These results revealed that automatic facial mimicry is present as early as five months of age, when multimodal emotional information is present.
APA, Harvard, Vancouver, ISO, and other styles
14

Hübner, Amelie M., Ima Trempler, Corinna Gietmann, and Ricarda I. Schubotz. "Interoceptive sensibility predicts the ability to infer others’ emotional states." PLOS ONE 16, no. 10 (October 6, 2021): e0258089. http://dx.doi.org/10.1371/journal.pone.0258089.

Full text
Abstract:
Emotional sensations and inferring another’s emotional states have been suggested to depend on predictive models of the causes of bodily sensations, so-called interoceptive inferences. In this framework, higher sensibility for interoceptive changes (IS) reflects higher precision of interoceptive signals. The present study examined the link between IS and emotion recognition, testing whether individuals with higher IS recognize others’ emotions more easily and are more sensitive to learn from biased probabilities of emotional expressions. We recorded skin conductance responses (SCRs) from forty-six healthy volunteers performing a speeded-response task, which required them to indicate whether a neutral facial expression dynamically turned into a happy or fearful expression. Moreover, varying probabilities of emotional expressions by their block-wise base rate aimed to generate a bias for the more frequently encountered emotion. As a result, we found that individuals with higher IS showed lower thresholds for emotion recognition, reflected in decreased reaction times for emotional expressions especially of high intensity. Moreover, individuals with increased IS benefited more from a biased probability of an emotion, reflected in decreased reaction times for expected emotions. Lastly, weak evidence supporting a differential modulation of SCR by IS as a function of varying probabilities was found. Our results indicate that higher interoceptive sensibility facilitates the recognition of emotional changes and is accompanied by a more precise adaptation to emotion probabilities.
APA, Harvard, Vancouver, ISO, and other styles
15

Kim, YoungSik, and YongWon Suh. "The effect of organizational member's allocentrism-idiocentrism on the emotion expression and favoritism." Korean Journal of Industrial and Organizational Psychology 28, no. 4 (November 30, 2015): 689–722. http://dx.doi.org/10.24230/kjiop.v28i4.689-722.

Full text
Abstract:
In this article, three studies were performed to investigate differences in the tendency to regulate emotion expression according to organizational members' cultural dispositions. Study 1 tested four hypotheses. First, allocentrics will show a higher level of emotion suppression than idiocentrics. Second, allocentrics will hold more negative attitudes toward emotion expression than idiocentrics. Third, the relation between allocentrism and emotion suppression will be mediated by negative attitudes toward emotional expression. Finally, individuals who express emotions freely will be evaluated more negatively by allocentrics than by idiocentrics. For this study, data were collected from 196 employees by survey questionnaire. Study 1 found that allocentrics show a higher level of emotional suppression and more negative attitudes toward emotional expression than idiocentrics, and that the relation between allocentrism and emotional suppression is mediated by negative attitudes toward emotional expression; hypothesis 4, however, was not supported. In Study 2, we ran an experiment with positive and negative conditions to examine differences in emotion regulation between allocentrics and idiocentrics. The results show that allocentrics and idiocentrics do not differ in the positive condition; in the negative condition, however, allocentrics suppress their emotions more than idiocentrics. Study 3 re-examined the fourth hypothesis of Study 1 by taking emotion type into account: in socially engaged conditions allocentrics were more favorable than idiocentrics, whereas in socially disengaged conditions allocentrics favored anger-suppressing individuals more than idiocentrics did. Finally, the implications and limitations of these results are discussed.
APA, Harvard, Vancouver, ISO, and other styles
16

Mandal, Manas K., and Nalini Ambady. "Laterality of Facial Expressions of Emotion: Universal and Culture-Specific Influences." Behavioural Neurology 15, no. 1-2 (2004): 23–34. http://dx.doi.org/10.1155/2004/786529.

Full text
Abstract:
Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.
APA, Harvard, Vancouver, ISO, and other styles
17

Yuen, Shannon, Boya Li, Yung-Ting Tsou, Qi Meng, Liyan Wang, Wei Liang, and Carolien Rieffe. "Family Systems and Emotional Functioning in Deaf or Hard-of-Hearing Preschool Children." Journal of Deaf Studies and Deaf Education 27, no. 2 (January 31, 2022): 125–36. http://dx.doi.org/10.1093/deafed/enab044.

Full text
Abstract:
This study examined how deaf or hard-of-hearing (DHH) and typically hearing (TH) children may differ in their family system and emotional functioning, and examined the relations between the family system and children's emotional functioning. Parents of 106 DHH and 99 TH children (2–6 years) reported on family cohesion and adaptability, parental emotion communication, and their child's emotional functioning. The DHH children were rated lower on family cohesion and positive emotion expression than the TH children. Higher levels of family cohesion related to more positive emotion expression in TH children but not in DHH children. For all children, higher levels of family cohesion related to fewer negative emotion expressions, and more parental emotion communication related to more negative emotion expression. The results emphasize the importance of sharing leisure activities and open communication within the family, which can support DHH and TH children's experience of emotions and their expressions of them.
APA, Harvard, Vancouver, ISO, and other styles
18

Li, Yi, and Minoru Hashimoto. "Emotional Synchronization-Based Human–Robot Communication and Its Effects." International Journal of Humanoid Robotics 10, no. 01 (March 2013): 1350014. http://dx.doi.org/10.1142/s021984361350014x.

Full text
Abstract:
This paper presents a natural and comfortable communication system between human and robot based on synchronization to human emotional state using human facial expression recognition. The system consists of three parts: human emotion recognition, robotic emotion generation, and robotic emotion expression. The robot recognizes human emotion through human facial expressions, and robotic emotion is generated and synchronized with human emotion dynamically using a vector field of dynamics. The robot makes dynamically varying facial expressions to express its own emotions to the human. A communication experiment was conducted to examine the effectiveness of the proposed system. The authors found that subjects became much more comfortable after communicating with the robot with synchronized emotions. Subjects felt somewhat uncomfortable after communicating with the robot with non-synchronized emotions. During emotional synchronization, subjects communicated much more with the robot, and the communication time was double that during non-synchronization. Furthermore, in the case of emotional synchronization, subjects had good impressions of the robot, much better than the impressions in the case of non-synchronization. It was confirmed in this study that emotional synchronization in human–robot communication can be effective in making humans comfortable and makes the robot much more favorable and acceptable to humans.
APA, Harvard, Vancouver, ISO, and other styles
19

López-Gil, Juan-Miguel, Rosa Gil, and Roberto García. "Do Deepfakes Adequately Display Emotions? A Study on Deepfake Facial Emotion Expression." Computational Intelligence and Neuroscience 2022 (October 18, 2022): 1–12. http://dx.doi.org/10.1155/2022/1332122.

Full text
Abstract:
Recent technological advancements in Artificial Intelligence make it easy to create deepfakes and hyper-realistic videos, in which images and video clips are processed to create fake videos that appear authentic. Many of them are based on swapping faces without the consent of the person whose appearance and voice are used. As emotions are inherent in human communication, studying how deepfakes transfer emotional expressions from the original to the fake is relevant. In this work, we conduct an in-depth study on facial emotional expression in deepfakes using a well-known face-swap-based deepfake database. First, we extracted the frames from the videos. Then, we analyzed the emotional expression in the original and faked versions of the video recordings for all performers in the database. Results show that emotional expressions are not adequately transferred between original recordings and the deepfakes created from them. The high variability in emotions and performers detected between original and fake recordings indicates that performer emotion expressiveness should be considered for better deepfake generation or detection.
APA, Harvard, Vancouver, ISO, and other styles
20

Caruana, Fausto, and Vittorio Gallese. "Overcoming the emotion experience/expression dichotomy." Behavioral and Brain Sciences 35, no. 3 (May 23, 2012): 145–46. http://dx.doi.org/10.1017/s0140525x11001476.

Full text
Abstract:
We challenge the classic experience/expression dichotomous account of emotions, according to which experiencing and expressing an emotion are two independent processes. By endorsing Dewey's and Mead's accounts of emotions, and capitalizing upon recent empirical findings, we propose that expression is part of the emotional experience. This proposal partly challenges the purely constructivist approach endorsed by the authors of the target article.
APA, Harvard, Vancouver, ISO, and other styles
21

James, Nicky. "Emotional Labour: Skill and Work in the Social Regulation of Feelings." Sociological Review 37, no. 1 (February 1989): 15–42. http://dx.doi.org/10.1111/j.1467-954x.1989.tb00019.x.

Full text
Abstract:
I define emotional labour as the labour involved in dealing with other people's feelings, a core component of which is the regulation of emotions. The aims of the paper are, firstly, to suggest that the expression of feelings is a central problem of capital and paid work and, secondly, to highlight the contradictions of emotions at work. To begin with, I argue that ‘emotion’ is a subject area fitting for inclusion in academic discussion, and that the expression of emotions is regulated by a form of labour. In the section ‘Emotion at home’ I suggest that emotional labour is used to lay the foundations of a social expression of emotion in the privacy of the domestic domain. However, the forms emotional labour takes and the skills it involves leave women subordinated as unskilled and stigmatised as emotional. In the section ‘Emotion at work’ I argue that emotional labour is also a commodity. Though it may remain invisible or poorly paid, emotional labour facilitates and regulates the expression of emotion in the public domain. Studies of home and the workplace are used to begin the process of recording the work carried out in managing emotions and drawing attention to its significance in the social reproduction of labour power and the social relations of production.
APA, Harvard, Vancouver, ISO, and other styles
22

Surcinelli, Paola, Bruno Baldaro, Antonio Balsamo, Roberto Bolzani, Monia Gennari, and Nicolino C. F. Rossi. "Emotion Recognition and Expression in Young Obese Participants: Preliminary Study." Perceptual and Motor Skills 105, no. 2 (October 2007): 477–82. http://dx.doi.org/10.2466/pms.105.2.477-482.

Full text
Abstract:
This study of the presence of alexithymic characteristics in obese adolescents and preadolescents tested whether they showed impaired recognition and expression of emotion. The sample included 30 obese young participants and a control group of 30 participants of normal weight for their ages. Stimuli, 42 faces representing seven emotional expressions, were shown to participants, who identified the emotion expressed in each face. The Levels of Emotional Awareness Scale was adapted for children to evaluate their ability to describe their emotions. Young obese participants had significantly lower scores than control participants, but no differences were found in the recognition of emotion. The lack of words to describe emotions might suggest a greater prevalence of alexithymic characteristics in the obese participants, but the hypothesis of a general deficit in the processing of emotional experiences was not supported.
APA, Harvard, Vancouver, ISO, and other styles
23

Riberto, Martina, Deborah Talmi, and Gorana Pobric. "Symmetry in Emotional and Visual Similarity between Neutral and Negative Faces." Symmetry 13, no. 11 (November 4, 2021): 2091. http://dx.doi.org/10.3390/sym13112091.

Full text
Abstract:
Is Mr. Hyde more similar to his alter ego Dr. Jekyll, because of their physical identity, or to Jack the Ripper, because both evoke fear and loathing? The relative weight of emotional and visual dimensions in similarity judgements is still unclear. We expected an asymmetric effect of these dimensions on similarity perception, such that faces that express the same or similar feelings are judged as more similar than different emotional expressions of the same person. We selected 10 male faces with different expressions. Each face posed one neutral expression and one emotional expression (five disgust, five fear). We paired these expressions, resulting in 190 pairs, varying either in emotional expression, physical identity, or both. Twenty healthy participants rated the similarity of paired faces on a 7-point scale. We report a symmetric effect of emotional expression and identity on similarity judgements, suggesting that people may perceive Mr. Hyde to be just as similar to Dr. Jekyll (identity) as to Jack the Ripper (emotion). We also observed that emotional mismatch decreased perceived similarity, suggesting that emotions play a prominent role in similarity judgements. From an evolutionary perspective, poor discrimination between emotional stimuli might endanger the individual.
APA, Harvard, Vancouver, ISO, and other styles
24

Buluk, Katarzyna, and Celina Timoszyk-Tomczak. "„Co wyraża twarz?” – rozpoznawanie ekspresji emocjonalnej twarzy przez osoby głuche i słyszące." Psychologia Rozwojowa 25, no. 4 (2020): 101–10. http://dx.doi.org/10.4467/20843879pr.20.030.13438.

Full text
Abstract:
„What does the Face Express?” – Recognition of Emotional Facial Expressions in Deaf and Hearing People
An analysis of the emotional functioning of deaf people is important for understanding their activities in different areas of life. Emotional functioning is related to emotional intelligence, which involves emotion perception and recognition as well as emotional expressiveness. The aim of the study was to compare the ability to recognize facial emotional expression among deaf and hearing people. The present study was conducted on 80 individuals (40 deaf people and 40 hearing people). The Emotional Intelligence Scale – Faces (Matczak, Piekarska, Studniarek, 2005) and a set of photographs used by Paul Ekman in his study of basic emotions were used for the data collection. The results obtained show that deaf people differ from hearing people in recognizing facial expressions. The analysis was conducted in terms of differences in the recognition of expressions of basic and complex emotions. The study included variables such as the moment of hearing loss (congenital or acquired deafness) and upbringing with deaf or hearing parents.
APA, Harvard, Vancouver, ISO, and other styles
25

Wilson, Glenn. "Emotional expression." Personality and Individual Differences 8, no. 2 (January 1987): 292. http://dx.doi.org/10.1016/0191-8869(87)90196-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Matsuda, Yasuhiro, Ichiro Sakuma, Yasuhiko Jimbo, Etsuko Kobayashi, Tatsuhiko Arafune, and Tsuneshi Isomura. "Emotional Communication in Finger Braille." Advances in Human-Computer Interaction 2010 (2010): 1–23. http://dx.doi.org/10.1155/2010/830759.

Full text
Abstract:
We describe analyses of the features of emotions (neutral, joy, sadness, and anger) expressed by Finger Braille interpreters and subsequently examine the effectiveness of emotional expression and emotional communication between people unskilled in Finger Braille. The goal is to develop a Finger Braille system to teach emotional expression and a system to recognize emotion. The results indicate the following features of emotional expression by interpreters. The durations of the code of joy were significantly shorter than the durations of the other emotions, the durations of the code of sadness were significantly longer, and the finger loads of anger were significantly larger. The features of emotional expression by unskilled subjects were very similar to those of the interpreters, and the coincidence ratio of emotional communication was 75.1%. Therefore, it was confirmed that people unskilled in Finger Braille can express and communicate emotions using this communication medium.
APA, Harvard, Vancouver, ISO, and other styles
27

Castellano, Ginevra, Marcello Mortillaro, Antonio Camurri, Gualtiero Volpe, and Klaus Scherer. "Automated Analysis of Body Movement in Emotionally Expressive Piano Performances." Music Perception 26, no. 2 (December 1, 2008): 103–19. http://dx.doi.org/10.1525/mp.2008.26.2.103.

Full text
Abstract:
Emotional expression in music performance includes important cues arising from the body movement of the musician. This movement is related both to the execution of the musical score and to the emotional intention conveyed. In this experiment, a pianist was asked to play the same excerpt with different emotionally expressive intentions. The aim was to verify whether different expressions could be distinguished based on movement, by trying to determine which motion cues were most emotion-sensitive. Analyses were performed via an automated system capable of detecting the temporal profiles of two motion cues: the quantity of motion of the upper body and the velocity of head movements. Results showed that both were sensitive to emotional expression, especially the velocity of head movements. Further, some features conveying information about movement temporal dynamics varied among expressive conditions, allowing emotion discrimination. These results are in line with recent theories that underline the dynamic nature of emotional expression.
APA, Harvard, Vancouver, ISO, and other styles
28

Ra, Eunhee, Sangwoon Kim, and Sukyoung Yun. "The Effects of Recall and Sensory Stimulation Horticultural Therapy on Emotion in Terminal Cancer Patients." Korean Society of Culture and Convergence 44, no. 5 (May 31, 2022): 833–46. http://dx.doi.org/10.33645/cnc.2022.5.44.5.833.

Full text
Abstract:
Based on the themes and stories of terminal cancer patients, this study observed the various emotional expressions of 120 terminal cancer patients (60 men and 60 women) during recall and sensory stimulation horticultural therapy, delivered one to one for 10 to 15 minutes once a week at D hospital in D city from July 12, 2017 to December 27, 2019. As emotional expression, the ratio of positive and negative emotions was investigated. The proportions of pleasure (32.5%) and fun (28.3%) were high among static emotions. In emotional expression by horticultural material, the rates were 30.0% for cut flowers in static emotional pleasure, 27.5% for differentiation, and 40.0% for processification. Helplessness accounted for 26.7% of negative emotional expressions for both males and females, and horticultural therapy may be a vital force against the boredom that comes from living in a hospital and fighting a disease. Because the study was conducted in a single hospice hospital, the results are insufficient for generalization. Various attempts to change emotion and physiology based on the observed emotional responses are thought to be needed. Given that both men and women showed considerable helplessness among the negative emotions, studying how the activity of the sympathetic nervous system works on vitality may help patients live a more lively and happy life. In this study, the ratio of static emotional expression was high, so the findings are expected to serve as basic data on how the expression of static emotion works on the activity of the parasympathetic nervous system.
APA, Harvard, Vancouver, ISO, and other styles
29

Kulkarni, Praveen, and Rajesh T. M. "Analysis on techniques used to recognize and identifying the Human emotions." International Journal of Electrical and Computer Engineering (IJECE) 10, no. 3 (June 1, 2020): 3307. http://dx.doi.org/10.11591/ijece.v10i3.pp3307-3314.

Full text
Abstract:
Facial expression is a major channel of non-verbal language in day-to-day communication. Statistical analyses suggest that only 7 percent of a message is conveyed through verbal communication, while 55 percent is transmitted by facial expression. Emotional expression has been a research subject of physiology since Darwin's work on emotional expression in the 19th century. According to psychological theory, human emotion is classified into six major emotions: happiness, fear, anger, surprise, disgust, and sadness. Facial expressions, which involve the emotions and the nature of speech, play a foremost role in expressing these emotions. Researchers later developed a system based on the anatomy of the face, the Facial Action Coding System (FACS), in 1970. Ever since the development of FACS, there has been rapid progress in research in the domain of emotion recognition. This work is intended to give a thorough comparative analysis of the various techniques and methods that have been applied to recognize and identify human emotions. The results of this analysis will help to identify proper and suitable techniques, algorithms and methodologies for future research directions. In this paper, an extensive analysis of the various recognition techniques used to address the complexity of recognizing facial expressions is presented. This work will also help researchers and scholars choose appropriate techniques in the facial expression identification domain.
APA, Harvard, Vancouver, ISO, and other styles
30

Trkulja, Kyla. "Integrating Cultural Expression with Universal Emotions: How Cultural Differences in Expression Do Not Refute the Universal Hypothesis." Journal of Undergraduate Life Sciences 15, no. 1 (July 16, 2021): 7. http://dx.doi.org/10.33137/juls.v15i1.36955.

Full text
Abstract:
The universal hypothesis of emotions argues that, due to the functionality that emotions and their behavioural components provide, they show similar patterns across all cultures. Though there is substantial evidence supporting this theory, there are several cases where emotional expression does differ between cultures. This paper argues that such differences in expression are not necessarily evidence against the universal hypothesis, as they are not due to innate biological differences in the emotional experience. Instead, differences in expression are the result of culture-specific learning and act to modify the expression of emotion to meet social norms. Since differences in expression are not innate, individuals are capable of experiencing emotions in an evolutionarily adaptive way, regardless of culture. This has implications for better understanding individuals across cultures and why some individuals may act differently than others despite having a similar emotional experience.
APA, Harvard, Vancouver, ISO, and other styles
31

Nicolini, Ylenia, Barbara Manini, Elisa De Stefani, Gino Coudé, Daniela Cardone, Anna Barbot, Chiara Bertolini, et al. "Autonomic Responses to Emotional Stimuli in Children Affected by Facial Palsy: The Case of Moebius Syndrome." Neural Plasticity 2019 (April 8, 2019): 1–13. http://dx.doi.org/10.1155/2019/7253768.

Full text
Abstract:
According to embodied simulation theories, others' emotions are recognized by the unconscious mimicking of observed facial expressions, which requires the implicit activation of the motor programs that produce a specific expression. Motor responses performed during the expression of a given emotion are hypothesized to be directly linked to the autonomic responses associated with that emotional behavior. We tested this hypothesis in 9 children (mean age = 5.66 years) affected by Moebius syndrome (MBS) and 15 control children (mean age = 6.6 years). MBS is a congenital neurological disorder characterized by underdevelopment of the VI and VII cranial nerves, which results in paralysis of the face. Moebius patients' inability to produce facial expressions impairs their capacity to communicate emotions through the face. We therefore assessed Moebius children's autonomic response to emotional stimuli (video cartoons) by means of functional infrared thermal (fIRT) imaging. Patients showed weaker temperature changes compared to controls, suggesting impaired autonomic activity. They also showed difficulties in recognizing facial emotions from static illustrations. These findings reveal that the impairment of facial movement attenuates the intensity of emotional experience, probably through the diminished activation of autonomic responses associated with emotional stimuli. The current study is the first to investigate emotional responses in MBS children, providing important insights into the role of facial expressions in emotional processing during early development.
APA, Harvard, Vancouver, ISO, and other styles
32

Jee, Eun-Sook, Chong Hui Kim, and Hisato Kobayashi. "Modulation of Musical Sound Clips for Robot’s Dynamic Emotional Expression." Journal of Robotics and Mechatronics 23, no. 3 (June 20, 2011): 451–57. http://dx.doi.org/10.20965/jrm.2011.p0451.

Full text
Abstract:
Sound is an important medium for human-robot interaction. A single sound or music clip is not enough to express delicate emotions; in particular, it is almost impossible to represent emotional changes. This paper tries to express different emotional levels of sounds and their transitions. In this paper, happiness, sadness, anger, and surprise are considered as a basic set of robot emotions. Using previously proposed nominal sound clips for the four emotions, this paper proposes a method to reproduce different emotional levels of sounds by modulating their musical parameters ‘tempo,’ ‘pitch,’ and ‘volume.’ Basic experiments were carried out to test whether human subjects can discern three different emotional intensity levels of the four emotions. Comparison of the recognition rates shows that the proposed modulation works fairly well and at least demonstrates the possibility of letting humans identify three intensity levels of emotions. Since the modulation is achieved by dynamically changing the three musical parameters of a sound clip, our method can be extended to dynamically changing emotional sounds.
APA, Harvard, Vancouver, ISO, and other styles
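The modulation idea in the abstract above (varying the tempo, pitch and volume of a nominal emotion clip to obtain intensity levels) can be sketched in a few lines of Python. The example below assumes librosa and soundfile; the scaling factors and file names are illustrative placeholders, not the values used in the paper.

```python
# Hedged sketch: derive intensity levels of an emotion sound clip by
# modulating tempo, pitch and volume. Factors and file names are illustrative.
import librosa
import soundfile as sf

def modulate(clip_path, out_path, tempo=1.0, semitones=0.0, gain=1.0):
    """Time-stretch, pitch-shift and scale a clip to vary emotional intensity."""
    y, sr = librosa.load(clip_path, sr=None)
    y = librosa.effects.time_stretch(y, rate=tempo)               # tempo change
    y = librosa.effects.pitch_shift(y, sr=sr, n_steps=semitones)  # pitch change
    sf.write(out_path, gain * y, sr)  # volume change (large gains may clip)

# Example: three intensity levels of a "happiness" clip (hypothetical files).
modulate("happiness.wav", "happiness_low.wav",  tempo=0.90, semitones=-1, gain=0.7)
modulate("happiness.wav", "happiness_mid.wav",  tempo=1.00, semitones=0,  gain=1.0)
modulate("happiness.wav", "happiness_high.wav", tempo=1.15, semitones=2,  gain=1.3)
```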
33

Yang, Defeng, Hao Shen, and Robert S. Wyer. "The face is the index of the mind: understanding the association between self-construal and facial expressions." European Journal of Marketing 55, no. 6 (January 26, 2021): 1664–78. http://dx.doi.org/10.1108/ejm-03-2019-0295.

Full text
Abstract:
Purpose – This study aims to examine the relationship between consumers' emotional expressions and their self-construals. The authors suggest that because an independent self-construal can reinforce the free expression of emotion, the expression of extreme emotions is likely to become associated with feelings of independence through social learning.
Design/methodology/approach – The paper includes five studies. Study 1A provided evidence that priming participants with different types of self-construal can influence the extremity of their emotional expressions. Study 1B showed that chronic self-construal could predict the facial expressions of students who were told to smile for a group photograph. Studies 2–4 found that inducing people either to manifest or simply to view an extreme facial expression activated an independent social orientation and influenced their performance on tasks that reflect this orientation.
Findings – The studies provide support for a bidirectional causal relationship between individuals' self-construals and the extremity of their emotional expressions. They show that people's general social orientation can predict the spontaneous facial expressions that they manifest in their daily lives.
Research limitations/implications – Although this research was generally restricted to the effects of smiling, similar considerations influence the expression of other emotions. That is, dispositions to exhibit extreme expressions can generalize over different types of emotions. To this extent, expressions of sadness, anger or fear might be similarly associated with people's social orientation and the behavior that is influenced by it.
Practical implications – The paper provides marketing insights into how marketers can influence consumers' choices of unique options and how marketers can assess consumers' social orientation based on their observation of consumers' emotional expressions.
Originality/value – To the best of the authors' knowledge, this research is the first to demonstrate a bidirectional causal relationship between individuals' self-construals and the extremity of their emotional expressions, and to demonstrate the association between chronic social orientation and the emotional expressions people spontaneously make in their daily lives.
APA, Harvard, Vancouver, ISO, and other styles
34

London, Justin. "Some theories of emotion in music and their implications for research in music psychology." Musicae Scientiae 5, no. 1_suppl (September 2001): 23–36. http://dx.doi.org/10.1177/10298649020050s102.

Full text
Abstract:
Work in musical aesthetics on musical meaning is relevant to psychological research on musical expressions of emotion. Distinctions between simple emotions, higher emotions, and moods are given, and arguments as to what kinds of emotions or moods music might be able to express (given music's semantic capacities and limitations) are summarized. Next, the question as to how music might express these emotions and moods is considered. The paper concludes with a number of cautionary points for researchers in the psychology of musical emotion: (1) musical expression always involves sonic properties, which must be taken into account. (2) If one uses “real world” musical stimuli, one may be faced with associative interference. (3) Context will often individuate emotional expression, transforming a simple emotion to a higher emotion by providing an intentional object. (4) There is not a simple linear relationship between intensity of a musical parameter and the intensity of an emotional expression. (5) Some perfectly good musical expressions of emotion may not arouse those emotions in the listener, yet it would be incorrect to call such passages “inexpressive.” (6) Any emotions aroused by listening to music, while similar to emotions that occur in non-musical contexts, will nonetheless have a number of important differences.
APA, Harvard, Vancouver, ISO, and other styles
35

Хворова, Екатерина. "Когнитивно-культурные, индивидуально-психологические и возрастные особенности способности к распознаванию эмоций" [Cognitive-cultural, individual-psychological and age-related features of the ability to recognize emotions]. Problemy Wczesnej Edukacji 32, no. 1 (March 31, 2016): 126–29. http://dx.doi.org/10.5604/01.3001.0008.5641.

Full text
Abstract:
This article describes features of the development of the emotional sphere. It emphasizes the importance of primary school age in the development of certain components of emotional intelligence, one of which is the ability to recognize emotions. In the early school years, children are able to understand emotions, but mostly with the help of their own emotional experience and/or according to the situations they are used to experiencing; they rely mostly on the context of the situation, and, as we know, this does not always work correctly: different people in the same situation may experience completely different emotions. Few children are able to establish the reasons that caused other people's emotions. Besides, one of the components of emotional intelligence is the ability to control one's own emotions. Emotion regulation becomes available to children after the socialization associated with the first years at school. Child development is partly determined by the process of socialization, which shapes specific cognitive representations of emotions, so-called emotional prototypes. The culture in which the child grows up also affects emotion recognition and expression: in individualistic cultures, emotional expression and recognition are encouraged, whereas collectivist cultures have certain rules of emotional expression that fix in which situations and to what extent the expression of emotions is permissible.
APA, Harvard, Vancouver, ISO, and other styles
36

von Scheve, Christian. "The Social Calibration of Emotion Expression." Sociological Theory 30, no. 1 (March 2012): 1–14. http://dx.doi.org/10.1177/0735275112437163.

Full text
Abstract:
This article analyzes the role of emotions in social interaction and their effects on social structuration and the emergence of micro-social order. It argues that facial expressions of emotion are key in generating robust patterns of social interaction. First, the article shows that actors’ encoding of facial expressions combines hardwired physiological principles on the one hand and socially learned aspects on the other hand, leading to fine-grained and socially differentiated dialects of expression. Second, it is argued that decoding facial expression is contingent upon this combination so that reciprocal attributions of emotional states, situational interpretations, and action tendencies are more effective within rather than across social units. Third, this conjunction affects the conditions for emotional contagion, which is argued to be more effective within social units exhibiting similar encoding and decoding characteristics, and thus aligns emotions and action tendencies in a coherent, yet socially differentiated way.
APA, Harvard, Vancouver, ISO, and other styles
37

Uraeva, Darmon Saidakhmedovna, Iroda Sidikovna Khakharova, and Gulrukh Shavkatovna Khakhorova. "Meaning of Emotional Words in the Formation of Expressions in English and Uzbek Languages." Scientific Reports of Bukhara State University 3, no. 2 (February 28, 2019): 54–62. http://dx.doi.org/10.52297/2181-1466/2019/3/2/8.

Full text
Abstract:
The article analyzes the emotional perception and understanding of emotion in the mind through examples from more than twenty stories by the English author Somerset Maugham. The most characteristic syntactic function of pronouns in the Uzbek language is manifested in the expression of emotions such as command, desire, emotion, as a single sentence consisting of one component. It has been established that emotional words are related to the system of mind and language.
APA, Harvard, Vancouver, ISO, and other styles
38

Barrett, Lisa Feldman. "Was Darwin Wrong About Emotional Expressions?" Current Directions in Psychological Science 20, no. 6 (December 2011): 400–406. http://dx.doi.org/10.1177/0963721411429125.

Full text
Abstract:
Emotional expressions have endured as a topic of profound scientific interest for over a century, in part due to Darwin's classic volume, The Expression of the Emotions in Man and Animals. Since its publication, there has been a strong, spirited debate over the origin, nature, and function of emotional expressions. In this article, I consider two basic questions: What did Darwin really write about emotional expressions, and how well does his account match the modern, conventional, "basic emotion" account? And does the scientific evidence specifically support the modern account of Darwin's view, or are there alternative hypotheses that provide good (or even better) interpretations for the data at hand? I discuss the various ways that Darwin might be correct (and incorrect) about how emotions and their manifestations have been sculpted by natural selection.
APA, Harvard, Vancouver, ISO, and other styles
39

Nakashima, Satoshi F., Masatoshi Ukezono, Hiroshi Nishida, Ryunosuke Sudo, and Yuji Takano. "Receiving of emotional signal of pain from conspecifics in laboratory rats." Royal Society Open Science 2, no. 4 (April 2015): 140381. http://dx.doi.org/10.1098/rsos.140381.

Full text
Abstract:
Though recent studies have shown that rodents express emotions with their face, whether emotional expression in rodents has a communicative function between conspecifics is still unclear. Here, we demonstrate the ability of visual recognition of emotional expressions in laboratory rats. We found that Long-Evans rats avoid images of pain expressions of conspecifics but not those of neutral expressions. The results indicate that rats use visual emotional signals from conspecifics to adjust their behaviour in an environment to avoid a potentially dangerous place. Therefore, emotional expression in rodents, rather than just a mere ‘expression’ of emotional states, might have a communicative function.
APA, Harvard, Vancouver, ISO, and other styles
40

G, Nikhil, Naganarasimha M, and Yogesh S. "HUMAN FACIAL EMOTION RECOGNITION USING CNN." International Journal of Engineering Applied Sciences and Technology 7, no. 1 (May 1, 2022): 321–23. http://dx.doi.org/10.33564/ijeast.2022.v07i01.049.

Full text
Abstract:
Human beings express emotions in everyday interactions. Understanding these emotions and knowing how to react to their expression greatly enhances the interaction. An automatic facial expression recognition system must solve the following problems: detection and location of faces in a cluttered scene, facial feature extraction, and facial expression classification. Knowing the user's emotion, the system can adapt to the user. Facial expressions play an important role in the recognition of emotions and are used in the process of non-verbal communication. They are very important in daily emotional communication, second only to the tone of voice, and they are an indicator of feelings, allowing a person to express an emotional state. The main motivation behind this project is to assess an individual's mental health.
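The cited article does not include code; as a hedged illustration of the kind of pipeline its abstract describes (face detection, feature extraction, expression classification), a minimal convolutional classifier might look like the sketch below. The layer sizes, the seven-class label set, and the 48x48 grayscale input are assumptions typical of public datasets such as FER-2013, not details taken from the paper.

```python
# Minimal sketch of a facial-expression CNN (illustrative only; the cited
# paper's actual architecture is not specified in its abstract).
import torch
import torch.nn as nn

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]  # assumed label set

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24x24 -> 12x12
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 12x12 -> 6x6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage on a batch of 48x48 grayscale face crops (e.g. produced by a face detector):
model = EmotionCNN()
faces = torch.randn(8, 1, 48, 48)      # stand-in for detected, cropped faces
probs = model(faces).softmax(dim=1)    # per-image distribution over the seven emotions
print(probs.argmax(dim=1))             # predicted emotion indices
```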
APA, Harvard, Vancouver, ISO, and other styles
41

Mendolia, Marilyn. "Facial Identity Memory Is Enhanced When Sender’s Expression Is Congruent to Perceiver’s Experienced Emotion." Psychological Reports 121, no. 5 (November 24, 2017): 892–908. http://dx.doi.org/10.1177/0033294117741655.

Full text
Abstract:
The role of the social context in facial identity recognition and expression recall was investigated by manipulating the sender’s emotional expression and the perceiver’s experienced emotion during encoding. A mixed-design with one manipulated between-subjects factor (perceiver’s experienced emotion) and two within-subjects factors (change in experienced emotion and sender’s emotional expression) was used. Senders’ positive and negative expressions were implicitly encoded while perceivers experienced their baseline emotion and then either a positive or a negative emotion. Facial identity recognition was then tested using senders’ neutral expressions. Memory for senders previously seen expressing positive or negative emotion was facilitated if the perceiver initially encoded the expression while experiencing a positive or a negative emotion, respectively. Furthermore, perceivers were confident of their decisions. This research provides a more detailed understanding of the social context by exploring how the sender–perceiver interaction affects the memory for the sender.
APA, Harvard, Vancouver, ISO, and other styles
42

SHIPMAN, KIMBERLY L., and JANICE ZEMAN. "Socialization of children's emotion regulation in mother–child dyads: A developmental psychopathology perspective." Development and Psychopathology 13, no. 2 (May 16, 2001): 317–36. http://dx.doi.org/10.1017/s0954579401002073.

Full text
Abstract:
This study investigated the socialization of children's emotion regulation in 25 physically maltreating and 25 nonmaltreating mother–child dyads. Maltreating mothers and their 6- to 12-year-old children were recruited from two parenting programs affiliated with Children's Protective Services with a control group matched on race, SES, child gender, and child age. Children and their mothers were interviewed individually about their (a) management of emotional expression, (b) strategies for coping with emotional arousal, and (c) anticipated consequences following emotional displays. Compared to controls, maltreated children expected less maternal support in response to their emotional displays, reported being less likely to display emotions to their mothers, and generated fewer effective coping strategies for anger. Maltreating mothers indicated less understanding of children's emotional displays and fewer effective strategies for helping children to cope with emotionally arousing situations than nonmaltreating mothers. Further, findings indicated that maternal socialization practices (e.g., providing support in response to children's emotional display, generating effective coping strategies for their child) mediate the relation between child maltreatment and children's regulation of emotional expression and emotional arousal. These findings suggest that children's emotion regulation strategies are influenced by their relationship with their social environment (e.g., physically maltreating, nonmaltreating) and that the experience of a physically maltreating relationship may interfere with children's emotional development.
APA, Harvard, Vancouver, ISO, and other styles
43

Torres, Bianca, Raquel Luiza Santos, Maria Fernanda Barroso de Sousa, José Pedro Simões Neto, Marcela Moreira Lima Nogueira, Tatiana T. Belfort, Rachel Dias, and Marcia Cristina Nascimento Dourado. "Facial expression recognition in Alzheimer’s disease: a longitudinal study." Arquivos de Neuro-Psiquiatria 73, no. 5 (May 2015): 383–89. http://dx.doi.org/10.1590/0004-282x20150009.

Full text
Abstract:
Facial recognition is one of the most important aspects of social cognition. In this study, we investigate the patterns of change and the factors involved in the ability to recognize emotion in mild Alzheimer’s disease (AD). Through a longitudinal design, we assessed 30 people with AD. We used an experimental task that includes matching expressions with picture stimuli, labelling emotions, and recognizing the emotion in a stimulus situation. We observed a significant difference in the situational recognition task (p ≤ 0.05) between baseline and the second evaluation. Linear regression showed that cognition is a predictor of impairment in emotion recognition (p ≤ 0.05). The ability to perceive emotions from facial expressions was impaired, particularly when the emotions presented were relatively subtle. Cognition is recruited to comprehend emotional situations in cases of mild dementia.
APA, Harvard, Vancouver, ISO, and other styles
44

Balconi, Michela, and Claudio Lucchiari. "Consciousness and Emotional Facial Expression Recognition." Journal of Psychophysiology 21, no. 2 (January 2007): 100–108. http://dx.doi.org/10.1027/0269-8803.21.2.100.

Full text
Abstract:
In this study we analyze whether facial expression recognition is marked by specific event-related potential (ERP) correlates and whether conscious and unconscious elaboration of emotional facial stimuli are qualitatively different processes. ERPs elicited by supraliminal and subliminal (10 ms) stimuli were recorded when subjects were viewing emotional facial expressions of four emotions or neutral stimuli. Two ERP effects (N2 and P3) were analyzed in terms of their peak amplitude and latency variations. An emotional specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). Unaware information processing proved to be quite similar to aware processing in terms of peak morphology but not of latency. A major result of this research was that unconscious stimulation produced a more delayed peak variation than conscious stimulation did. Also, a more posterior distribution of the ERP was found for N2 as a function of emotional content of the stimulus. On the contrary, cortical lateralization (right/left) was not correlated to conscious/unconscious stimulation. The functional significance of our results is underlined in terms of subliminal effect and emotion recognition.
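Peak amplitude and latency analysis of ERP components such as N2 and P3, as described in the abstract, can be illustrated with a short sketch: given a single-channel averaged waveform, find the most extreme deflection inside a component-specific time window and report its amplitude and latency. The sampling rate, the time windows, and the polarity conventions below are assumptions for illustration, not parameters reported in the study.

```python
import numpy as np

def peak_in_window(erp: np.ndarray, times_ms: np.ndarray,
                   window_ms: tuple, negative: bool) -> tuple:
    """Return (amplitude, latency_ms) of the most extreme deflection in a window.

    negative=True looks for a negative-going peak (e.g. N2),
    negative=False for a positive-going peak (e.g. P3).
    """
    mask = (times_ms >= window_ms[0]) & (times_ms <= window_ms[1])
    segment, seg_times = erp[mask], times_ms[mask]
    idx = segment.argmin() if negative else segment.argmax()
    return float(segment[idx]), float(seg_times[idx])

# Illustrative use on a synthetic averaged waveform sampled at 500 Hz.
fs = 500
times_ms = np.arange(0, 800, 1000 / fs)        # 0-800 ms epoch
erp = np.random.randn(times_ms.size) * 0.5     # stand-in for a real averaged ERP
n2_amp, n2_lat = peak_in_window(erp, times_ms, (180, 320), negative=True)
p3_amp, p3_lat = peak_in_window(erp, times_ms, (300, 600), negative=False)
print(f"N2: {n2_amp:.2f} uV at {n2_lat:.0f} ms; P3: {p3_amp:.2f} uV at {p3_lat:.0f} ms")
```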
APA, Harvard, Vancouver, ISO, and other styles
45

Arar, Khalid. "Emotional expression at different managerial career stages." Educational Management Administration & Leadership 45, no. 6 (July 18, 2016): 929–43. http://dx.doi.org/10.1177/1741143216636114.

Full text
Abstract:
This paper examines emotional expression experienced by female principals in the Arab school system in Israel over their managerial careers – role-related emotions that they choose to express or repress before others. I employed narrative methodology, interviewing nine female principals from the Arab school system to investigate expression of emotions in professional life stories that they narrated. Findings indicate that the principals’ emotional expressions differ according to career stage; on induction into principalship, they are stressed, feel threatened, distressed and challenged. As they establish themselves in their role they are calmer, use more humour and more ‘correct’ facial expressions. At a more advanced career stage, they express empathy and compassion, and concern for the maintenance of educational achievements. Understanding principals’ emotional expression at different career stages contributes to the quality of principal-teacher relations in the school.
APA, Harvard, Vancouver, ISO, and other styles
46

Ma, Jiajia. "Emotional Expression and Analysis in Music Performance Based on Edge Computing." Mobile Information Systems 2022 (September 5, 2022): 1–12. http://dx.doi.org/10.1155/2022/4856977.

Full text
Abstract:
The expression of emotion in music performance is the soul of music: the emotion a performer reveals during a performance creates emotional resonance with the audience. Emotions conveyed by music, such as joy, anger, and sadness, give music its meaning; music without emotion is lifeless. The music itself, however, carries no emotion of its own; it is only organized sound, which makes the emotional reading of a performance essential. Music performance is an interpretation of music and one of the most important media for communicating human emotional information: works convey the author's emotions, and the different forms of performance (instruments, dance, singing) bring emotional resonance to the audience. Edge computing is a core technology of the Internet of Everything and continues to evolve with the rapid development of computing. Demand for emotional information processing of music performances has grown accordingly, research attention to music performance and its supporting technologies has expanded, and the requirements on human-machine interaction keep rising. With increasingly mature multimedia and communication technologies, there is a growing expectation of using computers to express human thoughts and emotions, so the expression and analysis of musical emotion through edge computing has also seen new developments. For example, people upload or share music and dance videos with friends through WeChat, QQ, Douyin, and similar platforms, which greatly enriches their emotional lives. The analysis of musical emotion is a joint subject of musicology and psychology, and it can also be pursued with the tools of computer science and artificial intelligence; with advances in computing, people's needs for the emotional expression and analysis of music can now be met with the help of computers. The volume of data generated is extremely large, however, and using edge servers for data processing can improve the efficiency of analysis to meet these needs.
APA, Harvard, Vancouver, ISO, and other styles
47

Qi, Baohua. "ON THE EXPRESSION AND GUIDANCE OF NETWORK EMOTION IN EMERGENCIES FROM THE CHANGE OF EMOTIONAL BEHAVIOR -- TAKING THE RAINSTORM IN ZHENGZHOU ON JULY 20 AS AN EXAMPLE." International Journal of Neuropsychopharmacology 25, Supplement_1 (July 1, 2022): A37—A38. http://dx.doi.org/10.1093/ijnp/pyac032.052.

Full text
Abstract:
Background With the development of the Internet and the wide popularization of social media, the Internet has become an important channel for netizens to express their views and emotions. Because of their large impact, emergencies often attract intense attention and extensive emotional responses from Internet users. As the most active factor in online public opinion, network emotion shapes how that opinion develops, so research on how online emotional expression changes during emergencies, and how it can be guided, is of great significance. Subjects and Methods Taking the July 20 rainstorm in Zhengzhou as an example, this study randomly selected microblog comment data from the event as samples and applied text emotion analysis to examine the generation, development, and guidance of online emotional expression, providing a reference for studying the dynamics of network emotion and for governing online public opinion during emergencies. Results In this emergency, online emotional expression centred on positive emotions such as care, safety, and encouragement; negative emotions accounted for about one third, were mainly directed at the emergency response of relevant departments and at urban drainage, and should be guided in time. The dedication of rescue workers and the spirit of mutual assistance stimulated positive emotions among netizens. Timely and accurate responses by government and media departments to users' concerns helped to channel online emotion, and effective information supply and shifts of topic focus helped to rationalize it. The study also found that public emotional behaviour strongly influences microblog public opinion in two directions, top-down and bottom-up, manifested mainly through social mobilization and emotional struggle, with typical expressions of perceived weakness, anger, and sadness. The communication framework of emotional behaviour includes paths of shared discourse, shared identity, and shared emotion; its functions include targeting, attribution, and expressive functions. In terms of tendencies, microblog public opinion displays criticism, populism, nationalism, pragmatism, patriotism, and a sense of justice; its social expression includes the spiral of silence, the butterfly effect, the herd effect, and resentment. Psychologically, public emotions appear as fear, anxiety, anger, and sadness, expressed directly or through folk language and other means. Conclusion The development and change of network emotion during emergencies show periodic characteristics. Government departments should establish prediction and response mechanisms matched to these phases and deal scientifically with changes in network emotion. Network emotion mirrors how the emergency itself unfolds: when the disaster is effectively controlled, negative emotions subside. News media should exercise their social responsibility and guiding role, provide accurate and timely information to Internet users, and channel online emotion, making full use of the hedging and communicative roles of positive emotion. Owing to limitations of the author's research methods, the evolution of network emotion and the relationship between the rainstorm disaster and people's emotional responses require further study.
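The abstract mentions text emotion analysis of sampled microblog comments but does not name a specific tool; a minimal lexicon-based sketch of that kind of classification is shown below. The tiny keyword lists and the category names are hypothetical placeholders, not the lexicon used in the study (a real analysis would use a full Chinese emotion lexicon or a trained classifier).

```python
from collections import Counter

# Hypothetical mini-lexicon; placeholder categories and keywords for illustration only.
LEXICON = {
    "care":          {"stay safe", "thinking of you", "take care"},
    "encouragement": {"stay strong", "hang in there", "we are with you"},
    "anger":         {"unacceptable", "who is responsible", "outrageous"},
    "sadness":       {"heartbroken", "so sad", "tragic"},
}

def classify_comment(text: str) -> str:
    """Assign the emotion category whose keywords appear most often, or 'other'."""
    text = text.lower()
    hits = {label: sum(kw in text for kw in kws) for label, kws in LEXICON.items()}
    best = max(hits, key=hits.get)
    return best if hits[best] > 0 else "other"

comments = [
    "Stay strong Zhengzhou, we are with you!",
    "This response was unacceptable, who is responsible?",
    "So sad to see the city underwater.",
]
# Aggregate counts approximate the positive/negative proportions discussed in the abstract.
print(Counter(classify_comment(c) for c in comments))
```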
APA, Harvard, Vancouver, ISO, and other styles
48

Liu, Shiguang, Huixin Wang, and Min Pei. "Facial-expression-aware Emotional Color Transfer Based on Convolutional Neural Network." ACM Transactions on Multimedia Computing, Communications, and Applications 18, no. 1 (January 31, 2022): 1–19. http://dx.doi.org/10.1145/3464382.

Full text
Abstract:
Emotional color transfer aims to change the evoked emotion of a source image to that of a target image by adjusting color distribution. Most existing emotional color transfer methods only consider the low-level visual features of an image and ignore facial expression features when the image contains a human face, which can cause incorrect emotion evaluation for the given image. In addition, previous emotional color transfer methods may easily result in ambiguity between the emotions of the resulting image and the target image. For example, if the background of the target image is dark while the facial expression is happiness, then previous methods would directly transfer dark color to the source image, neglecting the facial emotion in the image. To solve this problem, we propose a new facial-expression-aware emotional color transfer framework. Given a target image with facial expression features, we first predict the facial emotion label of the image through an emotion classification network. Then, facial emotion labels are matched with pre-trained emotional color transfer models. Finally, we use the matched emotion model to transfer the color of the target image to the source image. Because no existing emotion image database focuses on images that contain both a face and a background, we built an emotion database for our framework, called the “Face-Emotion database.” Experiments demonstrate that our method can successfully capture and transfer facial emotions, outperforming state-of-the-art methods.
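The pipeline the abstract describes (classify the target image's facial emotion, select the matching pre-trained transfer model, apply it to the source image) can be sketched as below. This is only a structural illustration: the function names are invented, the per-emotion "models" are replaced by a simple Reinhard-style mean/std color matching stand-in, and the fixed emotion label stands in for the authors' classification network.

```python
import numpy as np

def match_color_stats(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Simple stand-in transfer: match per-channel mean/std of source to target.
    (The cited paper uses learned CNN models instead; this is only a placeholder.)"""
    out = np.empty_like(source, dtype=np.float64)
    for c in range(source.shape[2]):
        s = source[..., c].astype(np.float64)
        t = target[..., c].astype(np.float64)
        s_std = s.std() or 1.0
        out[..., c] = (s - s.mean()) / s_std * t.std() + t.mean()
    return np.clip(out, 0, 255).astype(np.uint8)

def classify_facial_emotion(image: np.ndarray) -> str:
    """Placeholder for the facial-emotion classification network; fixed label here."""
    return "happiness"

# One transfer model per facial emotion label; here every label maps to the same
# statistical stand-in, whereas the paper matches pre-trained emotional models.
TRANSFER_MODELS = {label: match_color_stats
                   for label in ("happiness", "sadness", "anger", "fear", "neutral")}

def emotional_color_transfer(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    emotion = classify_facial_emotion(target)        # facial emotion of the target image
    return TRANSFER_MODELS[emotion](source, target)  # recolor the source accordingly

source = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
target = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
result = emotional_color_transfer(source, target)
```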
APA, Harvard, Vancouver, ISO, and other styles
49

Fausto, Caruana. "The Integration of Emotional Expression and Experience: A Pragmatist Review of Recent Evidence From Brain Stimulation." Emotion Review 11, no. 1 (August 4, 2017): 27–38. http://dx.doi.org/10.1177/1754073917723461.

Full text
Abstract:
A common view in affective neuroscience considers emotions as a multifaceted phenomenon constituted by independent affective and motor components. Such dualistic connotation, obtained by rephrasing the classic Darwin and James’s theories of emotion, leads to the assumption that emotional expression is controlled by motor centers in the anterior cingulate, frontal operculum, and supplementary motor area, whereas emotional experience depends on interoceptive centers in the insula. Recent stimulation studies provide a different perspective. I will outline two sets of findings. First, affective experiences can be elicited also following the stimulation of motor centers. Second, emotional expressions can be elicited by stimulating interoceptive regions. Echoing the original pragmatist theories of emotion, I will make a case for the notion that emotional experience emerges from the integration of sensory and motor signals, encoded in the same functional network.
APA, Harvard, Vancouver, ISO, and other styles
50

Sapsaman, Temsiri, and Teerawat Benjawilaikul. "Parameterization of Emotion Expression through Robot’s Body Language." Advanced Materials Research 605-607 (December 2012): 1656–60. http://dx.doi.org/10.4028/www.scientific.net/amr.605-607.1656.

Full text
Abstract:
To enhance human-robot interaction, social robots have been developed with a focus on facial expression and verbal language. However, little to no work has addressed emotional expression through a robot's body language. This work uses a parameterization method, grounded in theories of human emotion and in experiments, to find robot parameters for expressing emotions through body language. Mapping is done on 2- and 3-dimensional emotion spaces, and the obtained coefficients can be used to determine how strongly each motion parameter influences each emotion dimension.
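The abstract's idea of coefficients that link an emotion space to motion parameters can be illustrated with a minimal linear-mapping sketch, assuming a 2-D valence/arousal space. The motion parameters, the coefficient values, and the baseline posture below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical motion parameters and a coefficient matrix mapping a 2-D
# (valence, arousal) point to them; values are invented for illustration.
MOTION_PARAMS = ["gesture_speed", "posture_openness", "head_tilt"]
COEFFS = np.array([
    [0.2,  0.8],   # gesture_speed:    driven mostly by arousal
    [0.9,  0.1],   # posture_openness: driven mostly by valence
    [0.5, -0.4],   # head_tilt:        mixed influence
])
BASELINE = np.array([0.5, 0.5, 0.0])   # assumed neutral posture

def motion_for_emotion(valence: float, arousal: float) -> dict:
    """Map a point in emotion space to motion parameter values via a linear model."""
    values = BASELINE + COEFFS @ np.array([valence, arousal])
    return dict(zip(MOTION_PARAMS, np.clip(values, -1.0, 1.0)))

print(motion_for_emotion(valence=0.8, arousal=0.6))    # e.g. "joy"
print(motion_for_emotion(valence=-0.7, arousal=-0.3))  # e.g. "sadness"
```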
APA, Harvard, Vancouver, ISO, and other styles