
Journal articles on the topic 'Emotion perception'



Consult the top 50 journal articles for your research on the topic 'Emotion perception.'



Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Gabrielsson, Alf. "Emotion perceived and emotion felt: Same or different?" Musicae Scientiae 5, no. 1_suppl (September 2001): 123–47. http://dx.doi.org/10.1177/10298649020050s105.

Abstract:
A distinction is made between emotion perception, that is, to perceive emotional expression in music without necessarily being affected oneself, and emotion induction, that is, listeners’ emotional response to music. This distinction is not always observed, either in everyday conversation about emotions or in scientific papers. Empirical studies of emotion perception are briefly reviewed with regard to listener agreement concerning expressed emotions, followed by a selective review of empirical studies on emotional response to music. Possible relationships between emotion perception and emotional response are discussed and exemplified: positive relationship, negative relationship, no systematic relationship, and no relationship. It is emphasised that both emotion perception and, especially, emotional response depend on an interplay between musical, personal, and situational factors. Some methodological questions and suggestions for further research are discussed.
2

MacCann, Carolyn, Yasemin Erbas, Egon Dejonckheere, Amirali Minbashian, Peter Kuppens, and Kirill Fayn. "Emotional Intelligence Relates to Emotions, Emotion Dynamics, and Emotion Complexity." European Journal of Psychological Assessment 36, no. 3 (May 2020): 460–70. http://dx.doi.org/10.1027/1015-5759/a000588.

Abstract:
Emotional intelligence (EI) should relate to people’s emotional experiences. We meta-analytically summarize associations of felt affect with ability EI branches (perception, facilitation, understanding, and management) and total scores (k = 7–14; N = 1,584–2,813). We then use experience sampling (N = 122 undergraduates over 5 days, 24 beeps) to test whether EI predicts emotion dynamics and complexity. Meta-analyses show that EI correlates significantly with lower negative affect (NA; ρ = −.21) but not higher positive affect (PA; ρ = .05). PA (but not NA) shows a significantly stronger relationship with emotion management (ρ = .23) versus other EI branches (ρ = −.01 to .07). In the experience sampling study, only management significantly related to higher PA, whereas lower NA was significantly related to total EI, perception, facilitation, and management. After controlling for mean affect: (a) only understanding significantly predicted NA dynamics whereas only management and facilitation significantly predicted PA dynamics; (b) management and facilitation predicted lower PA differentiation (EI was unrelated to NA differentiation); and (c) perception and facilitation predicted greater bipolarity. Results show that EI predicts affect, emotion dynamics, and emotion complexity. We discuss the importance of distinguishing between different branches of ability EI.
3

Levitan, Carmel A., Isabelle Rusk, Danielle Jonas-Delson, Hanyun Lou, Lennon Kuzniar, Gray Davidson, and Aleksandra Sherman. "Mask wearing affects emotion perception." i-Perception 13, no. 3 (May 2022): 204166952211073. http://dx.doi.org/10.1177/20416695221107391.

Abstract:
To reduce the spread of COVID-19, mask wearing has become ubiquitous in much of the world. We studied the extent to which masks impair emotion recognition and dampen the perceived intensity of facial expressions by naturalistically inducing positive, neutral, and negative emotions in individuals while they were masked and unmasked. Two groups of online participants rated the emotional intensity of each presented image. One group rated full faces (N=104); the other (N=102) rated cropped images where only the upper face was visible. We found that masks impaired the recognition of positive emotions and reduced their rated intensity. This happened even when the faces were cropped and the lower part of the face was not visible. Masks may thus reduce positive emotion and/or expressivity of positive emotion. However, perception of negativity was unaffected by masking, perhaps because unlike positive emotions like happiness, which are signaled more in the mouth, negative emotions like anger rely more on the upper face.
4

Bourassa, Maureen, Kelton Doraty, Loleen Berdahl, Jana Fried, and Scott Bell. "Support, opposition, emotion and contentious issue risk perception." International Journal of Public Sector Management 29, no. 2 (March 7, 2016): 201–16. http://dx.doi.org/10.1108/ijpsm-10-2015-0172.

Abstract:
Purpose – Research on emotion in the context of risk perception has historically focused on negative emotions, and has emphasized the effect of these negative emotions on the perception of risk amongst those who oppose (rather than support) contentious issues. Drawing on theory, the purpose of this paper is to hypothesize that both positive and negative emotions are correlated with risk perceptions regarding contentious public issues and that this occurs amongst supporters and opponents alike. Design/methodology/approach – The paper explores the relationship between emotions and perceived risk through consideration of the highly contentious case of nuclear energy in Saskatchewan, Canada. The analysis uses data from a representative telephone survey of 1,355 residents. Findings – The results suggest that positive emotions, like negative emotions, are related to nuclear energy risk perceptions. Emotions are related to risk perception amongst both supporters and opponents. Research limitations/implications – The data set’s limited number of emotion measures and single public issue focus, combined with the survey’s cross-sectional design, make this research exploratory in nature. Future research should incorporate multiple positive emotions, explore opposition and support across a range of contentious public issues, and consider experimental models to assess causal relationships. Practical implications – The paper offers insights into how public sector managers must be cognizant of the emotional underpinnings of risk perceptions amongst both supporters and opponents of contentious public issues. Originality/value – This paper builds on and expands previous work by considering both positive and negative emotions and both supporters and opponents of contentious issues.
5

Neal, David T., and Tanya L. Chartrand. "Embodied Emotion Perception." Social Psychological and Personality Science 2, no. 6 (April 21, 2011): 673–78. http://dx.doi.org/10.1177/1948550611406138.

Abstract:
How do we recognize the emotions other people are feeling? One source of information may be facial feedback signals generated when we automatically mimic the expressions displayed on others' faces. Supporting this “embodied emotion perception,” dampening (Experiment 1) and amplifying (Experiment 2) facial feedback signals, respectively, impaired and improved people’s ability to read others' facial emotions. In Experiment 1, emotion perception was significantly impaired in people who had received a cosmetic procedure that reduces muscular feedback from the face (Botox) compared to a procedure that does not reduce feedback (a dermal filler). Experiment 2 capitalized on the fact that feedback signals are enhanced when muscle contractions meet resistance. Accordingly, when the skin was made resistant to underlying muscle contractions via a restricting gel, emotion perception improved, and did so only for emotion judgments that theoretically could benefit from facial feedback.
6

Starkey, Charles. "Perceptual Emotions and Emotional Virtue." Journal of Philosophy of Emotion 3, no. 1 (September 30, 2021): 10–15. http://dx.doi.org/10.33497/2021.summer.3.

Abstract:
In this essay I focus on two areas discussed in Michael Brady’s Emotion: The Basics, namely perceptual models of emotion and the relation between emotion and virtue. Brady raises two concerns about perceptual theories: that they arguably collapse into feeling or cognitive theories of emotion; and that the analogy between emotion and perception is questionable at best, and is thus not an adequate way of characterizing emotion. I argue that a close look at perception and emotional experience reveals a structure of emotion that avoids these problems. I then explore other ways in which emotions can be operative in virtuous acts and virtue traits outside of their relation to motivation. The patterns of emotional response that we have can affect virtue because they affect the way in which we see and take in information about the world, and the gravity that such perceptions have for us. In addition, emotions are critical to virtue because they maintain the level of importance that values have for us, and in doing so forestall axiological entropy, namely the fading of the importance that values have for us, and thus the virtues that are dependent on those values.
7

Lucin, D. V., Y. A. Kozhukhova, and E. A. Suchkova. "Emotion congruence in the perception of ambiguous facial expressions." Experimental Psychology (Russia) 12, no. 1 (2019): 27–39. http://dx.doi.org/10.17759/exppsy.2019120103.

Abstract:
Emotion congruence in emotion perception is manifested in increasing sensitivity to the emotions corresponding to the perceiver’s emotional state. In this study, an experimental procedure that robustly generates emotion congruence during the perception of ambiguous facial expressions has been developed. It was hypothesized that emotion congruence would be stronger in the early stages of perception. In two experiments, happiness and sadness were elicited in 69 (mean age 20.2, 57 females) and 58 (mean age 18.2, 50 females) participants. Then they determined what emotions were present in the ambiguous faces. The duration of stimulus presentation varied for the analysis of earlier and later stages of perception. The effect of emotion congruence was obtained in both experiments: happy participants perceived more happiness and less sadness in ambiguous facial expressions compared to sad participants. Stimulus duration did not influence emotion congruence. Further studies should focus on the juxtaposition of the models connecting the emotion congruence mechanisms either with perception or with response generation.
8

Gendron, Maria, and Lisa Feldman Barrett. "Emotion Perception as Conceptual Synchrony." Emotion Review 10, no. 2 (March 20, 2018): 101–10. http://dx.doi.org/10.1177/1754073917705717.

Abstract:
Psychological research on emotion perception anchors heavily on an object perception analogy. We present static “cues,” such as facial expressions, as objects for perceivers to categorize. Yet in the real world, emotions play out as dynamic multidimensional events. Current theoretical approaches and research methods are limited in their ability to capture this complexity. We draw on insights from a predictive coding account of neural activity and a grounded cognition account of concept representation to conceive of emotion perception as a stream of synchronized conceptualizations between two individuals, which is supported and shaped by language. We articulate how this framework can illuminate the fundamental need to study culture, as well as other sources of conceptual variation, in unpacking conceptual synchrony in emotion. We close by suggesting that the conceptual system provides the necessary flexibility to overcome gaps in emotional synchrony.
9

Montagne, Barbara, Gudrun M. S. Nys, Martine J. E. van Zandvoort, L. Jaap Kappelle, Edward H. F. de Haan, and Roy P. C. Kessels. "The perception of emotional facial expressions in stroke patients with and without depression." Acta Neuropsychiatrica 19, no. 5 (October 2007): 279–83. http://dx.doi.org/10.1111/j.1601-5215.2007.00235.x.

Abstract:
Background: Emotion perception may be impaired after stroke. No study on emotion perception after stroke has taken the influence of post-stroke depressive symptoms into account, although depressive symptoms themselves may hamper emotion perception. Objective: To compare the perception of emotional facial expressions in stroke patients with and without depressive symptoms. Methods: Twenty-two stroke patients participated whose depressive symptoms were classified using the Montgomery-Åsberg Depression Rating Scale (cutoff = 10) and who were compared with healthy controls. Emotion recognition was measured using morphed images of facial expressions. Results: Patients with depressive symptoms performed worse than controls on all emotions; patients without depressive symptoms performed at control level. Patients with depressive symptoms were less sensitive to the emotions anger, happiness and sadness compared with patients without depressive symptoms. Conclusions: Post-stroke depressive symptoms impair emotion perception. This extends findings in bipolar disorder indicating that emotion perception deficits are strongly related to the level of depression.
10

Rahmawati, Nada, and Saodah Wok. "The Effects of Organizational Change on Students' Emotions." GATR Global Journal of Business and Social Science Review (GJBSSR) 5, no. 3 (June 28, 2017): 100–105. http://dx.doi.org/10.35609/gjbssr.2017.5.3(13).

Abstract:
Objective - This study aims to examine the effects of perceptions of technological change, leadership change and structural change on students' emotions, and to analyze the mediating effect of experience on the relationship between perception and emotion resulting from organizational changes. According to the Theory of Emotional Contagion (Hatfield, Cacioppo & Rapson, 1993), organizational change can produce a number of positive and negative emotional responses that can be transferred to others. Methodology/Technique - The study employs a quantitative research design using the survey method with a self-administered questionnaire. A total of 223 respondents were identified among the undergraduate students at a faculty in a public university who had faced organizational changes (technological, leadership and structural). Findings - The results reveal that perceptions of technological, leadership and structural changes have moderate effects on students' emotions. However, experiences of change partially mediate the relationship between students' perceptions of technological, leadership and structural changes and their emotions. Experience with organizational changes adversely affects students' emotions. Novelty - The implications of the Emotional Contagion Theory hold true for organizational changes, as the hypotheses are supported. Students' emotions are equally important to consider before applying any change to an academic institution. Type of Paper: Empirical. Keywords: Emotional Contagion Theory; Emotional Effect; Leadership Change; Structural Change; Technological Change. JEL Classification: I21, O33.
11

Weiss, Elisabeth M., H. Harald Freudenthaler, Andreas Fink, Eva M. Reiser, Harald Niederstätter, Simone Nagl, Walther Parson, and Ilona Papousek. "Differential Influence of 5-HTTLPR Polymorphism and COMT Val158Met Polymorphism on Emotion Perception and Regulation in Healthy Women." Journal of the International Neuropsychological Society 20, no. 5 (March 31, 2014): 516–24. http://dx.doi.org/10.1017/s135561771400023x.

Abstract:
Converging evidence indicates that a considerable amount of variance in self-estimated emotional competency can be directly attributed to genetic factors. The current study examined the associations between the polymorphisms of the Catechol-O-methyltransferase (COMT Met158Val) and the serotonin transporter (5-HTTLPR) and specific measures of the self-estimated effectiveness of an individual’s emotion perception and regulation. Emotional competence was measured in a large sample of 289 healthy women by using the Self-report Emotional Ability Scale (SEAS), which includes two subscales for the assessment of emotion perception and regulation in the intra-personal domain and two subscales for the assessment of emotion perception and regulation in the inter-personal domain. Participants’ reports of effective emotion regulation in everyday life were associated with the COMT Met-allele, with women homozygous for the Val-allele scoring lowest on this scale. Self-estimated effectiveness of emotion perception of the individual’s own emotions was related to the 5-HTTLPR. Both homozygous groups (s/s and l/l) rated their intra-personal emotion perception less effective than participants in the heterozygous s/l group. Taken together, the results indicate that genetic variants of the COMT and 5HTTLPR genes are differentially associated with specific measures of the self-estimated effectiveness of an individual’s emotion perception and regulation in the intra-personal domain. (JINS, 2014, 20, 1–9)
12

Papousek, Ilona, Elisabeth M. Weiss, Eva M. Reiser, Günter Schulter, Heribert Harald Freudenthaler, and Helmut K. Lackner. "Self-rated social-emotional perception and its neurophysiologic and cardiac correlates while viewing a film showing the suffering of other people." International Journal of Psychological Research 6 (October 30, 2013): 42–45. http://dx.doi.org/10.21500/20112084.718.

Abstract:
Using electroencephalographic (EEG) and cardiac measures, the study examined relevant mechanisms that may explain individual differences in self-rated emotion perception (i.e., the propensity to perceive the emotional states of other persons in everyday life). Healthy women (n = 122) were confronted with film scenes showing the suffering of other people. Functional coupling between prefrontal and posterior cortices, measured by EEG coherences, decreased more strongly in individuals higher on emotion perception. This finding suggests that the propensity to loosen prefrontal inhibitory control over posterior cortical areas involved in basic processes of emotion perception is associated with higher susceptibility to social-emotional information and, therefore, with higher scores on self-rated emotion perception. In addition, higher self-rated perception of other persons' emotions was related to more pronounced cardiac responses to the observation of horrifying events occurring to people in the film, indicating enhanced attention and heightened perceptual processing.
13

Dmitrieva, E. S., M. N. Anderson, and V. Ya Gelman. "A comparative study of visual and auditory perception of emotions in children of primary school age." Experimental Psychology (Russia) 9, no. 1 (2016): 38–52. http://dx.doi.org/10.17759/exppsy.2016090104.

Abstract:
We investigated the characteristics of perception of nonverbal emotional information in two modalities of presentation, visual and auditory, in 32 schoolchildren aged 8–9 years. We studied the recognition of four basic emotions: «happiness», «sadness», «anger», «fear» in facial expressions and in tone of voice. We found that identification of emotion was more accurate in visual perception. The lack of correlation between the results of emotion recognition in the two modalities indicates that the processes of emotion recognition in the visual and auditory modalities are independent. The observed unevenness in the formation of emotion perception mechanisms, which followed the same hierarchy in both modalities, appears to be largely determined by external factors.
14

Pohl, Anna, Sebastian Dummel, Mascha Bothur, and Alexander L. Gerlach. "Interoceptive accuracy does not predict emotion perception in daily life." Open Psychology 4, no. 1 (January 1, 2022): 175–86. http://dx.doi.org/10.1515/psych-2022-0009.

Abstract:
Peripheral emotion theories suggest a crucial role of interoception for emotion perception, which in turn facilitates emotion regulation. Laboratory studies have found positive relations between interoceptive accuracy and perceived emotion intensity and arousal. Studies in natural settings are largely missing, but seem important given the diversity of emotional experience and regulation. One hundred seven participants underwent a cardiovascular interoceptive accuracy task. Afterwards, participants provided detailed information on perceived emotions and emotion regulation strategies in an ecological momentary assessment (EMA). Multilevel models were calculated. Taking valence into account, emotion intensity, arousal, intensity of body sensations, and emotion regulation success were modeled as a function of centered interoceptive accuracy. Interoceptive accuracy did not predict any emotion perception criterion. Lower accuracy was related to a slightly stronger decrease in perceived arousal after regulation. Differences in emotion categories, intensity, and sample collection might explain divergences from laboratory studies.
15

Cavieres, Alvaro, Rocío Maldonado, Amy Bland, and Rebecca Elliott. "Relationship Between Gender and Performance on Emotion Perception Tasks in a Latino Population." International Journal of Psychological Research 14, no. 1 (April 30, 2021): 106–14. http://dx.doi.org/10.21500/20112084.5032.

Abstract:
Basic emotions are universally recognized, although differences across cultures and between genders have been described. We report results from two emotion recognition tasks in a sample of healthy adults from Chile. Methods: 192 volunteers (mean 31.58 years, s.d. 8.36; 106 women) completed the Emotional Recognition Task, in which they were asked to identify a briefly displayed emotion, and the Emotional Intensity Morphing Task, in which they viewed faces with increasing or decreasing emotional intensity and indicated when they either detected or no longer detected the emotion. Results: All emotions were recognized at above-chance levels. The only sex differences were that men performed better than women at identifying anger (p = .0485) and responded more slowly to fear (p = .0057). Discussion: These findings are consistent with some, though not all, prior literature on emotion perception. Crucially, we report data on emotional perception in a healthy adult Latino population for the first time, which contributes to emerging literature on cultural differences in affective processing.
16

Ji, Eunhee, Lisa K. Son, and Min-Shik Kim. "Emotion Perception Rules Abide by Cultural Display Rules." Experimental Psychology 69, no. 2 (March 2022): 83–103. http://dx.doi.org/10.1027/1618-3169/a000550.

Abstract:
The current study compared emotion perception in two cultures where display rules for emotion expression deviate. In Experiment 1, participants from America and Korea played a repeated prisoner’s dilemma game with a counterpart, who was, in actuality, a programmed defector. Emotion expressions were exchanged via emoticons at the end of every round. After winning more points by defecting, the counterpart sent either a matching emoticon (a joyful face) or a mismatching emoticon (a regretful face). The results showed that Americans in the matching condition were more likely to defect, or to punish, compared to those in the mismatching condition, suggesting that more weight was given to their counterpart’s joyful expression. This difference was smaller for Koreans, suggesting a higher disregard for the outward expression. In a second, supplementary experiment, we found that Korean participants were more likely to cooperate in the mismatching or regretful condition, when they thought their counterpart was a Westerner. Overall, our data suggest that emotion perception rules abide by the display rules of one’s culture but are also influenced by the counterpart’s culture.
17

Adolphs, Ralph. "Perception and Emotion." Current Directions in Psychological Science 15, no. 5 (October 2006): 222–26. http://dx.doi.org/10.1111/j.1467-8721.2006.00440.x.

18

Farmer, Eliot, Crescent Jicol, and Karin Petrini. "Musicianship Enhances Perception But Not Feeling of Emotion From Others’ Social Interaction Through Speech Prosody." Music Perception 37, no. 4 (March 11, 2020): 323–38. http://dx.doi.org/10.1525/mp.2020.37.4.323.

Abstract:
Music expertise has been shown to enhance emotion recognition from speech prosody. Yet, it is currently unclear whether music training enhances the recognition of emotions through other communicative modalities such as vision, and whether it enhances the feeling of such emotions. Musicians and nonmusicians were presented with visual, auditory, and audiovisual clips consisting of the biological motion and speech prosody of two agents interacting. Participants judged as quickly as possible whether the expressed emotion was happiness or anger, and subsequently indicated whether they also felt the emotion they had perceived. Measures of accuracy and reaction time were collected from the emotion recognition judgements, while yes/no responses were collected as an indication of felt emotions. Musicians were more accurate than nonmusicians at recognizing emotion in the auditory-only condition, but not in the visual-only or audiovisual conditions. Although music training enhanced recognition of emotion through sound, it did not affect the felt emotion. These findings indicate that emotional processing in music and language may use overlapping but also divergent resources, or that some aspects of emotional processing are less responsive to music training than others. Hence music training may be an effective rehabilitative device for interpreting others’ emotions through speech.
19

Bao, Qiang, Xujuan Zhang, Xijuan Wu, Qiang Zhang, and Jinshou Chen. "Research on Public Environmental Perception of Emotion, Taking Haze as an Example." International Journal of Environmental Research and Public Health 18, no. 22 (November 18, 2021): 12115. http://dx.doi.org/10.3390/ijerph182212115.

Abstract:
Ecological and environmental problems have become increasingly prominent in recent years. Environmental problems represented by haze have become a topic that affects the harmonious ecology of human beings, and the salience of this topic is on the rise. People’s perception of the environment after the impact of haze has also changed. A real-time grasp of the public's dynamic environmental perception of emotion is often an important basis for environmental management departments to effectively solve environmental problems through public opinion. This article focuses on changes in the public's emotional perception caused by foggy and hazy weather. It proposes an environmental emotion perception model, uses Weibo comment data about fog and haze as environmental perception data, and analyzes changes in the public's post-haze environmental perception of emotion across four seasonal time dimensions. The results show that in spring, the public’s environmental perception of emotions is mainly negative at the beginning of the season; in summer, positive emotions become dominant; in autumn, negative emotions increase substantially and dominate; and in winter, the dominant emotions of the public remain negative. This work provides support for research on social emotions and public opinion behavior.
20

Sauter, Disa A. "Is There a Role for Language in Emotion Perception?" Emotion Review 10, no. 2 (January 25, 2018): 111–15. http://dx.doi.org/10.1177/1754073917693924.

Abstract:
What is the relationship between language, emotion concepts, and perceptual categories? Here I compare the strong Whorfian view of linguistic relativity, which argues that language plays a necessary role in the perception of emotions, to the alternative view that different levels of processing (e.g., linguistic, conceptual, perceptual) are relatively independent and thus, that language does not play a foundational role in emotion perception. I examine neuropsychological studies that have tested strong claims of linguistic relativity, and discuss research on categorical perception of emotional expressions, where the two accounts have been directly tested against each other. As in other perceptual domains, there is little evidence that language plays a foundational role in the perception of emotion.
21

Hess, Ursula, and Konstantinos Kafetsios. "Infusing Context Into Emotion Perception Impacts Emotion Decoding Accuracy." Experimental Psychology 68, no. 6 (November 2021): 285–94. http://dx.doi.org/10.1027/1618-3169/a000531.

Abstract:
The accurate decoding of facial emotion expressions lies at the center of many research traditions in psychology. Much of this research, while paying lip service to the importance of context in emotion perception, has used stimuli that were carefully created to be deprived of contextual information. The participants' task is to associate the expression shown in the face with a correct label, essentially changing a social perception task into a cognitive task. In fact, in many cases, the task can be carried out correctly without engaging emotion recognition at all. The present article argues that infusing context in emotion perception does not only add an additional source of information but changes the way that participants approach the task by rendering it a social perception task rather than a cognitive task. Importantly, distinguishing between accuracy (perceiving the intended emotions) and bias (perceiving additional emotions to those intended) leads to a more nuanced understanding of social emotion perception. Results from several studies that use the Assessment of Contextual Emotions demonstrate the significance and social functionality of simultaneously considering emotion decoding accuracy and bias for social interaction in different cultures, their key personality and societal correlates, and their function for close relationships processes.
22

Green, Melissa J., and Gin S. Malhi. "Neural mechanisms of the cognitive control of emotion." Acta Neuropsychiatrica 18, no. 3-4 (June 2006): 144–53. http://dx.doi.org/10.1111/j.1601-5215.2006.00149.x.

Abstract:
Background: Emotion regulation involves the initiation of new emotional responses and continual alteration of current emotions in response to rapidly changing environmental and social stimuli. The capacity to effectively implement emotion regulation strategies is essential for psychological health; impairments in the ability to regulate emotions may be critical to the development of clinical levels of depression, anxiety and mania. Objective: This review provides a summary of findings from current research examining the neural mechanisms of emotion regulation by means of conscious cognitive strategies of reappraisal. These findings are considered in the context of related concepts of emotion perception and emotion generation, with discussion of the likely cognitive neuropsychological contributions to emotion regulation and the implications for psychiatric disorders. Results: Convergent evidence implicates an inhibitory role of prefrontal cortex and cingulate regions upon subcortical and cortical emotion generation systems in the cognitive control of emotional experience. Concurrent modulation of cortical activity by the peripheral nervous system is highlighted by recent studies using simultaneous physiological and neuroimaging techniques. Individual differences in emotion perception, generation of affect and neuropsychological skills are likely to have direct consequences for emotion regulation. Conclusions: Emotion regulation relies on synergy within brain stem, limbic and cortical processes that promote the adaptive perception, generation and regulation of affect. Aberrant emotion processing in any of these stages may disrupt this self-sustaining regulatory system, with the potential to manifest in distinct forms of emotion dysregulation as seen in major psychiatric disorders such as depression, bipolar disorder and schizophrenia.
APA, Harvard, Vancouver, ISO, and other styles
23

Chen, Peiyao, Ashley Chung-Fat-Yim, and Viorica Marian. "Cultural Experience Influences Multisensory Emotion Perception in Bilinguals." Languages 7, no. 1 (January 10, 2022): 12. http://dx.doi.org/10.3390/languages7010012.

Full text
Abstract:
Emotion perception frequently involves the integration of visual and auditory information. During multisensory emotion perception, the attention devoted to each modality can be measured by calculating the difference between trials in which the facial expression and speech input exhibit the same emotion (congruent) and trials in which the facial expression and speech input exhibit different emotions (incongruent) to determine the modality that has the strongest influence. Previous cross-cultural studies have found that individuals from Western cultures are more distracted by information in the visual modality (i.e., visual interference), whereas individuals from Eastern cultures are more distracted by information in the auditory modality (i.e., auditory interference). These results suggest that culture shapes modality interference in multisensory emotion perception. It is unclear, however, how emotion perception is influenced by cultural immersion and exposure due to migration to a new country with distinct social norms. In the present study, we investigated how the amount of daily exposure to a new culture and the length of immersion impact multisensory emotion perception in Chinese-English bilinguals who moved from China to the United States. In an emotion recognition task, participants viewed facial expressions and heard emotional but meaningless speech either from their previous Eastern culture (i.e., Asian face-Mandarin speech) or from their new Western culture (i.e., Caucasian face-English speech) and were asked to identify the emotion from either the face or voice, while ignoring the other modality. Analyses of daily cultural exposure revealed that bilinguals with low daily exposure to the U.S. culture experienced greater interference from the auditory modality, whereas bilinguals with high daily exposure to the U.S. culture experienced greater interference from the visual modality. 
These results demonstrate that everyday exposure to new cultural norms increases the likelihood of showing a modality interference pattern that is more common in the new culture. Analyses of immersion duration revealed that bilinguals who spent more time in the United States were equally distracted by faces and voices, whereas bilinguals who spent less time in the United States experienced greater visual interference when evaluating emotional information from the West, possibly due to over-compensation when evaluating emotional information from the less familiar culture. These findings suggest that the amount of daily exposure to a new culture and length of cultural immersion influence multisensory emotion perception in bilingual immigrants. While increased daily exposure to the new culture aids with the adaptation to new cultural norms, increased length of cultural immersion leads to similar patterns in modality interference between the old and new cultures. We conclude that cultural experience shapes the way we perceive and evaluate the emotions of others.
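As a rough illustration of the congruent-minus-incongruent measure described in this abstract, the sketch below computes a modality-interference score from trial accuracies. All numbers, variable names, and the `interference` function are hypothetical, for illustration only, and are not data from the study.

```python
# Hypothetical sketch: modality interference measured as the accuracy drop
# from congruent trials (face and voice show the same emotion) to
# incongruent trials (face and voice conflict).

def interference(congruent_acc: float, incongruent_acc: float) -> float:
    """Accuracy cost caused by the to-be-ignored modality."""
    return congruent_acc - incongruent_acc

# Example: judging emotion from the FACE while ignoring the voice yields the
# auditory-interference score; judging from the VOICE yields the visual one.
auditory_interference = interference(congruent_acc=0.92, incongruent_acc=0.78)
visual_interference = interference(congruent_acc=0.90, incongruent_acc=0.70)

# The larger score marks the modality that intrudes more on perception.
dominant = "visual" if visual_interference > auditory_interference else "auditory"
```

Under these made-up numbers, the visual modality would be the stronger distractor, the pattern the study associates with high exposure to the U.S. culture.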
APA, Harvard, Vancouver, ISO, and other styles
24

de Boer, Minke J., Deniz Başkent, and Frans W. Cornelissen. "Eyes on Emotion: Dynamic Gaze Allocation During Emotion Perception From Speech-Like Stimuli." Multisensory Research 34, no. 1 (July 7, 2020): 17–47. http://dx.doi.org/10.1163/22134808-bja10029.

Full text
Abstract:
The majority of emotional expressions used in daily communication are multimodal and dynamic in nature. Consequently, one would expect that human observers utilize specific perceptual strategies to process emotions and to handle the multimodal and dynamic nature of emotions. However, our present knowledge on these strategies is scarce, primarily because most studies on emotion perception have not fully covered this variation, and instead used static and/or unimodal stimuli with few emotion categories. To resolve this knowledge gap, the present study examined how dynamic emotional auditory and visual information is integrated into a unified percept. Since there is a broad spectrum of possible forms of integration, both eye movements and accuracy of emotion identification were evaluated while observers performed an emotion identification task in one of three conditions: audio-only, visual-only video, or audiovisual video. In terms of adaptations of perceptual strategies, eye movement results showed a shift in fixations toward the eyes and away from the nose and mouth when audio is added. Notably, in terms of task performance, audio-only performance was mostly significantly worse than video-only and audiovisual performances, but performance in the latter two conditions was often not different. These results suggest that individuals flexibly and momentarily adapt their perceptual strategies to changes in the available information for emotion recognition, and these changes can be comprehensively quantified with eye tracking.
APA, Harvard, Vancouver, ISO, and other styles
25

Dmitrieva, E. S., and V. Ya Gelman. "Perception of Auditory and Visual Emotional Information in Primary School Age Children and its Impact on Their Academic Progress." Психологическая наука и образование 23, no. 5 (2018): 29–39. http://dx.doi.org/10.17759/pse.2018230504.

Full text
Abstract:
This work explored the connection between the characteristics of perception of non-verbal emotional information in two modalities of presentation, visual and auditory, and indicators of school achievement in 32 schoolchildren aged 8-9 years. We studied how the children recognised four basic emotions ("joy", "sadness", "anger", "fear") in facial expressions and intonation of speech. The characteristics of their perceptions were compared with their academic achievements in three school disciplines: Russian language, reading and mathematics. It is shown that there is a clear correlation between the child's school progress and acoustic perception of emotions, while no connection with visual perception was found. It was revealed that the features of the relationship between the effectiveness of perception of emotions and school performance differed in boys and girls and also depended on the specific school subject and the type of emotion. Unlike girls, boys showed an improvement in academic performance when the accuracy of their emotion recognition increased. There was no evidence of a link between successful learning and the preferred type of perception of emotional information (acoustic or visual) in primary school children.
APA, Harvard, Vancouver, ISO, and other styles
26

Zhao, Yingying, Yuhu Chang, Yutian Lu, Yujiang Wang, Mingzhi Dong, Qin Lv, Robert P. Dick, et al. "Do Smart Glasses Dream of Sentimental Visions?" Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, no. 1 (March 29, 2022): 1–29. http://dx.doi.org/10.1145/3517250.

Full text
Abstract:
Emotion recognition in smart eyewear devices is valuable but challenging. One key limitation of previous work is that expression-related information, such as facial or eye images, is considered the only evidence of emotion. However, emotional status is not isolated; it is tightly associated with people's visual perceptions, especially those with emotional implications. Yet little work has examined such associations to better illustrate the causes of emotions. In this paper, we study the emotionship analysis problem in eyewear systems, an ambitious task that requires classifying the user's emotions and semantically understanding their potential causes. To this end, we describe EMOShip, a deep-learning-based eyewear system that can automatically detect the wearer's emotional status and simultaneously analyze its associations with semantic-level visual perception. Experimental studies with 20 participants demonstrate that, thanks to its awareness of emotionship, EMOShip achieves superior emotion recognition accuracy compared to existing methods (80.2% vs. 69.4%) and provides a valuable understanding of the causes of emotions. Pilot studies with 20 additional participants further motivate the potential use of EMOShip to empower emotion-aware applications, such as emotionship self-reflection and emotionship life-logging.
APA, Harvard, Vancouver, ISO, and other styles
27

Werner, S., and G. N. Petrenko. "Speech Emotion Recognition: Humans vs Machines." Discourse 5, no. 5 (December 18, 2019): 136–52. http://dx.doi.org/10.32603/2412-8562-2019-5-5-136-152.

Full text
Abstract:
Introduction. The study focuses on emotional speech perception and speech emotion recognition using prosodic clues alone. Theoretical problems of defining prosody, intonation and emotion, along with the challenges of emotion classification, are discussed. An overview of acoustic and perceptional correlates of emotions found in speech is provided. Technical approaches to speech emotion recognition are also considered in the light of the latest emotional speech automatic classification experiments. Methodology and sources. The typical “big six” classification commonly used in technical applications is chosen and modified to include such emotions as disgust and shame. A database of emotional speech in Russian is created under sound laboratory conditions. A perception experiment is run using Praat software’s experimental environment. Results and discussion. Cross-cultural emotion recognition possibilities are revealed, as the Finnish and international participants recognised about half of the samples correctly. Nonetheless, native speakers of Russian appear to distinguish a larger proportion of emotions correctly. The effects of foreign-language knowledge, musical training and gender on performance in the experiment were insufficiently prominent. The most commonly confused pairs of emotions, such as shame and sadness, surprise and fear, anger and disgust, as well as confusions with neutral emotion, were also given due attention. Conclusion. The work can contribute to psychological studies, clarifying emotion classification and the gender aspect of emotionality; to linguistic research, providing new evidence for prosodic and comparative language studies; and to language technology, deepening the understanding of possible challenges for SER systems.
APA, Harvard, Vancouver, ISO, and other styles
28

Kannan, S., and Shreya N. "Understanding Emoticons: Perception and Usage of Emoticons in WhatsApp." Artha - Journal of Social Sciences 16, no. 3 (July 1, 2017): 49–68. http://dx.doi.org/10.12724/ajss.42.4.

Full text
Abstract:
The use of emoticons in different forms such as text-based, computer-mediated and instant-messaging services is ever increasing. Emoticons, which evolved decades ago, were basically used to represent emotion. With the advent of technology, emoticons have been graphically and digitally transformed, evolving both in how they look and in what they are used for. With a wide array of emoticons at their disposal, people use them to express emotions as well as numerous activities and phrases. The usage of emoticons, their perception and their interpretation vary from person to person. The study reveals that the majority believe that emoticons make conversations more interactive, enhance online message communication and generate the required response from the receiver. The study also showed that a majority of the respondents look at both the eyes and the mouth of emoticons for facial emotion recognition.
APA, Harvard, Vancouver, ISO, and other styles
29

Murry, Matthew W. E., and Derek M. Isaacowitz. "Age differences in emotion perception." International Journal of Behavioral Development 41, no. 5 (September 6, 2016): 597–604. http://dx.doi.org/10.1177/0165025416667493.

Full text
Abstract:
Older adults tend to have lower emotion-perception accuracy compared to younger adults. Previous studies have centered on individual characteristics, including cognitive decline and positive attentional preferences, as possible mechanisms underlying these age differences in emotion perception; however, thus far, no perceiver-focused factor has accounted for the age differences. The present study focuses on perceived social-context factors and uses the Social Input Model as the framework for investigating the relation between the expressivity of the social environment and emotion-perception accuracy in younger and older adults. Younger (n = 32) and older adults (n = 29) reported on the make-up of their social circles and the expressivity of their three closest social partners and then completed a static facial emotion-perception task. Older adults reported greater positive and negative expressivity in their social partners compared to younger adults. Moreover, older adults were marginally less accurate than younger adults when perceiving emotions. Positive expressivity of the social partners predicted lower emotion-perception accuracy in younger but not older adults. Our findings mark the first step to identifying possible characteristics of the social environment that may contribute to the age difference in emotion-perception accuracy.
APA, Harvard, Vancouver, ISO, and other styles
30

Atkinson, Anthony P. "Emotion-specific clues to the neural substrate of empathy." Behavioral and Brain Sciences 25, no. 1 (February 2002): 22–23. http://dx.doi.org/10.1017/s0140525x02240017.

Full text
Abstract:
Research only alluded to by Preston & de Waal (P&deW) indicates the disproportionate involvement of some brain regions in the perception and experience of certain emotions. This suggests that the neural substrate of primitive emotional contagion has some emotion-specific aspects, even if cognitively sophisticated forms of empathy do not. Goals for future research include determining the ways in which empathy is emotion-specific and dependent on overt or covert perception.
APA, Harvard, Vancouver, ISO, and other styles
31

Gao, Yi, Zhiguo Li, and Kashif Khan. "Effect of Cognitive Variables and Emotional Variables on Urban Residents’ Recycled Water Reuse Behavior." Sustainability 11, no. 8 (April 12, 2019): 2208. http://dx.doi.org/10.3390/su11082208.

Full text
Abstract:
Urban residents' perception of recycled water reuse is the foundation for recycled water reuse behavior. However, perception alone does not guarantee that urban residents will use recycled water continuously. In this research, therefore, the authors place cognitive factors and emotional factors within a unified behavioral process. Based on this theoretical framework, the paper interprets the initiation, formation and continuation of urban residents' recycled water reuse behavior. Building on previous studies, this study established a theoretical model of the influence of cognitive and emotional factors on residents' recycled water reuse behavior. Based on data from 325 samples, the direct and indirect relationships between the variables in the model were verified through path analysis and mediation analysis. The empirical results show that: firstly, urban residents' perception of recycled water reuse can activate their emotions toward recycled water, both positive and negative; secondly, although recognition of recycled water can stimulate both positive and negative emotional factors, positive and negative emotions differ greatly in their effects on the initiation, formation and sustainability of reuse behavior. Negative emotion has a certain effect on the initiation of recycled water reuse behavior, but no significant effect on its formation and sustainability. By contrast, positive emotion has no significant effect on the initiation of recycled water reuse behavior, but a significant effect on its formation and sustainability. That is to say, at different stages, recycled water reuse behavior is affected differently by positive and negative emotions. Thirdly, compared with negative emotional variables, positive emotions have a greater impact on individual recycled water reuse behavior. Positive emotional variables significantly mediate the impact of cognitive variables on recycled water reuse habits. In other words, positive emotions play a vital role in the sustainability of recycled water reuse.
APA, Harvard, Vancouver, ISO, and other styles
32

Kirandziska, Vesna, and Nevena Ackovska. "Comparison of emotion evaluation perception using human voice signals of robots and humans." International Journal of Business & Technology 2, no. 2 (May 2014): 13–17. http://dx.doi.org/10.33107/ijbte.2014.2.2.03.

Full text
Abstract:
Emotion perception is the process of perceiving other people's emotions, based on their facial expressions, movement, voice and other biosignals they emit. Emotion evaluation is one characteristic of emotions. One research area in robotics is giving robots human-like behavior, and some robots built today can even perceive emotions. In this paper, a custom-built emotion-aware robot that perceives emotion evaluation is used to investigate the similarities and differences between the robot's and humans' emotion perception. Voice signals from real humans were recorded, and the emotion evaluation was obtained both from our robot and from a set of human evaluators. This paper presents the results of the experiments performed. The experimental results show the difficulty of the emotion evaluation perception problem in general. The significance of human voice signals in emotion evaluation is also investigated.
APA, Harvard, Vancouver, ISO, and other styles
33

VAN DE VELDE, Daan J., Niels O. SCHILLER, Claartje C. LEVELT, Vincent J. VAN HEUVEN, Mieke BEERS, Jeroen J. BRIAIRE, and Johan H. M. FRIJNS. "Prosody perception and production by children with cochlear implants." Journal of Child Language 46, no. 1 (October 18, 2018): 111–41. http://dx.doi.org/10.1017/s0305000918000387.

Full text
Abstract:
The perception and production of emotional and linguistic (focus) prosody were compared in children with cochlear implants (CI) and normally hearing (NH) peers. Thirteen CI and thirteen hearing-age-matched school-aged NH children were tested, as baseline, on non-verbal emotion understanding, non-word repetition, and stimulus identification and naming. Main tests were verbal emotion discrimination, verbal focus position discrimination, acted emotion production, and focus production. Productions were evaluated by NH adult Dutch listeners. All scores between groups were comparable, except a lower score for the CI group for non-word repetition. Emotional prosody perception and production scores correlated weakly for CI children but were uncorrelated for NH children. In general, hearing age weakly predicted emotion production but not perception. Non-verbal emotional (but not linguistic) understanding predicted CI children's (but not controls’) emotion perception and production. In conclusion, increasing time in sound might facilitate vocal emotional expression, possibly requiring independently maturing emotion perception skills.
APA, Harvard, Vancouver, ISO, and other styles
34

Xue, Jing. "EEG Analysis with Wavelet Transform under Music Perception Stimulation." Journal of Healthcare Engineering 2021 (December 15, 2021): 1–7. http://dx.doi.org/10.1155/2021/9725762.

Full text
Abstract:
In order to improve the classification accuracy and reliability of emotional state assessment and to provide support for music therapy, this paper proposes an EEG analysis method based on the wavelet transform under music perception stimulation. Using data from the multichannel standard emotion database (DEAP), α, β, and θ rhythms are extracted from frontal (F3 and F4), temporal (T7 and T8), and central (C3 and C4) channels with the wavelet transform. EMD is performed on the extracted EEG rhythms to obtain intrinsic mode function (IMF) components, and then the average energy and amplitude-difference eigenvalues of the IMF components of the EEG rhythm waves are further extracted; that is, each rhythm wave contributes three average energy features and two amplitude-difference eigenvalues, so as to fully extract EEG feature information. Finally, emotional state evaluation is realized with a support vector machine classifier. The results show that the correct rate among no emotion, positive emotion, and negative emotion can reach more than 90%. Among the pairwise classification problems for the four selected emotions, the classification accuracy obtained by this EEG feature extraction method is higher than that obtained by general feature extraction methods, reaching about 70%. Changes in EEG α-wave power were closely correlated with the polarity and intensity of emotion; α-wave power varied significantly between “happiness and fear,” “pleasure and fear,” and “fear and sadness.” The method has good application prospects in both psychological and physiological research on emotion perception and in practical application.
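The wavelet-energy-features-into-classifier pipeline this abstract describes can be sketched in miniature. The sketch below is purely illustrative and is not the paper's method: it substitutes a hand-rolled Haar wavelet for the unspecified wavelet family, omits the EMD/IMF step, replaces the SVM with a nearest-centroid classifier to stay dependency-free, and runs on synthetic signals rather than DEAP data.

```python
# Illustrative sketch (assumptions noted in the lead-in): multi-level wavelet
# band energies as features for a simple two-class "emotion" classifier.
import numpy as np

SQRT2 = np.sqrt(2.0)

def haar_wavedec(x, level=5):
    """Multi-level Haar DWT; returns [approx, detail_L, ..., detail_1]."""
    coeffs, a = [], np.asarray(x, dtype=float)
    for _ in range(level):
        coeffs.append((a[0::2] - a[1::2]) / SQRT2)  # detail at this scale
        a = (a[0::2] + a[1::2]) / SQRT2             # approximation carries on
    coeffs.append(a)
    return coeffs[::-1]

def band_energy_features(signal, level=5):
    """Mean energy of each decomposition band, as in the abstract's idea."""
    return np.array([np.mean(c ** 2) for c in haar_wavedec(signal, level)])

rng = np.random.default_rng(0)

def make_epoch(freq, n=256):
    """Synthetic 1 s 'EEG' epoch: a sinusoid plus noise (not real data)."""
    t = np.arange(n) / n
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(n)

# Two classes dominated by alpha-like (10 Hz) vs beta-like (25 Hz) activity.
X = np.array([band_energy_features(make_epoch(f)) for f in [10] * 40 + [25] * 40])
y = np.array([0] * 40 + [1] * 40)

# Nearest-centroid stand-in for the SVM; train on even, test on odd indices.
c0 = X[::2][y[::2] == 0].mean(axis=0)
c1 = X[::2][y[::2] == 1].mean(axis=0)
pred = np.array([0 if np.linalg.norm(f - c0) < np.linalg.norm(f - c1) else 1
                 for f in X[1::2]])
acc = (pred == y[1::2]).mean()
```

Because the two synthetic rhythms concentrate energy in different decomposition bands, even this stripped-down classifier separates them cleanly; the paper's EMD features and SVM refine the same basic idea.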
APA, Harvard, Vancouver, ISO, and other styles
35

Joseph, Philip L. A., David A. Sturgeon, and Julian Leff. "The Perception of Emotion by Schizophrenic Patients." British Journal of Psychiatry 161, no. 5 (November 1992): 603–9. http://dx.doi.org/10.1192/bjp.161.5.603.

Full text
Abstract:
Studies which have examined the perception of emotion by schizophrenic patients have produced conflicting results, an outcome which may, in part, be due to difficulties in presenting a realistic portrayal of emotion. This study exposed 32 schizophrenic patients in remission and ten controls to five videotaped scenes of emotional situations played by actors. The schizophrenic patients were divided into three groups, namely those living with high-EE relatives, those living with low-EE relatives and those living alone, in order to test the hypothesis that patients in a high-EE environment are less able to identify emotionally charged situations. Measures of electrodermal activity and self-ratings of tension were recorded concomitantly. The schizophrenic patients in all groups were as adept at identifying emotions as were the controls. There was no difference between the groups in electrodermal activity and subjective tension for all video scenes, except for the one which portrayed the only pleasant interaction; the high-EE group was significantly more aroused on both measures, which were independent of each other.
APA, Harvard, Vancouver, ISO, and other styles
36

Achiche, Sofiane, and Saeema Ahmed-Kristensen. "Genetic fuzzy modeling of user perception of three-dimensional shapes." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 25, no. 1 (February 2011): 93–107. http://dx.doi.org/10.1017/s0890060410000466.

Full text
Abstract:
Defining the aesthetic and emotional value of a product is an important consideration for its design. Furthermore, if several designers are faced with the task of creating an object that conveys a certain emotion/perception (aggressive, soft, heavy, etc.), each is most likely to interpret the emotion/perception with different shapes composed of a set of different geometric features. The authors propose an automatic approach, using fuzzy logic, to formalize the relationships between the geometric information of three-dimensional objects and the intended emotional content. In addition, the automatically generated fuzzy knowledge base was compared to the users' perceptions and to the manually constructed fuzzy knowledge base. The initial findings indicate that the approach is valid for formalizing geometric information with perceptions, and they validate the authors' manually developed fuzzy models.
APA, Harvard, Vancouver, ISO, and other styles
37

Deonna, Julien A. "Emotion, Perception and Perspective." dialectica 60, no. 1 (March 2006): 29–46. http://dx.doi.org/10.1111/j.1746-8361.2005.01031.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Niedenthal, Paula M., and Marc B. Setterlund. "Emotion Congruence in Perception." Personality and Social Psychology Bulletin 20, no. 4 (August 1994): 401–11. http://dx.doi.org/10.1177/0146167294204007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Tanaka, Akihiro. "3. Crossmodal Emotion Perception." Journal of The Institute of Image Information and Television Engineers 72, no. 1 (2018): 12–16. http://dx.doi.org/10.3169/itej.72.12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Barrett, Lisa Feldman, Batja Mesquita, and Maria Gendron. "Context in Emotion Perception." Current Directions in Psychological Science 20, no. 5 (October 2011): 286–90. http://dx.doi.org/10.1177/0963721411422522.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Phillips, Mary L., Wayne C. Drevets, Scott L. Rauch, and Richard Lane. "Neurobiology of emotion perception I: the neural basis of normal emotion perception." Biological Psychiatry 54, no. 5 (September 2003): 504–14. http://dx.doi.org/10.1016/s0006-3223(03)00168-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Kashyap, Naveen. "Is Emotion Perception Relative? Evaluating Sleep Effects on Relativity of Emotion Perception." Psychological Studies 59, no. 3 (September 2014): 284–88. http://dx.doi.org/10.1007/s12646-014-0275-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Kaiser, Ramona, and Peter E. Keller. "Music's Impact on the Visual Perception of Emotional Dyadic Interactions." Musicae Scientiae 15, no. 2 (July 2011): 270–87. http://dx.doi.org/10.1177/102986491101500208.

Full text
Abstract:
Research showing that emotions can be recognized in point-light displays of human dyadic interactions was extended in the current study by investigating the impact of music on the perception of normal and exaggerated expressions of happiness, contentedness, sadness and anger in such visual stimuli. Sixteen musically untrained participants viewed short video clips of these emotion portrayals, each presented with emotionally compatible (e.g., happy music accompanies a happy interaction) and emotionally incompatible piano music (e.g., sad music accompanies a happy interaction). It was hypothesized that music will increase the accuracy of emotion judgements in displays where auditory and visual information is compatible relative to displays with incompatible audio-visual information. A two-dimensional emotion space was used to record participants' judgements of emotion in only the visual stimuli. Results indicated that music affected the accuracy of emotion judgements. Happiness and sadness were perceived more accurately in compatible than in incompatible conditions, while the opposite was the case for contentedness. Anger was perceived accurately in all conditions. Exaggerated expressions of sadness, which were evaluated more accurately than normal expressions of sadness, were also found to be resistant to the music. These findings can be interpreted in the light of previous research demonstrating that music's cross-modal impact depends on the degree of emotional ambiguity in the visual display. More generally, the results demonstrate that the perception of emotions in biological motion can be affected by music.
APA, Harvard, Vancouver, ISO, and other styles
44

Kaiser, Ramona, and Peter E. Keller. "Music’s impact on the visual perception of emotional dyadic interactions." Musicae Scientiae 15, no. 2 (July 2011): 270–87. http://dx.doi.org/10.1177/1029864911401173.

Full text
Abstract:
Research showing that emotions can be recognized in point-light displays of human dyadic interactions was extended in the current study by investigating the impact of music on the perception of normal and exaggerated expressions of happiness, contentedness, sadness and anger in such visual stimuli. Sixteen musically untrained participants viewed short video clips of these emotion portrayals, each presented with emotionally compatible (e.g., happy music accompanies a happy interaction) and emotionally incompatible piano music (e.g., sad music accompanies a happy interaction). It was hypothesized that music will increase the accuracy of emotion judgements in displays where auditory and visual information is compatible relative to displays with incompatible audio-visual information. A two-dimensional emotion space was used to record participants’ judgements of emotion in only the visual stimuli. Results indicated that music affected the accuracy of emotion judgements. Happiness and sadness were perceived more accurately in compatible than in incompatible conditions, while the opposite was the case for contentedness. Anger was perceived accurately in all conditions. Exaggerated expressions of sadness, which were evaluated more accurately than normal expressions of sadness, were also found to be resistant to the music. These findings can be interpreted in the light of previous research demonstrating that music’s cross-modal impact depends on the degree of emotional ambiguity in the visual display. More generally, the results demonstrate that the perception of emotions in biological motion can be affected by music.
APA, Harvard, Vancouver, ISO, and other styles
45

Baumgartner, Thomas, Michaela Esslen, and Lutz Jäncke. "From emotion perception to emotion experience: Emotions evoked by pictures and classical music." International Journal of Psychophysiology 60, no. 1 (April 2006): 34–43. http://dx.doi.org/10.1016/j.ijpsycho.2005.04.007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Banerjee, Abhishek, Uttaran Bhattacharya, and Aniket Bera. "Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 1 (June 28, 2022): 3–10. http://dx.doi.org/10.1609/aaai.v36i1.19873.

Full text
Abstract:
We present a novel generalized zero-shot algorithm to recognize perceived emotions from gestures. Our task is to map gestures to novel emotion categories not encountered in training. We introduce an adversarial autoencoder-based representation learning that correlates 3D motion-captured gesture sequences with the vectorized representation of the natural-language perceived emotion terms using word2vec embeddings. The language-semantic embedding provides a representation of the emotion label space, and we leverage this underlying distribution to map the gesture sequences to the appropriate categorical emotion labels. We train our method using a combination of gestures annotated with known emotion terms and gestures not annotated with any emotions. We evaluate our method on the MPI Emotional Body Expressions Database (EBEDB) and obtain an accuracy of 58.43%. We see an improvement in performance compared to current state-of-the-art algorithms for generalized zero-shot learning by an absolute 25-27%. We also demonstrate our approach on publicly available online videos and movie scenes, where the actors' poses have been extracted and mapped to their respective emotive states.
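The core zero-shot step this abstract describes, scoring a gesture's predicted embedding against word-vector representations of emotion terms and picking the nearest, can be sketched with a toy label space. The 3-d "word vectors" and all values below are invented for illustration; the paper uses real word2vec embeddings and learns the gesture-to-embedding mapping with an adversarial autoencoder, which is not reproduced here.

```python
# Toy sketch of nearest-label classification in a shared embedding space.
# The vectors are hypothetical stand-ins for word2vec emotion-term embeddings.
import numpy as np

emotion_vecs = {
    "happy":   np.array([0.9, 0.1, 0.0]),
    "sad":     np.array([-0.8, 0.2, 0.1]),
    "angry":   np.array([0.1, -0.9, 0.3]),
    "relaxed": np.array([0.2, 0.8, -0.1]),   # an "unseen" category at train time
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(gesture_embedding, labels=emotion_vecs):
    """Zero-shot step: return the emotion term nearest in embedding space."""
    return max(labels, key=lambda w: cosine(gesture_embedding, labels[w]))

# A gesture whose learned embedding happens to land near "relaxed":
pred = classify(np.array([0.25, 0.75, -0.05]))
```

Because unseen categories only need a word vector, not training gestures, the same `classify` call covers novel emotion terms; that is the essence of the generalized zero-shot setup.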
APA, Harvard, Vancouver, ISO, and other styles
47

Silva, I. Carneiro, A. Gouveia, G. Dalagna, J. M. Oliveira, P. Carvalho, R. Costa, and J. Gama. "Music and emotion." European Psychiatry 64, S1 (April 2021): S671—S672. http://dx.doi.org/10.1192/j.eurpsy.2021.2018.

Full text
Abstract:
Introduction. Music has been called the language of emotion, and research confirms a link between musical structure and the emotions it triggers. Objectives. To assess the relationship between selected music excerpts and the emotions they trigger, so that the former can be used in future research. Methods. An anonymous study was performed in April 2019 on 65 subjects of both sexes, aged 19-33 (mean = 21.09; SD = 3.05). Subjects listened to 4 music excerpts, believed to be related either to excitement or to calmness, and answered a questionnaire on the emotions triggered by each exposure. Results. For the music excerpts believed to induce excitement, 80% of the subjects reported exciting emotions, 78% enjoyed the music, and 78% did not know it. For those believed to induce calmness, 69% of the subjects reported calm emotions, 84% enjoyed the music, and 62% did not know it. For one excerpt related to calmness, we observed an association between knowing the music and the emotion triggered (p = 0.027). The triggered emotion responses were independent of liking the music (p > 0.05). Conclusions. In our study, independent of liking the music, the participants reported perceiving the expected emotions triggered by the musical excerpts, showing this to be a phenomenon related to musical structure. The perception of calmness may also be influenced by previous knowledge of the music and related experiences. The role of individual perceptions will be examined in subsequent studies. Disclosure. No significant relationships.
48

BOZIKAS, VASILIS P., MARY H. KOSMIDIS, MARIA GIANNAKOU, MIHALIS SAITIS, KOSTAS FOKAS, and GEORGE GARYFALLOS. "Emotion perception in obsessive–compulsive disorder." Journal of the International Neuropsychological Society 15, no. 1 (January 2009): 148–53. http://dx.doi.org/10.1017/s1355617708090097.

Abstract:
The purpose of the present study was to investigate the ability to perceive facial and vocal affect in a group of patients with obsessive–compulsive disorder (OCD) and to explore the specific emotions that might be troublesome for them. Participants were 25 patients with OCD and 25 healthy controls, matched for age, education, and gender. They were assessed with computerized tests of affect perception using visual faces [Kinney's Affect Matching Test (KAMT)], visual everyday scenarios [Fantie's Cartoon Test (FCT)], and prosody [Affective Prosody Test (APT)], as well as a facial recognition test [Kinney's Identity Matching Test (KIMT)]. Severity of OCD symptoms in the patient group was measured with the Yale–Brown Obsessive Compulsive Scale. Patients with OCD were not impaired in the perception of emotion, in either the visual [still photographs (KAMT) or sketches of everyday scenarios (FCT)] or the vocal (APT) modality, as compared with age-, sex-, and education-matched healthy individuals. Moreover, patients with OCD did not differ from healthy individuals in discriminating facial identity (KIMT). With regard to each emotion type separately, patients performed as well as healthy individuals in all the emotions examined. Emotion processing of both facial expressions and prosody does not appear to be deficient in patients with OCD (JINS, 2009, 15, 148–153).
49

Martin, Rod A., Glen E. Berry, Tobi Dobranski, Marilyn Horne, and Philip G. Dodgson. "Emotion Perception Threshold: Individual Differences in Emotional Sensitivity." Journal of Research in Personality 30, no. 2 (June 1996): 290–305. http://dx.doi.org/10.1006/jrpe.1996.0019.

50

Yamamoto, Hisako, Misako Kawahara, Mariska Kret, and Akihiro Tanaka. "Cultural Differences in Emoticon Perception: Japanese See the Eyes and Dutch the Mouth of Emoticons." Letters on Evolutionary Behavioral Science 11, no. 2 (December 15, 2020): 40–45. http://dx.doi.org/10.5178/lebs.2020.80.

Abstract:
This study investigated cultural differences in the perception of emoticons between Japanese and Dutch participants. We manipulated the eyes and mouth of emoticons independently and asked participants to evaluate the emotion of each emoticon. The results show that Japanese participants tended to focus on the emotion expressed with the eyes while Dutch participants put weight on the shape of the mouth when evaluating emoticons. This tendency is consistent with a previous cross-cultural study comparing people from Japan and the United States (Yuki et al., 2007).