
Journal articles on the topic "Neutral emotion"

Consult the top 50 journal articles for research on the topic "Neutral emotion".


You can also download the full text of a publication in .pdf format and read its abstract online, where these are available in the metadata.

Browse journal articles from many disciplines and compile your bibliography correctly.

1

Brito, Pedro Quelhas, Sandra Torres, and Jéssica Fernandes. "What kind of emotions do emoticons communicate?" Asia Pacific Journal of Marketing and Logistics 32, no. 7 (December 10, 2019): 1495–517. http://dx.doi.org/10.1108/apjml-03-2019-0136.

Abstract:
Purpose The purpose of this paper is to study the nature and concept of emoticons/emojis. Instead of taking for granted that these user-generated formats are necessarily emotional, we empirically assessed to what extent they are, and the specificity of each one. Drawing on congruent mood state, valence core and emotion appraisal theories, we expected a compatible statistical association between positive/negative/neutral emotional valence expressions and emoticons of similar valence. The positive emoticons were consistently associated with positive valence posts. Added to that analysis, 21 emotional categories were identified in posts and correlated with eight emoticons. Design/methodology/approach Two studies were used to address this question. The first study defined the emoticon concept and interpreted their meaning, highlighting their communication goals and anticipated effects. The link between emojis and emoticons was also obtained. Some emoticon types present more ambiguity than others. In the second study, three years of real and private (Facebook) posts from 82 adolescents were content analyzed and coded. Findings Only the neutral emoticons always matched neutral emotional categories found in the written interaction. Although the emoticon valence and emotional category congruence pattern was the rule, we also detected combinations of emoticon types and emotion category expressions of differing valence. Apparently, the connection between emoticons and emotions is not as straightforward as the literature used to assume. The objects created to communicate emotions (emoticons) follow their own logic of correspondence with the emotional tone of the message. Originality/value Theoretically, we discussed the emotional content of emoticons/emojis. Although this kind of signal has an Asian origin and was later borrowed by Western countries, their ambiguity and differing specificity have never been analyzed.
2

Shahin, Ismail. "Employing Emotion Cues to Verify Speakers in Emotional Talking Environments." Journal of Intelligent Systems 25, no. 1 (January 1, 2016): 3–17. http://dx.doi.org/10.1515/jisys-2014-0118.

Abstract:
Usually, people talk neutrally in environments where there are no abnormal talking conditions such as stress and emotion. Other emotional conditions that might affect people's talking tone include happiness, anger, and sadness. Such emotions are directly affected by the patient's health status. In neutral talking environments, speakers can be easily verified; however, in emotional talking environments, speakers cannot be verified as easily as in neutral talking ones. Consequently, speaker verification systems do not perform as well in emotional talking environments as they do in neutral talking environments. In this work, a two-stage approach has been employed and evaluated to improve speaker verification performance in emotional talking environments. This approach employs the speaker's emotion cues (a text-independent and emotion-dependent speaker verification problem) based on both hidden Markov models (HMMs) and suprasegmental HMMs as classifiers. The approach is composed of two cascaded stages that combine and integrate an emotion recognizer and a speaker recognizer into one recognizer. The architecture has been tested on two different and separate emotional speech databases: our collected database and the Emotional Prosody Speech and Transcripts database. The results of this work show that the proposed approach gives promising results with a significant improvement over previous studies and other approaches such as the emotion-independent speaker verification approach and the emotion-dependent speaker verification approach based completely on HMMs.
3

Shu, Lin, Yang Yu, Wenzhuo Chen, Haoqiang Hua, Qin Li, Jianxiu Jin, and Xiangmin Xu. "Wearable Emotion Recognition Using Heart Rate Data from a Smart Bracelet." Sensors 20, no. 3 (January 28, 2020): 718. http://dx.doi.org/10.3390/s20030718.

Abstract:
Emotion recognition and monitoring based on commonly used wearable devices can play an important role in psychological health monitoring and human-computer interaction. However, existing methods cannot rely on common smart bracelets or watches for emotion monitoring in daily life. To address this issue, our study proposes a method for emotion recognition using heart rate data from a wearable smart bracelet. A 'neutral + target' pair emotion stimulation experimental paradigm was presented, and a heart-rate dataset from 25 subjects was established, in which neutral plus target emotion (neutral, happy, and sad) stimulation video pairs from China's standard Emotional Video Stimuli materials (CEVS) were shown to the recruited subjects. Features from the target-emotion data were normalized by the baseline data of the neutral mood. The emotion recognition experiments confirmed the effectiveness of the 'neutral + target' video-pair stimulation paradigm, of the baseline setting using neutral-mood data, and of the normalized features, as well as of the AdaBoost and GBDT classifiers on this dataset. This method will promote the development of wearable consumer electronic devices for monitoring human emotional moods.
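
To make the baseline idea concrete, here is a minimal sketch (not the authors' code; the data, feature set, and parameters are invented stand-ins) of normalizing target-emotion heart-rate features against the neutral baseline and cross-validating the AdaBoost and GBDT classifiers named in the abstract:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

def hr_features(hr):
    """Basic statistics of a heart-rate series (beats per minute)."""
    return np.array([hr.mean(), hr.std(), hr.min(), hr.max()])

def normalized_features(target_hr, neutral_hr):
    """Normalize target-segment features by the neutral-baseline segment."""
    return hr_features(target_hr) - hr_features(neutral_hr)

rng = np.random.default_rng(0)
X, y = [], []
for subject in range(25):                    # 25 subjects, as in the study
    neutral = 70 + rng.normal(0, 2, 60)      # 60 s of baseline heart rate
    for label, shift in [(0, 0.0), (1, 3.0), (2, -2.0)]:  # neutral/happy/sad
        target = neutral + shift + rng.normal(0, 1, 60)
        X.append(normalized_features(target, neutral))
        y.append(label)
X, y = np.array(X), np.array(y)

for clf in (AdaBoostClassifier(), GradientBoostingClassifier()):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())
```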
4

Dyck, M., U. Habel, J. Slodczyk, J. Schlummer, V. Backes, F. Schneider, and M. Reske. "Negative bias in fast emotion discrimination in borderline personality disorder." Psychological Medicine 39, no. 5 (August 28, 2008): 855–64. http://dx.doi.org/10.1017/s0033291708004273.

Abstract:
Background: The ability to decode emotional information from facial expressions is crucial for successful social interaction. Borderline personality disorder (BPD) is characterized by serious problems in interpersonal relationships and emotional functioning. Empirical research on facial emotion recognition in BPD has been sparse and results are inconsistent. To specify emotion recognition deficits in BPD more closely, the present study implemented two emotion recognition tasks differing in response format. Method: Nineteen patients with BPD and 19 healthy subjects were asked to evaluate the emotional content of visually presented stimuli (emotional and neutral faces). The first task, the Fear Anger Neutral (FAN) Test, required a rapid discrimination between negative or neutral facial expressions, whereas in the second task, the Emotion Recognition (ER) Test, a precise decision regarding default emotions (sadness, happiness, anger, fear and neutral) had to be achieved without a time limit. Results: In comparison to healthy subjects, BPD patients showed a deficit in emotion recognition only in the fast discrimination of negative and neutral facial expressions (FAN Test). Consistent with earlier findings, patients demonstrated a negative bias in the evaluation of neutral facial expressions. When processing time was unlimited (ER Test), BPD patients performed as well as healthy subjects in the recognition of specific emotions. In addition, an association between performance in the fast discrimination task (FAN Test) and post-traumatic stress disorder (PTSD) co-morbidity was indicated. Conclusions: Our data suggest a selective deficit of BPD patients in rapid and direct discrimination of negative and neutral emotional expressions that may underlie difficulties in social interactions.
5

Pousson, Jachin Edward, Aleksandras Voicikas, Valdis Bernhofs, Evaldas Pipinis, Lana Burmistrova, Yuan-Pin Lin, and Inga Griškova-Bulanova. "Spectral Characteristics of EEG during Active Emotional Musical Performance." Sensors 21, no. 22 (November 10, 2021): 7466. http://dx.doi.org/10.3390/s21227466.

Abstract:
Research on the neural correlates of intentional emotion communication by music performers is still limited. In this study, we evaluated EEG patterns recorded from musicians who were instructed to perform a simple piano score while manipulating their manner of play to express specific contrasting emotions, and to self-rate the emotion they conveyed on scales of arousal and valence. In the emotional playing task, participants were instructed to improvise variations in a manner that communicated the targeted emotion. By contrast, in the neutral playing task, participants were asked to play the same piece precisely as written, to obtain control data on the general patterns of motor and sensory activation during playing. Spectral analysis of the signal was applied as an initial step so that the findings could be connected to the wider field of music-emotion research. The experimental contrast of emotional vs. neutral playing was employed to probe brain activity patterns differentially involved in distinct emotional states. The tasks of emotional and neutral playing differed considerably with respect to the arousal and valence levels of the intended-to-transfer emotion. EEG activity differences were observed between distressed/excited and neutral/depressed/relaxed playing.
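
As a rough illustration of the spectral analysis step mentioned above (the authors' exact pipeline is not specified in the abstract), the following sketch computes theta, alpha, and beta band power for a single EEG channel with Welch's method; the sampling rate and band limits are assumptions:

```python
import numpy as np
from scipy.signal import welch

fs = 250                                    # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)                # 10 s analysis segment
eeg = np.random.default_rng(1).normal(0, 1, t.size)  # stand-in EEG channel

f, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # power spectral density

def band_power(f, psd, lo, hi):
    """Approximate power in [lo, hi) Hz by summing PSD bins."""
    mask = (f >= lo) & (f < hi)
    return psd[mask].sum() * (f[1] - f[0])

for name, (lo, hi) in {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}.items():
    print(name, band_power(f, psd, lo, hi))
```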
6

Liao, Songyang, Katsuaki Sakata, and Galina V. Paramei. "Color Affects Recognition of Emoticon Expressions." i-Perception 13, no. 1 (January 2022): 204166952210807. http://dx.doi.org/10.1177/20416695221080778.

Abstract:
In computer-mediated communication, emoticons are conventionally rendered in yellow. Previous studies demonstrated that colors evoke certain affective meanings, and face color modulates perceived emotion. We investigated whether color variation affects the recognition of emoticon expressions. Japanese participants were presented with emoticons depicting four basic emotions (Happy, Sad, Angry, Surprised) and a Neutral expression, each rendered in eight colors. Four conditions (E1–E4) were employed in the lab-based experiment; E5, with an additional participant sample, was an online replication of the critical E4. In E1, colored emoticons were categorized in a 5AFC task. In E2–E5, stimulus affective meaning was assessed using visual scales with anchors corresponding to each emotion. The conditions varied in stimulus arrays: E2: light gray emoticons; E3: colored circles; E4 and E5: colored emoticons. The affective meaning of Angry and Sad emoticons was found to be stronger when conferred in warm and cool colors, respectively, a pattern highly consistent between E4 and E5. The affective meaning of colored emoticons could be regressed on that of their achromatic expression counterparts and of decontextualized color. The findings provide evidence that affective congruency between the emoticon expression and the color it is rendered in facilitates recognition of the depicted emotion, augmenting the conveyed emotional message.
7

Zhang, Zhan, Yufei Song, Liqing Cui, Xiaoqian Liu, and Tingshao Zhu. "Emotion recognition based on customized smart bracelet with built-in accelerometer." PeerJ 4 (July 26, 2016): e2258. http://dx.doi.org/10.7717/peerj.2258.

Abstract:
Background: Recently, emotion recognition has become a hot topic in human-computer interaction. If computers could understand human emotions, they could interact better with their users. This paper proposes a novel method to recognize human emotions (neutral, happy, and angry) using a smart bracelet with a built-in accelerometer. Methods: In this study, a total of 123 participants were instructed to wear a customized smart bracelet with a built-in accelerometer that can track and record their movements. First, participants walked for two minutes as normal, which served as walking behavior in the neutral emotion condition. Participants then watched emotional film clips to elicit emotions (happy and angry). The time interval between watching the two clips was more than four hours. After watching the film clips, they walked for one minute, which served as walking behavior in the happy or angry emotion condition. We collected raw data from the bracelet and extracted a set of features from it. Based on these features, we built classification models for the three types of emotions (neutral, happy, and angry). Results and Discussion: For two-category classification, the accuracy reached 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. angry), respectively, while for differentiation among the three types of emotions (neutral, happy, and angry) the accuracy reached 81.2%. Conclusions: Using wearable devices, we found it is possible to recognize human emotions (neutral, happy, and angry) with fair accuracy. The results of this study may be useful for improving the performance of human-computer interaction.
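
A minimal sketch of the pipeline the abstract describes, with invented feature choices and synthetic data standing in for the bracelet recordings:

```python
import numpy as np
from sklearn.svm import SVC

def gait_features(acc):
    """acc: (n_samples, 3) accelerometer array; per-axis level, spread, jerkiness."""
    feats = []
    for axis in range(3):
        x = acc[:, axis]
        feats += [x.mean(), x.std(), np.abs(np.diff(x)).mean()]
    return np.array(feats)

rng = np.random.default_rng(2)
X, y = [], []
for label, scale in [(0, 1.0), (1, 1.2), (2, 1.5)]:   # toy neutral/happy/angry
    for _ in range(30):                                # 30 walking bouts each
        X.append(gait_features(rng.normal(0, scale, (500, 3))))
        y.append(label)

clf = SVC().fit(np.array(X), np.array(y))              # illustrative classifier
print("training accuracy:", clf.score(np.array(X), np.array(y)))
```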
8

Charernboon, Thammanard. "Negative and Neutral Valences of Affective Theory of Mind are More Impaired than Positive Valence in Clinically Stable Schizophrenia Patients." Psychiatry Investigation 17, no. 5 (May 15, 2020): 460–64. http://dx.doi.org/10.30773/pi.2020.0040.

Abstract:
Objective: People with schizophrenia show impairment in social cognition, such as emotion recognition and theory of mind. The current study aimed to compare the ability of clinically stable schizophrenia patients to decode the positive, negative and neutral affective mental states of others with that of education-matched normal controls. Methods: 50 people with schizophrenia and 50 matched controls were compared on the positive, negative and neutral emotional valences of affective theory of mind using the Reading the Mind in the Eyes Test. Results: People with schizophrenia performed worse on negative and neutral emotional valence than normal controls; however, no significant differences in decoding positive valence were found. Conclusion: Our data suggest that performance on affective theory of mind varies with emotion valence; the impairments seem to be specific to negative and neutral emotions, but not positive ones.
9

Horan, W. P., G. Hajcak, J. K. Wynn, and M. F. Green. "Impaired emotion regulation in schizophrenia: evidence from event-related potentials." Psychological Medicine 43, no. 11 (January 28, 2013): 2377–91. http://dx.doi.org/10.1017/s0033291713000019.

Abstract:
Background: Although several aspects of emotion seem to be intact in schizophrenia, there is emerging evidence that patients show an impaired ability to adaptively regulate their emotions. This event-related potential (ERP) study examined whether schizophrenia is associated with impaired neural responses to appraisal frames, that is, when negative stimuli are presented in a less negative context. Method: Thirty-one schizophrenia out-patients and 27 healthy controls completed a validated picture-viewing task with three conditions: (1) neutral pictures preceded by neutral descriptions ('Neutral'), (2) unpleasant pictures preceded by negative descriptions ('Preappraised negative'), and (3) unpleasant pictures preceded by more neutral descriptions ('Preappraised neutral'). Analyses focused on the late positive potential (LPP), an index of facilitated attention to emotional stimuli that is reduced following cognitive emotion regulation strategies, during four time windows from 300 to 2000 ms post-picture onset. Results: Replicating prior studies, controls showed a smaller LPP in the Preappraised neutral and Neutral versus Preappraised negative conditions throughout the 300–2000 ms period. By contrast, patients showed (a) a larger LPP in the Preappraised neutral and Preappraised negative versus Neutral conditions in the initial period (300–600 ms) and (b) an atypical pattern of larger LPP to the Preappraised neutral versus Preappraised negative and Neutral conditions in the 600–1500 ms epochs. Conclusions: Modulation of neural responses by a cognitive emotion regulation strategy seems to be impaired in schizophrenia during the first 2 s after exposure to unpleasant stimuli.
10

Grace, Sally A., Wei Lin Toh, Ben Buchanan, David J. Castle, and Susan L. Rossell. "Impaired Recognition of Negative Facial Emotions in Body Dysmorphic Disorder." Journal of the International Neuropsychological Society 25, no. 08 (May 17, 2019): 884–89. http://dx.doi.org/10.1017/s1355617719000419.

Abstract:
Objectives: Patients with body dysmorphic disorder (BDD) have difficulty in recognising facial emotions, and there is evidence to suggest that there is a specific deficit in identifying negative facial emotions, such as sadness and anger. Methods: This study investigated facial emotion recognition in 19 individuals with BDD compared with 21 healthy control participants who completed a facial emotion recognition task, in which they were asked to identify emotional expressions portrayed in neutral, happy, sad, fearful, or angry faces. Results: Compared to the healthy control participants, the BDD patients were generally less accurate in identifying all facial emotions but showed specific deficits for negative emotions. The BDD group made significantly more errors when identifying neutral, angry, and sad faces than healthy controls; and were significantly slower at identifying neutral, angry, and happy faces. Conclusions: These findings add to previous face-processing literature in BDD, suggesting deficits in identifying negative facial emotions. There are treatment implications as future interventions would do well to target such deficits.
11

Deckert, Matthias, Michaela Schmoeger, Eduard Auff, and Ulrike Willinger. "Subjective emotional arousal: an explorative study on the role of gender, age, intensity, emotion regulation difficulties, depression and anxiety symptoms, and meta-emotion." Psychological Research 84, no. 7 (May 16, 2019): 1857–76. http://dx.doi.org/10.1007/s00426-019-01197-z.

Abstract:
Subjective emotional arousal in typically developing adults was investigated in an explorative study. 177 participants (20–70 years) rated facial expressions and words for self-experienced arousal and perceived intensity, and completed the Difficulties in Emotion Regulation Scale and the Hospital Anxiety and Depression Scale (HADS-D). Exclusion criteria were psychiatric or neurological diseases, or clinically relevant scores on the HADS-D. Arousal regarding faces and words was significantly predicted by emotional clarity. Separate analyses showed the following significant results: arousal regarding faces and arousal regarding words constantly predicted each other; negative faces were predicted by age and intensity; neutral faces by gender and impulse control; positive faces by gender and intensity; negative words by emotional clarity; and neutral words by gender. Males showed higher arousal scores than females regarding neutral faces and neutral words; for the other arousal scores, no explicit group differences were shown. Cluster analysis yielded three distinct emotional-characteristics groups: an "emotional difficulties disposition group" (mainly females; highest emotion regulation difficulties, depression and anxiety scores; by trend highest arousal), a "low emotional awareness group" (exclusively males; lowest awareness regarding currently experienced emotions; by trend intermediate arousal), and a "low emotional difficulties group" (exclusively females; lowest values throughout). No age effect was shown. The results suggest that arousal elicited by facial expressions and words are specialized parts of a greater emotional processing system and that typically developing adults show some kind of stable, modality-unspecific dispositional baseline of emotional arousal. Emotional awareness and clarity, and impulse control, are probably trait aspects of emotion regulation that influence emotional arousal in typically developing adults and can be regarded as aspects of meta-emotion. Different emotional personality styles were shown between as well as within gender groups.
12

De Panfilis, Chiara, Camilla Antonucci, Kevin B. Meehan, Nicole M. Cain, Antonio Soliani, Carlo Marchesi, John F. Clarkin, and Fabio Sambataro. "Facial Emotion Recognition and Social-Cognitive Correlates of Narcissistic Features." Journal of Personality Disorders 33, no. 4 (August 2019): 433–49. http://dx.doi.org/10.1521/pedi_2018_32_350.

Abstract:
Narcissistic personality disorder (NPD) is associated with both seeming indifference and hypersensitivity to social feedback. This study evaluated whether rejection sensitivity and empathic difficulties in NPD are accounted for by altered facial emotion recognition (FER). Two hundred non-clinical individuals self-reported NPD features, rejection sensitivity, and empathy, and performed an FER task assessing the ability to determine the presence or absence of an emotion when viewing neutral and negative facial stimuli presented at varying emotional intensities (25%, 50%, 75%). Those with higher NPD features were faster at accurately recognizing neutral and low-intensity (25%) emotional stimuli. This response pattern mediated the association between NPD features and increased anger about rejection. Thus, individuals with high NPD traits are hypervigilant toward subtle negative emotions and neutral expressions; this may explain their tendency to experience intense angry feelings when facing the possibility that others will not meet their need for acceptance.
13

Tamuri, Kairi. "Fundamental frequency in Estonian emotional read-out speech." Eesti ja soome-ugri keeleteaduse ajakiri. Journal of Estonian and Finno-Ugric Linguistics 6, no. 1 (June 8, 2014): 9–21. http://dx.doi.org/10.12697/jeful.2015.6.1.01.

Abstract:
Fundamental frequency (F0, perceived as pitch) is an important prosodic cue of emotion. The aim of the present study was to find out whether sentence emotion has any detectable influence on the F0 height and range of Estonian read-out speech. The F0 of each vowel in the Estonian read-out sentences was therefore measured, and its median was calculated for three emotions (anger, joy, sadness) and for neutral speech. In addition, the F0 range was measured for emotional and neutral speech, as well as the F0 height at sentence-initial and sentence-final positions. The results revealed that, in the investigated material, F0 was highest for joy and lowest for anger. The F0 range, however, was widest for anger and narrowest for sadness. The differences in F0 height at the beginning versus the end of sentences were not statistically significant, either between pairs of emotions or between emotions and neutral speech.
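
The F0 measurements described here are easy to sketch. The snippet below uses librosa's pYIN tracker (an assumption; the study's own tool is not named in the abstract) to estimate median F0 and F0 range per emotion; the file names are hypothetical:

```python
import numpy as np
import librosa

def f0_stats(wav_path):
    """Median F0 (height) and F0 range, in Hz, over voiced frames."""
    y, sr = librosa.load(wav_path, sr=None)
    f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=75, fmax=400, sr=sr)
    f0 = f0[~np.isnan(f0)]                  # keep voiced frames only
    return np.median(f0), f0.max() - f0.min()

for emotion in ("anger", "joy", "sadness", "neutral"):
    med, span = f0_stats(f"sentences_{emotion}.wav")   # hypothetical files
    print(f"{emotion}: median F0 {med:.1f} Hz, range {span:.1f} Hz")
```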
14

Evers, Kris, Inneke Kerkhof, Jean Steyaert, Ilse Noens, and Johan Wagemans. "No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces." Autism Research and Treatment 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/345878.

Abstract:
Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.
15

Vos, Silke, Olivier Collignon, and Bart Boets. "The Sound of Emotion: Pinpointing Emotional Voice Processing Via Frequency Tagging EEG." Brain Sciences 13, no. 2 (January 18, 2023): 162. http://dx.doi.org/10.3390/brainsci13020162.

Abstract:
Successfully engaging in social communication requires efficient processing of subtle socio-communicative cues. Voices convey a wealth of social information, such as the gender, identity, and emotional state of the speaker. We tested whether the brain can systematically and automatically differentiate and track a periodic stream of emotional utterances among a series of neutral vocal utterances. We recorded frequency-tagged EEG responses of 20 neurotypical male adults while presenting streams of neutral utterances at a 4 Hz base rate, interleaved with emotional utterances every third stimulus, hence at a 1.333 Hz oddball frequency. Four emotions (happy, sad, angry, and fear) were presented as different conditions in different streams. To control for the impact of low-level acoustic cues, we maximized variability among the stimuli and included a control condition with scrambled utterances. This scrambling preserves low-level acoustic characteristics but ensures that the emotional character is no longer recognizable. Results revealed significant oddball EEG responses for all conditions, indicating that every emotion category can be discriminated from the neutral stimuli, and every emotional oddball response was significantly higher than the response for the scrambled utterances. These findings demonstrate that emotion discrimination is fast, automatic, and not merely driven by low-level perceptual features. Finally, we present a new database for vocal emotion research with short emotional utterances (EVID), together with an innovative frequency-tagging EEG paradigm for implicit vocal emotion discrimination.
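
The frequency-tagging logic is straightforward to illustrate: if the brain tracks the emotional oddballs, a spectral peak should appear at the oddball rate. A minimal sketch with synthetic data and assumed parameters:

```python
import numpy as np

fs, dur = 512, 60                           # sampling rate (Hz), duration (s); assumed
t = np.arange(0, dur, 1 / fs)
oddball_f = 4 / 3                           # 1.333... Hz oddball rate from the abstract
# Stand-in EEG: noise plus a small response locked to the oddball rate
eeg = (np.random.default_rng(3).normal(0, 1, t.size)
       + 0.2 * np.sin(2 * np.pi * oddball_f * t))

spec = np.abs(np.fft.rfft(eeg)) / t.size    # amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

k = np.argmin(np.abs(freqs - oddball_f))    # bin at the oddball frequency
neighbours = np.r_[spec[k - 12:k - 2], spec[k + 3:k + 13]]  # surrounding noise bins
print(f"oddball SNR: {spec[k] / neighbours.mean():.2f}")
```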
16

Korpal, Paweł, and Aleksandra Jasielska. "Investigating interpreters’ empathy." Target. International Journal of Translation Studies 31, no. 1 (July 31, 2018): 2–24. http://dx.doi.org/10.1075/target.17123.kor.

Abstract:
An experimental study was conducted to examine whether simultaneous interpreters are affected by the speaker’s emotions. To this end, two measures of emotion were used: galvanic skin response (GSR) as a marker of emotional arousal, and SUPIN – the Polish adaptation of PANAS (Positive and Negative Affect Schedule). A group of interpreters with Polish as their A language and English as their B language (N = 20) took part in the experiment. They were asked to simultaneously interpret two speeches (recordings accompanied by video) from Polish into English: a neutral speech and an emotional speech. The results show that the interpreters are indeed affected by the speaker’s emotions, which is reflected in both a greater galvanic skin response and higher SUPIN scores for the emotional speech, when compared to the neutral speech and baseline values. The results may shed new light on the importance of emotion processing in simultaneous interpreting.
17

Zhang, Yuanyuan, Baolin Liu, and Xiaorong Gao. "Investigation of the interaction between emotion and working memory load using spatiotemporal pattern similarity analysis." Journal of Neural Engineering 18, no. 6 (November 15, 2021): 066011. http://dx.doi.org/10.1088/1741-2552/ac3347.

Abstract:
Objective. Accumulating evidence has revealed that emotions can modulate working memory (WM) and that WM load is an important factor in the interaction between emotion and WM. However, it remains controversial whether emotions inhibit or facilitate WM, and the interaction between cognitive task, processing load and emotional processing remains unclear. Approach. In this study, we used a change detection paradigm in which memory items had four different load sizes, with emotion videos inducing three emotions (negative, neutral, and positive). We performed an event-related spectral perturbation (ERSP) analysis and a spatiotemporal pattern similarity (STPS) analysis on the electroencephalography data. Main results. The ERSP results indicated that alpha and beta oscillations can reflect the differences among WM load sizes, and also the differences among emotions under middle-high WM load over the posterior brain region in the maintenance stage. Moreover, the STPS results demonstrated a significant interaction between emotion and WM load size in the posterior region and found significantly higher similarity indexes for the negative emotion than for the neutral emotion under middle-high WM load during WM maintenance. In addition, the STPS results revealed that both positive and negative emotions could interfere with the distinction of load sizes. Significance. The consistency of the behavioral, ERSP and STPS results suggests that when the memory load approaches the limit of WM capacity, negative emotion can facilitate WM through top-down attention modulation promoting storage of the most relevant information during WM maintenance.
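
The abstract does not spell out how the similarity index is computed; a common choice, shown in this illustrative sketch, is a Pearson correlation between vectorized channel-by-time patterns:

```python
import numpy as np

def stps(pattern_a, pattern_b):
    """Similarity of two (n_channels, n_times) EEG patterns as Pearson r."""
    return np.corrcoef(pattern_a.ravel(), pattern_b.ravel())[0, 1]

rng = np.random.default_rng(4)
negative = rng.normal(0, 1, (32, 200))      # stand-in maintenance-period pattern
neutral = 0.8 * negative + rng.normal(0, 0.5, (32, 200))  # correlated stand-in
print(f"similarity index: {stps(negative, neutral):.2f}")
```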
18

Ojha, Amitash, Charles Forceville, and Bipin Indurkhya. "An experimental study on the effect of emotion lines in comics." Semiotica 2021, no. 243 (October 7, 2021): 305–24. http://dx.doi.org/10.1515/sem-2019-0079.

Abstract:
Both mainstream and art comics often use various flourishes surrounding characters’ heads. These so-called “pictorial runes” (also called “emanata”) help convey the emotional states of the characters. In this paper, using (manipulated) panels from Western and Indian comic albums as well as neutral emoticons and basic shapes in different colors, we focus on the following two issues: (a) whether runes increase the awareness in comics readers about the emotional state of the character; and (b) whether a correspondence can be found between the types of runes (twirls, spirals, droplets, and spikes) and specific emotions. Our results show that runes help communicate emotion. Although no one-to-one correspondence was found between the tested runes and specific emotions, it was found that droplets and spikes indicate generic emotions, spirals indicate negative emotions, and twirls indicate confusion and dizziness.
19

Hoertnagl, Christine M., Falko Biedermann, Nursen Yalcin-Siedentopf, Anna-Sophia Welte, Beatrice Frajo-Apor, Eberhard A. Deisenhammer, Armand Hausmann, Georg Kemmler, Moritz Muehlbacher, and Alex Hofer. "Combined Processing of Facial and Vocal Emotion in Remitted Patients With Bipolar I Disorder." Journal of the International Neuropsychological Society 25, no. 3 (February 7, 2019): 275–84. http://dx.doi.org/10.1017/s1355617718001145.

Abstract:
Objectives: Bipolar disorder (BD) is associated with impairments in facial emotion and emotional prosody perception during both mood episodes and periods of remission. To expand on previous research, the current study investigated cross-modal emotion perception, that is, matching of facial emotion and emotional prosody, in remitted BD patients. Methods: Fifty-nine outpatients with BD and 45 healthy volunteers were included in a cross-sectional study. Cross-modal emotion perception was investigated using two subtests of the Comprehensive Affective Testing System (CATS). Results: Compared to control subjects, patients were impaired in matching sad (p < .001) and angry emotional prosody (p = .034) to one of five emotional faces exhibiting the corresponding emotion, and significantly more frequently matched sad emotional prosody to happy faces (p < .001) and angry emotional prosody to neutral faces (p = .017). In addition, patients were impaired in matching neutral emotional faces to the emotional prosody of one of three sentences (p = .006) and significantly more often matched neutral faces to sad emotional prosody (p = .014). Conclusions: These findings demonstrate that, even during periods of symptomatic remission, patients suffering from BD are impaired in matching facial emotion and emotional prosody. As this type of emotion processing is relevant in everyday life, our results point to the need for specific training programs to improve psychosocial outcomes.
20

Martin, Jennifer M., and Jeanette Altarriba. "Effects of Valence on Hemispheric Specialization for Emotion Word Processing." Language and Speech 60, no. 4 (February 1, 2017): 597–613. http://dx.doi.org/10.1177/0023830916686128.

Abstract:
The use of emotion in language is a key element of human interactions and a rich area for cognitive research. The present study examined reactions to words of five types: positive emotion (e.g., happiness), negative emotion (e.g., hatred), positive emotion-laden (e.g., blessing), negative emotion-laden (e.g., prison), and neutral (e.g., chance). Words and nonwords were intermixed in a lexical decision task using hemifield presentation. Results revealed a general left hemisphere advantage. Overall, reaction times for positive words were faster than for negative or neutral words and this effect varied by hemifield of presentation. These results support a valence hypothesis of specialized processing in the left hemisphere of the brain for positive emotions and the right hemisphere for negative emotions.
21

Harrison, A., S. Sullivan, K. Tchanturia, and J. Treasure. "Emotional functioning in eating disorders: attentional bias, emotion recognition and emotion regulation." Psychological Medicine 40, no. 11 (January 27, 2010): 1887–97. http://dx.doi.org/10.1017/s0033291710000036.

Abstract:
Background: Interpersonal processes, anxiety and emotion regulation difficulties form a key part of conceptual models of eating disorders (EDs), such as anorexia nervosa (AN) and bulimia nervosa (BN), but the experimental findings to support this are limited. Method: The Reading the Mind in the Eyes task, the Difficulties in Emotion Regulation Scale (DERS) and a computerized pictorial (angry and neutral faces) Stroop task were administered to 190 women [50 with AN, 50 with BN and 90 healthy controls (HCs)]. Results: Those with an ED showed attentional biases to faces in general (medium effect), but specifically to angry faces over neutral faces (large effect), compared to HCs. The ED group also reported significantly greater emotion regulation difficulties (large effect) than HCs. There was a small difference between the ED and HC groups on the emotion recognition task (small-medium effect), particularly in the restricting AN (RAN) group. Depression and attentional bias to faces significantly predicted emotion regulation difficulties in a regression model. Conclusions: The data provide support for conceptualizations of EDs that emphasize the role of emotional functioning in the development and maintenance of EDs. Further research will concentrate on exploring whether these findings are state or trait features of EDs.
22

Sato, Wataru, and Sakiko Yoshikawa. "Anti-expressions: Artificial control stimuli for the visual properties of emotional facial expressions." Social Behavior and Personality: an international journal 37, no. 4 (May 1, 2009): 491–501. http://dx.doi.org/10.2224/sbp.2009.37.4.491.

Abstract:
Perceptual/cognitive processing of emotional facial expressions is more efficient than that of neutral facial expressions. To investigate whether this efficiency can be attributed to the expression of emotion or to the visual properties of the facial expressions, we used computer morphing to develop a form of control stimuli. These "anti-expressions" changed the features of emotional facial expressions in the opposite direction from neutral expressions, by amounts equivalent to the differences between emotional and neutral expressions. To examine whether anti-expressions are usable as emotionally neutral faces, 35 participants were asked to categorize and rate the valence and arousal of normal and anti-expressions of six basic emotions. The results indicate that anti-expressions were assessed as neutral for anger, disgust, fear, and happiness, and that these can be used as control stimuli matching the visual properties of emotional facial expressions.
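
The construction of an anti-expression reduces to simple vector arithmetic on facial features: displace each feature from neutral by the emotional displacement, but in the opposite direction. A toy sketch on landmark coordinates:

```python
import numpy as np

neutral = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, -0.8]])    # toy landmark set
emotional = np.array([[0.0, 0.1], [1.0, 0.1], [0.5, -0.6]])  # e.g., happiness

delta = emotional - neutral   # displacement caused by the emotional expression
anti = neutral - delta        # equal displacement in the opposite direction

print(anti)                   # visually as different from neutral as the emotion,
                              # but without its emotional content
```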
23

Yan, Fei, Abdullah M. Iliyasu, Zhen-Tao Liu, Ahmed S. Salama, Fangyan Dong, and Kaoru Hirota. "Bloch Sphere-Based Representation for Quantum Emotion Space." Journal of Advanced Computational Intelligence and Intelligent Informatics 19, no. 1 (January 20, 2015): 134–42. http://dx.doi.org/10.20965/jaciii.2015.p0134.

Abstract:
A Bloch Sphere-based Emotion Space (BSES), where two angles φ and θ in the Bloch sphere represent the emotion (such as happiness, surprise, anger, sadness, expectation, or relaxation in [0, 2π)) and its intensity (from neutral to maximum in [0, π]), respectively, is proposed. It exploits the psychological interpretation of color to assign a basic color to each emotion subspace such that the BSES can be visualized, and by using quantum gates, changes in emotions can be tracked and recovered. In an experimental validation, two typical human emotions, happiness and sadness, are analyzed and visualized using the BSES according to a preset emotional transmission model. A transition matrix that tracks emotional change can be used to control robots allowing them to adapt and respond to human emotions.
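
The mapping the abstract describes can be sketched directly: one angle indexes the emotion, the other its intensity. The emotion ordering and the uniform angular spacing below are assumptions:

```python
import numpy as np

# Emotion labels from the abstract; their order around the sphere is assumed.
EMOTIONS = ["happiness", "surprise", "anger", "sadness", "expectation", "relaxation"]

def to_bloch(emotion, intensity):
    """Map an emotion and an intensity in [0, 1] to (phi, theta) on the sphere."""
    phi = 2 * np.pi * EMOTIONS.index(emotion) / len(EMOTIONS)  # phi in [0, 2*pi)
    theta = np.pi * intensity                                  # 0 = neutral, pi = maximum
    return phi, theta

print(to_bloch("sadness", 0.5))   # e.g., a moderately sad state
```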
24

Levy, Rachel S., and Spencer D. Kelly. "Emotion matters." Gesture 19, no. 1 (December 31, 2020): 41–71. http://dx.doi.org/10.1075/gest.19029.lev.

Abstract:
Recent theories and neural models of co-speech gesture have extensively considered its cognitive role in language comprehension but have ignored the emotional function. We investigated the integration of speech and co-speech gestures in memory for verbal information with different emotional connotations (either positive, negative, or neutral). In a surprise cued-recall task, gesture boosted memory for speech with all three emotional valences. Interestingly, gesture was more likely to become integrated into memory of neutrally and positively valenced speech than negatively valenced speech. The results suggest that gesture-speech integration is modulated by emotional valence of speech, which has implications for the emotional function of gesture in language comprehension.
25

Martinez, A., and D. Neth. "Emotion perception in neutral expressions." Journal of Vision 8, no. 6 (March 29, 2010): 713. http://dx.doi.org/10.1167/8.6.713.

26

Dores, Artemisa R., Fernando Barbosa, Cristina Queirós, Irene P. Carvalho, and Mark D. Griffiths. "Recognizing Emotions through Facial Expressions: A Largescale Experimental Study." International Journal of Environmental Research and Public Health 17, no. 20 (October 12, 2020): 7420. http://dx.doi.org/10.3390/ijerph17207420.

Abstract:
Experimental research examining emotional processes is typically based on the observation of images with affective content, including facial expressions. Future studies will benefit from databases of emotion-inducing stimuli in which characteristics of the stimuli that potentially influence results can be controlled. This study presents Portuguese normative data for the identification of seven facial expressions of emotion (plus a neutral face) from the Radboud Faces Database (RaFD). The effects of participants' gender and models' sex on emotion recognition were also examined. Participants (N = 1249) were exposed to 312 pictures of white adults displaying emotional and neutral faces with a frontal gaze. Recognition agreement between the displayed and chosen expressions ranged from 69% (for anger) to 97% (for happiness). Recognition levels were significantly higher among women than among men only for anger and contempt. Whether recognition was higher for female or for male models depended on the emotion. Overall, the results show high recognition levels for the facial expressions presented, indicating that the RaFD provides adequate stimuli for studies examining the recognition of facial expressions of emotion among college students. Participants' gender had a limited influence on emotion recognition, but the sex of the model requires additional consideration.
27

Tian, Wenqiang. "Personalized Emotion Recognition and Emotion Prediction System Based on Cloud Computing." Mathematical Problems in Engineering 2021 (May 26, 2021): 1–10. http://dx.doi.org/10.1155/2021/9948733.

Abstract:
Promoting economic development and improving people's quality of life are closely tied to the continuous improvement of cloud computing technology and the rapid expansion of its applications. Emotions play an important role in all aspects of human life, and it is difficult to avoid their influence on people's behavior and reasoning. This article studies a personalized emotion recognition and emotion prediction system based on cloud computing, and proposes a method for intelligently identifying users' emotional states through the use of cloud computing. First, an emotion induction experiment is designed to induce three basic emotional states (positive, neutral, and negative) in the participants and to collect cloud data and EEG recordings under each state. Then, the cloud data is processed and analyzed to extract emotional features. After that, the paper constructs a facial emotion prediction system based on a cloud computing data model, which consists of face detection and facial emotion recognition. The system uses an SVM algorithm for face detection, a temporal-feature algorithm for facial emotion analysis, and finally a machine learning classification method for emotions, so as to identify the user's emotional state through cloud computing technology. Experimental data shows that the EEG emotion recognition method based on time-domain features performs best, has better generalization ability, and improves on traditional methods by 6.3%. The experimental results show that the personalized emotion recognition method based on cloud computing is more effective than traditional methods.
28

Jaratrotkamjorn, Apichart. "Bimodal Emotion Recognition Using Deep Belief Network." ECTI Transactions on Computer and Information Technology (ECTI-CIT) 15, no. 1 (January 14, 2021): 73–81. http://dx.doi.org/10.37936/ecti-cit.2021151.226446.

Abstract:
Emotions are very important in human daily life. Enabling machines to recognize human emotional states and respond to them intelligently is essential for human-computer interaction. The majority of existing work concentrates on the classification of six basic emotions only. This research work proposes an emotion recognition system based on a multimodal approach, which integrates information from both facial and speech expressions. The database contains eight basic emotions (neutral, calm, happy, sad, angry, fearful, disgust, and surprised). Emotions are classified using a deep belief network. The experimental results show that the bimodal emotion recognition system achieves a clear improvement in performance, with an overall accuracy rate of 97.92%.
29

Dupuis, Kate, and M. Kathleen Pichora-Fuller. "Aging Affects Identification of Vocal Emotions in Semantically Neutral Sentences." Journal of Speech, Language, and Hearing Research 58, no. 3 (June 2015): 1061–76. http://dx.doi.org/10.1044/2015_jslhr-h-14-0256.

Abstract:
Purpose The authors determined the accuracy of younger and older adults in identifying vocal emotions using the Toronto Emotional Speech Set (TESS; Dupuis & Pichora-Fuller, 2010a) and investigated the possible contributions of auditory acuity and suprathreshold processing to emotion identification accuracy. Method In 2 experiments, younger and older adults with normal hearing listened to and identified vocal emotions in the TESS stimuli. The TESS consists of phrases with controlled syntactic, lexical, and phonological properties spoken by an older female talker and a younger female talker to convey 7 emotion conditions (anger, disgust, fear, sadness, neutral, happiness, and pleasant surprise). Participants in both experiments completed audiometric testing; participants in Experiment 2 also completed 3 tests of suprathreshold auditory processing. Results Identification by both age groups was above chance for all emotions. Accuracy was lower for older adults in both experiments. The pattern of results was similar across age groups and experiments. Auditory acuity did not predict identification accuracy for either age group in either experiment, nor did performance on tests of auditory processing in Experiment 2. Conclusions These results replicate and extend previous findings concerning age-related differences in ability to identify vocal emotions and suggest that older adults' auditory abilities do not explain their difficulties in identifying vocal emotions.
30

Kim, Tae-Yeun, Hoon Ko, Sung-Hwan Kim, and Ho-Da Kim. "Modeling of Recommendation System Based on Emotional Information and Collaborative Filtering." Sensors 21, no. 6 (March 12, 2021): 1997. http://dx.doi.org/10.3390/s21061997.

Abstract:
Emotion information represents a user’s current emotional state and can be used in a variety of applications, such as cultural content services that recommend music according to user emotional states and user emotion monitoring. To increase user satisfaction, recommendation methods must understand and reflect user characteristics and circumstances, such as individual preferences and emotions. However, most recommendation methods do not reflect such characteristics accurately and are unable to increase user satisfaction. In this paper, six human emotions (neutral, happy, sad, angry, surprised, and bored) are broadly defined to consider user speech emotion information and recommend matching content. The “genetic algorithms as a feature selection method” (GAFS) algorithm was used to classify normalized speech according to speech emotion information. We used a support vector machine (SVM) algorithm and selected an optimal kernel function for recognizing the six target emotions. Performance evaluation results for each kernel function revealed that the radial basis function (RBF) kernel function yielded the highest emotion recognition accuracy of 86.98%. Additionally, content data (images and music) were classified based on emotion information using factor analysis, correspondence analysis, and Euclidean distance. Finally, speech information that was classified based on emotions and emotion information that was recognized through a collaborative filtering technique were used to predict user emotional preferences and recommend content that matched user emotions in a mobile application.
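
The kernel selection step can be sketched with scikit-learn; the features here are random stand-ins, so all kernels sit near chance, but the comparison structure matches what the abstract describes (the paper reports RBF as the best performer at 86.98%):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(0, 1, (300, 20))     # stand-in speech feature vectors
y = rng.integers(0, 6, 300)         # six emotion classes, as in the paper

# With random stand-in data every kernel hovers near chance (~0.17);
# real acoustic features are what separate the kernels in practice.
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    acc = cross_val_score(SVC(kernel=kernel), X, y, cv=5).mean()
    print(f"{kernel}: {acc:.2f}")
```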
31

Agrima, Abdellah, Ilham Mounir, Abdelmajid Farchi, Laila Elmazouzi, and Badia Mounir. "Emotion recognition based on the energy distribution of plosive syllables." International Journal of Electrical and Computer Engineering (IJECE) 12, no. 6 (December 1, 2022): 6159. http://dx.doi.org/10.11591/ijece.v12i6.pp6159-6171.

Abstract:
We usually encounter two problems during speech emotion recognition (SER): expression and perception problems, which vary considerably between speakers, languages, and sentence pronunciation. In fact, finding an optimal system that characterizes the emotions while overcoming all these differences is a promising prospect. In this perspective, we considered two emotional databases: the Moroccan Arabic dialect emotional database (MADED) and the Ryerson audio-visual database of emotional speech and song (RAVDESS), which present notable differences in terms of type (natural/acted) and language (Arabic/English). We proposed a detection process based on 27 acoustic features extracted from consonant-vowel (CV) syllabic units (\ba, \du, \ki, \ta) common to both databases. We tested two classification strategies: multiclass (all emotions combined: joy, sadness, neutral, anger) and binary (neutral vs. others; positive emotions (joy) vs. negative emotions (sadness, anger); sadness vs. anger). These strategies were tested three times: i) on MADED, ii) on RAVDESS, iii) on MADED and RAVDESS combined. The proposed method gave better recognition accuracy in the case of binary classification. The rates reach an average of 78% for the multiclass classification, 100% for neutral vs. others, 100% for the negative emotions (i.e., anger vs. sadness), and 96% for positive vs. negative emotions.
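
The two classification strategies compared in the abstract differ only in how labels are grouped; a minimal sketch with stand-in features (27 per CV syllable, as in the paper):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
X = rng.normal(0, 1, (200, 27))     # 27 acoustic features per CV syllable
y = rng.integers(0, 4, 200)         # 0=neutral, 1=joy, 2=sadness, 3=anger

# Multiclass: all four labels; binary: collapse labels to neutral vs. others.
multi = cross_val_score(SVC(), X, y, cv=5).mean()
binary = cross_val_score(SVC(), X, (y != 0).astype(int), cv=5).mean()
print(f"multiclass: {multi:.2f}, neutral vs. others: {binary:.2f}")
```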
32

Mohd Anuardi, Muhammad Nur Adilin, and Atsuko K. Yamazaki. "Effect of emotionally toned Malay language sounds on the brain: a NIRS analysis." International Journal on Perceptive and Cognitive Computing 5, no. 1 (April 11, 2019): 1–7. http://dx.doi.org/10.31436/ijpcc.v5i1.72.

Abstract:
Features of speech such as emotion have always been involved in human communication. With recent developments in communication methods, researchers have investigated artificial and emotional intelligence to improve communication. This has led to the emergence of affective computing, which deals with processing information pertaining to human emotions. This study aims to determine the positive influence of emotionally toned language sounds on brain function for improved communication. Twenty-seven college-age Japanese subjects with no prior exposure to the Malay language listened to emotionally toned and emotionally neutral sounds in the Malay language. Their brain activity was measured using near-infrared spectroscopy (NIRS) as they listened to the sounds. A comparison between the NIRS signals revealed that emotionally toned language sounds had a greater impact on brain areas associated with attention and emotion. By contrast, emotionally neutral Malay sounds affected brain areas involved in working memory and language processing. These results suggest that emotionally charged sounds engage listeners' attention and emotion recognition even when the listeners do not understand the language. The ability to interpret emotions presents challenges for computer systems and robotics; we therefore hope that our results can be used for the development of computational models of emotion for autonomous robot research in the field of communication.
33

Sato, Wataru, Motoko Noguchi, and Sakiko Yoshikawa. "EMOTION ELICITATION EFFECT OF FILMS IN A JAPANESE SAMPLE." Social Behavior and Personality: an international journal 35, no. 7 (January 1, 2007): 863–74. http://dx.doi.org/10.2224/sbp.2007.35.7.863.

Abstract:
Films are effective stimuli to elicit emotions. A set of films to elicit specific emotions was developed in a previous study (Gross & Levenson, 1995), and emotional reactions to them were investigated in Western participants. In this study, we investigated whether these films have a similar capacity to elicit emotions in a Japanese sample. Thirty-one Japanese participants viewed 8 films selected to elicit specific emotions (amusement, anger, contentment, disgust, fear, neutral, sadness, and surprise) and evaluated their emotional experiences using 16 discrete and 2 dimensional emotion scales. The discrete scales indicated that all of the films evidently elicited the target emotions, as well as some nontarget emotions. The dimensional emotion scales showed that almost all of the films elicited theoretically reasonable emotions in terms of valence. These results suggest that the films may have a universal capacity to elicit emotions.
34

Walla, Peter, and Aimee Mavratzakis. "Associations between Cognitive Concepts of Self and Emotional Facial Expressions with an Emphasis on Emotion Awareness." Psych 3, no. 2 (April 27, 2021): 48–60. http://dx.doi.org/10.3390/psych3020006.

Abstract:
Recognising our own and others’ emotions is vital for healthy social development. The aim of the current study was to determine how emotions related to the self or to another influence behavioural expressions of emotion. Facial electromyography (EMG) was used to record spontaneous facial muscle activity in nineteen participants while they passively viewed negative, positive and neutral emotional pictures during three blocks of referential instructions. Each participant imagined themself, another person or no one experiencing the emotional scenario, with the priming words “You”, “Him” or “None” presented before each picture for the respective block of instructions. Emotion awareness (EA) was also recorded using the TAS-20 alexithymia questionnaire. Corrugator supercilii (cs) muscle activity increased significantly between 500 and 1000 ms post stimulus onset during negative and neutral picture presentations, regardless of ownership. Independent of emotion, cs activity was greatest during the “no one” task and lowest during the “self” task from less than 250 to 1000 ms. Interestingly, the degree of cs activation during referential tasks was further modulated by EA. Low EA corresponded to significantly stronger cs activity overall compared with high EA, and this effect was even more pronounced during the “no one” task. The findings suggest that cognitive processes related to the perception of emotion ownership can influence spontaneous facial muscle activity, but that a greater degree of integration between higher cognitive and lower affective levels of information may interrupt or suppress these behavioural expressions of emotion.
35

Daros, A. R., K. K. Zakzanis, and A. C. Ruocco. "Facial emotion recognition in borderline personality disorder." Psychological Medicine 43, no. 9 (November 13, 2012): 1953–63. http://dx.doi.org/10.1017/s0033291712002607.

Abstract:
Background: Emotion dysregulation represents a core symptom of borderline personality disorder (BPD). Deficits in emotion perception are thought to underlie this clinical feature, although studies examining emotion recognition abilities in BPD have yielded inconsistent findings. Method: The results of 10 studies contrasting facial emotion recognition in patients with BPD (n = 266) and non-psychiatric controls (n = 255) were quantitatively synthesized using meta-analytic techniques. Results: Patients with BPD were less accurate than controls in recognizing facial displays of anger and disgust, although their most pronounced deficit was in correctly identifying neutral (no emotion) facial expressions. These results could not be accounted for by speed/accuracy in the test-taking approach of BPD patients. Conclusions: Patients with BPD have difficulties recognizing specific negative emotions in faces and may misattribute emotions to faces depicting neutral expressions. The contribution of state-related emotion perception biases to these findings requires further clarification.
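The abstract does not give the studies' effect sizes, but the kind of quantitative synthesis it describes can be illustrated with a minimal inverse-variance pooling of per-study Cohen's d values; the sketch below is generic, not the authors' analysis, and all inputs are hypothetical.

```python
# Minimal fixed-effect meta-analysis sketch: pool per-study Cohen's d
# values by inverse-variance weighting (study inputs are hypothetical).
import numpy as np

def pooled_d(d, n1, n2):
    """d: per-study Cohen's d; n1, n2: per-study group sizes."""
    d, n1, n2 = map(np.asarray, (d, n1, n2))
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))  # variance of d
    w = 1 / var                                            # inverse-variance weights
    return float((w * d).sum() / w.sum())

# Hypothetical usage: three studies with made-up effects and sample sizes.
# pooled_d([0.4, 0.6, 0.5], n1=[30, 25, 40], n2=[28, 27, 35])
```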
36

Levitan, Carmel A., Isabelle Rusk, Danielle Jonas-Delson, Hanyun Lou, Lennon Kuzniar, Gray Davidson, and Aleksandra Sherman. "Mask wearing affects emotion perception." i-Perception 13, no. 3 (May 2022): 204166952211073. http://dx.doi.org/10.1177/20416695221107391.

Abstract:
To reduce the spread of COVID-19, mask wearing has become ubiquitous in much of the world. We studied the extent to which masks impair emotion recognition and dampen the perceived intensity of facial expressions by naturalistically inducing positive, neutral, and negative emotions in individuals while they were masked and unmasked. Two groups of online participants rated the emotional intensity of each presented image. One group rated full faces (N = 104); the other (N = 102) rated cropped images in which only the upper face was visible. We found that masks impaired both the recognition and the rated intensity of positive emotions. This happened even when the faces were cropped and the lower part of the face was not visible. Masks may thus reduce positive emotion and/or the expressivity of positive emotion. However, perception of negativity was unaffected by masking, perhaps because, unlike positive emotions such as happiness, which are signaled more in the mouth, negative emotions such as anger rely more on the upper face.
37

Padun, M. A., E. A. Sorokko, E. A. Suchkova, and D. V. Lyusin. "Emotion Sensitivity in Individuals with Various Degrees of Expressive Suppression: The Case of Policemen." Psychology and Law 11, no. 2 (2021): 26–39. http://dx.doi.org/10.17759/psylaw.2021110203.

Abstract:
The article examines emotion sensitivity in policemen and its relationship with emotion suppression. It was hypothesized that individuals with high emotion suppression are less efficient at recognizing others' negative emotions. Forty-nine policemen from the Arkhangelsk region of Russia, aged 22 to 50, took part in the study. Emotion sensitivity was measured by presenting faces with dynamic changes in emotional expression from neutral to one of four emotion categories: happiness, sadness, anger, and fear. Emotion suppression was measured with Gross's Emotion Regulation Questionnaire (ERQ). Happiness was recognized faster and more accurately than the negative emotions. Among the negative emotions, the least intensity was needed for the recognition of fear, more for sadness, and even more for anger. Fear was recognized more accurately than anger; there was no difference in the accuracy of recognition of fear and sadness. Individuals high in expressive suppression recognized happiness faster and mistook sadness for anger more often. The results are discussed in the context of the specific features of policemen's professional activity.
38

Sukhanova, Irina. "Emotional Potential of Non-Emotive Vocabulary in the Modern English Literary Text." Izvestia of Smolensk State University, no. 2(58) (July 3, 2022): 118–29. http://dx.doi.org/10.35785/2072-9464-2022-58-2-118-129.

Abstract:
The study addresses the lexicalization of the basic emotion «fear» in English literary texts in order to determine the emotive potential of the non-emotive vocabulary representing this emotion. The relevance of the study lies in its use of cognitive approaches to language, to the internal structure of emotion naming, and to the ways emotions are objectified and lexicalized. An analysis of the semantic structure of the emotion «fear» in Russian and English, together with the linguistic representation of this basic emotion in the texts of modern British authors, made it possible to determine how «fear» is lexicalized in modern literary English texts and to assess the degree of emotivity of the non-emotive lexical means used to lexicalize it. The results are as follows: the emotion «fear» is lexicalized with the help of emotive vocabulary containing the semes «fear», «escape», «retreat», and «cry». Non-emotive vocabulary falls into thematic groups: achromatic colours and human physiological reactions. Non-emotive, neutral vocabulary used to describe the non-verbal manifestations of the emotion «fear» creates the emotional colouring of the description. The emotional potential of the corresponding lexemes varies under the influence of particular situations, so that neutral lexical units begin to express emotivity. Authors use non-emotive vocabulary to reflect the emotional experience of the speaker, to describe the non-verbal manifestations of emotion, and to create tension and emotional intensity in the narrative. The results of the study can be used in research on the vocabulary of modern English.
39

Li, Xiawen, Guanghui Zhang, Chenglin Zhou, and Xiaochun Wang. "Negative emotional state slows down movement speed: behavioral and neural evidence." PeerJ 7 (September 19, 2019): e7591. http://dx.doi.org/10.7717/peerj.7591.

Abstract:
Background: Athletic performance is affected by emotional state, and athletes may underperform in competition due to poor emotion regulation. Movement speed plays an important role in many competitive events, and flexible control of movement speed is critical for effective athletic performance. Although behavioral evidence has shown that negative emotion can influence movement speed, the nature of the relationship remains controversial. The present study therefore investigated how negative emotion affects movement speed and the neural mechanism underlying the interaction between emotion processing and movement control. Methods: The study combined electroencephalography (EEG) with a cued-action task. In total, 21 undergraduate students were recruited. Participants performed six consecutive action tasks after viewing an emotional picture. Pictures were presented in two blocks (one negative and one neutral). After completing each block of tasks (neutral or negative), participants completed a 9-point self-assessment manikin scale. EEG was recorded while participants performed the tasks. Results: At the behavioral level, there was a significant main effect of emotional valence on movement speed, with participants moving significantly more slowly in the negative emotional condition than in the neutral condition. The EEG data showed increased theta oscillation and larger P1 amplitude in response to negative than to neutral images, suggesting that more cognitive resources were required to process negative than neutral images. The EEG data also showed a larger late CNV area in the neutral condition than in the negative condition, suggesting a significant decrease in brain activation during action tasks in the negative emotional condition; the early CNV showed no significant main effect of emotional valence. Conclusion: The present results indicate that negative emotion can slow movement, largely because negative emotional processing consumes more resources than non-emotional processing, and this interference effect occurs mainly in the late movement-preparation phase.
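As a rough illustration of the two ERP measures named above (P1 amplitude and late CNV area), the following sketch computes window statistics on an averaged waveform; the sampling rate and window bounds are assumptions, not the authors' parameters.

```python
# Rough sketch (not the authors' pipeline): P1 mean amplitude and late-CNV
# area as window statistics on a baseline-corrected averaged epoch.
import numpy as np

FS = 500  # assumed sampling rate (Hz); index 0 = stimulus onset

def window_stat(erp, t0, t1, stat=np.mean):
    """erp: 1-D averaged waveform; t0, t1: window bounds in seconds."""
    return stat(erp[int(t0 * FS):int(t1 * FS)])

def erp_measures(erp):
    return {
        "p1_amplitude": window_stat(erp, 0.08, 0.13),                # assumed P1 window
        "late_cnv_area": window_stat(erp, 0.50, 1.00, np.sum) / FS,  # area = integral over window
    }
```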
40

Son, Guiyoung, and Yaeri Kim. "EEG-Based Emotion Classification for Verifying the Korean Emotional Movie Clips with Support Vector Machine (SVM)." Complexity 2021 (September 8, 2021): 1–14. http://dx.doi.org/10.1155/2021/5497081.

Abstract:
Emotion plays a crucial role in understanding each other in natural communication in daily life. EEG-based emotion classification has been widely utilized in interdisciplinary studies because of the objectiveness of EEG as a representation of emotion. This paper introduces a Korean continuous emotional database and investigates brain activity during emotional processing. Moreover, we selected emotion-related channels for verifying the generated database using a Support Vector Machine (SVM). First, we recorded EEG signals from 28 subjects to investigate brain activity across brain areas while they watched movie clips targeting five emotions (anger, excitement, fear, sadness, and happiness) and a neutral state. We analyzed the raw EEG signals to identify emotion-related brain areas and to select suitable emotion-related channels using spectral power in the alpha and beta frequency bands. As a result, we selected the eight-channel set AF3-AF4, F3-F4, F7-F8, and P7-P8 from statistical and brain-topography analysis. We performed classification using the SVM and achieved a best accuracy of 94.27% when utilizing the selected channel set with five emotions. In conclusion, we provide a fundamental emotional database reflecting Korean feelings, together with evidence of differences between emotions, for application to a broader area.
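A minimal sketch of the kind of pipeline the abstract describes (alpha/beta band power on the selected channels, classified with an SVM) might look as follows; the channel names follow the abstract, while the sampling rate, windowing, and hyperparameters are assumptions.

```python
# Hedged sketch: alpha/beta band-power features per channel, fed to an SVM.
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128  # assumed sampling rate (Hz)
CHANNELS = ["AF3", "AF4", "F3", "F4", "F7", "F8", "P7", "P8"]  # from the abstract
BANDS = {"alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch):
    """epoch: array (n_channels, n_samples) -> mean PSD per channel and band."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)  # 8 channels x 2 bands = 16 features

def train_classifier(epochs, labels):
    X = np.stack([band_power_features(e) for e in epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return clf.fit(X, labels)
```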
41

Blanchette, Isabelle, and Joanna Leese. "The Effect of Negative Emotion on Deductive Reasoning." Experimental Psychology 58, no. 3 (November 1, 2011): 235–46. http://dx.doi.org/10.1027/1618-3169/a000090.

Abstract:
In three experiments, we explore the link between peripheral physiological arousal and logicality in a deductive reasoning task. Previous research has shown that participants are less likely to provide normatively correct responses when reasoning about emotional compared to neutral contents. Which component of emotion is primarily involved in this effect has not yet been explored. We manipulated the emotional value of the reasoning stimuli through classical conditioning (Experiment 1), with simultaneous presentation of negative/neutral pictures (Experiment 2), or by using intrinsically negative/neutral words (Experiment 3). We measured skin conductance (SC) and subjective affective ratings of the stimuli. In all experiments, we observed a negative relationship between SC and logicality. Participants who showed greater SC reactivity to negative stimuli compared to neutral stimuli were more likely to make logical errors on negative, compared to neutral, reasoning contents. There was no such link between affective ratings of the stimuli and the effect of emotion on reasoning.
42

Caballero-Morales, Santiago-Omar. "Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels." Scientific World Journal 2013 (2013): 1–13. http://dx.doi.org/10.1155/2013/162093.

Abstract:
An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of acoustic phoneme modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and for the emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). The emotional state of a spoken sentence is then estimated by counting the number of emotion-specific vowels found in the ASR's output for that sentence. With this approach, accuracy of 87–100% was achieved for the recognition of the emotional state of Mexican Spanish speech.
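The decision rule described here (counting emotion-specific vowels in the ASR output) is simple enough to sketch directly; the token format with an emotion suffix is an assumption about how the recognizer's output might be encoded.

```python
# Sketch of the vowel-counting decision rule: each vowel in the ASR output
# is assumed to carry a suffix naming the emotion-specific HMM that
# recognized it (the "_emotion" token format is an assumption).
from collections import Counter

EMOTIONS = ("anger", "happiness", "neutral", "sadness")

def estimate_emotion(asr_tokens):
    """asr_tokens: e.g. ['k', 'a_anger', 's', 'a_anger', 'o_neutral']."""
    votes = Counter(
        tok.rsplit("_", 1)[1]
        for tok in asr_tokens
        if "_" in tok and tok.rsplit("_", 1)[1] in EMOTIONS
    )
    # Majority vote over emotion-specific vowels; fall back to neutral.
    return votes.most_common(1)[0][0] if votes else "neutral"
```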
43

Trémolière, Bastien, Marie-Ève Gagnon, and Isabelle Blanchette. "Cognitive Load Mediates the Effect of Emotion on Analytical Thinking." Experimental Psychology 63, no. 6 (November 2016): 343–50. http://dx.doi.org/10.1027/1618-3169/a000333.

Abstract:
Although the detrimental effect of emotion on reasoning has been evidenced many times, the cognitive mechanism underlying this effect remains unclear. In the present paper, we explore the cognitive load hypothesis as a potential explanation. In an experiment, participants solved syllogistic reasoning problems with either neutral or emotional contents. Participants were also presented with a secondary task, whose difficult version requires the mobilization of cognitive resources to be solved correctly. Participants performed worse overall and took longer on emotional problems than on neutral problems. Performance on the difficult version of the secondary task was poorer when participants were reasoning about emotional, compared to neutral, contents, consistent with the idea that processing emotion requires more cognitive resources. Taken together, the findings afford evidence that the deleterious effect of emotion on reasoning is mediated by cognitive load.
44

Siriwardena, Yashish M., Nadee Seneviratne, and Carol Espy-Wilson. "Emotion recognition with speech articulatory coordination features." Journal of the Acoustical Society of America 150, no. 4 (October 2021): A358. http://dx.doi.org/10.1121/10.0008586.

Abstract:
Mental health illnesses such as Major Depressive Disorder and Schizophrenia affect the coordination between articulatory gestures in speech production. Coordination features derived from vocal tract variables (TVs) predicted by a speech inversion system can quantify these changes in articulatory gestures and have proven effective in the classification of mental health disorders. In this study we use data from the IEMOCAP (acted emotions) and MSP Podcast (natural emotions) datasets to understand, for the first time, how coordination features extracted from TVs can capture changes between different emotions. We compared the eigenspectra extracted from channel-delay correlation matrices for the "Angry," "Sad," and "Happy" emotions with respect to the "Neutral" emotion. Across both datasets, the "Sad" emotion followed a pattern suggesting simpler articulatory coordination, while the "Angry" emotion followed the opposite pattern, showing signs of complex articulatory coordination. For the majority of subjects, the "Happy" emotion followed a complex articulatory coordination pattern but was often confused with the "Neutral" emotion. We trained a Convolutional Neural Network with the coordination features as inputs to perform emotion classification. A detailed interpretation of the differences in eigenspectra and the results of the classification experiments will be discussed.
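A hedged sketch of the coordination features described above: stack delayed copies of each vocal-tract-variable channel, form the correlation matrix, and take its eigenspectrum. The delay set and normalization are assumptions, not the authors' exact configuration.

```python
# Sketch of channel-delay correlation eigenspectra for articulatory
# coordination; delay spacing and normalization are assumptions.
import numpy as np

def channel_delay_eigenspectrum(tvs, delays=range(0, 50, 5)):
    """tvs: array (n_channels, n_samples) of vocal tract variables."""
    max_d = max(delays)
    rows = []
    for ch in tvs:
        for d in delays:
            rows.append(ch[d:ch.size - max_d + d])  # delayed copy, equal length
    stacked = np.vstack(rows)
    corr = np.corrcoef(stacked)                 # channel-delay correlation matrix
    eigvals = np.linalg.eigvalsh(corr)[::-1]    # eigenspectrum, descending
    # A flatter normalized spectrum suggests more complex coordination.
    return eigvals / eigvals.sum()
```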
45

Asghar, Awais, Sarmad Sohaib, Saman Iftikhar, Muhammad Shafi, and Kiran Fatima. "An Urdu speech corpus for emotion recognition." PeerJ Computer Science 8 (May 9, 2022): e954. http://dx.doi.org/10.7717/peerj-cs.954.

Abstract:
Emotion recognition from acoustic signals plays a vital role in audio and speech processing. Speech interfaces offer humans an informal and comfortable means to communicate with machines. Emotion recognition from speech signals has a variety of applications in human computer interaction (HCI) and human behavior analysis. In this work, we develop the first emotional speech database of the Urdu language. We also develop a system to classify five different emotions (sadness, happiness, neutral, disgust, and anger) using different machine learning algorithms. Mel Frequency Cepstral Coefficients (MFCC), Linear Prediction Coefficients (LPC), energy, spectral flux, spectral centroid, spectral roll-off, and zero-crossing rate were used as speech descriptors. The classification tests were performed on the emotional speech corpus collected from 20 different subjects. To evaluate the quality of the speech emotions, subjective listening tests were conducted. The rate of correctly classified emotions on the complete Urdu emotional speech corpus was 66.5% with K-nearest neighbors. The disgust emotion was found to have a lower recognition rate than the other emotions; removing it significantly improves the performance of the classifier, to 76.5%.
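For illustration, part of the descriptor set named in the abstract (MFCC plus a few spectral descriptors, omitting LPC, energy, and spectral flux for brevity) could be extracted and classified roughly as follows; the librosa calls are real, but the sampling rate, file paths, and k value are assumptions.

```python
# Illustrative sketch only: MFCC and spectral descriptors fed to a
# K-nearest-neighbours classifier; hyperparameters are assumptions.
import numpy as np
import librosa
from sklearn.neighbors import KNeighborsClassifier

def describe(path):
    """Extract a fixed-length descriptor vector from one utterance."""
    y, sr = librosa.load(path, sr=16000)  # assumed sampling rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    rolloff = librosa.feature.spectral_rolloff(y=y, sr=sr).mean()
    zcr = librosa.feature.zero_crossing_rate(y).mean()
    return np.hstack([mfcc, centroid, rolloff, zcr])

def train(paths, labels, k=5):
    X = np.stack([describe(p) for p in paths])
    return KNeighborsClassifier(n_neighbors=k).fit(X, labels)
```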
46

Weiss, Elisabeth M., Christian G. Kohler, Colleen M. Brensinger, Warren B. Bilker, James Loughead, Margarete Delazer, and Karen A. Nolan. "Gender differences in facial emotion recognition in persons with chronic schizophrenia." European Psychiatry 22, no. 2 (March 2007): 116–22. http://dx.doi.org/10.1016/j.eurpsy.2006.05.003.

Abstract:
Background: The aim of the present study was to investigate possible sex differences in the recognition of facial expressions of emotion and to investigate the pattern of classification errors in schizophrenic males and females. Such an approach provides an opportunity to inspect the degree to which males and females differ in perceiving and interpreting the different emotions displayed to them and to analyze which emotions are most susceptible to recognition errors. Methods: Fifty-six chronically hospitalized schizophrenic patients (38 men and 18 women) completed the Penn Emotion Recognition Test (ER40), a computerized emotion discrimination test presenting 40 color photographs of evoked happy, sad, angry, and fearful expressions as well as neutral expressions, balanced for poser gender and ethnicity. Results: We found a significant sex difference in the patterns of error rates on the Penn Emotion Recognition Test. Neutral faces were more commonly mistaken as angry by schizophrenic men, whereas schizophrenic women misinterpreted neutral faces more frequently as sad. Moreover, female faces were better recognized overall, but fear was better recognized in same-gender photographs, whereas anger was better recognized in different-gender photographs. Conclusions: The findings of the present study lend support to the notion that sex differences in aggressive behavior could be related to a cognitive style characterized by hostile attributions to neutral faces in schizophrenic men.
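The error-pattern analysis reported here (e.g., neutral faces mistaken as angry versus sad, by group) amounts to per-group confusion matrices; a generic sketch with an assumed data layout follows.

```python
# Generic sketch: per-group confusion matrices for an ER40-style task
# (the long-format column names are assumptions, not the study's layout).
import pandas as pd

def confusion_by_group(df: pd.DataFrame):
    """df columns assumed: group, true_emotion, chosen_emotion (one row per trial)."""
    return {
        group: pd.crosstab(sub["true_emotion"], sub["chosen_emotion"],
                           normalize="index")  # row-normalized error rates
        for group, sub in df.groupby("group")
    }
```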
47

Werner, S., and G. N. Petrenko. "Speech Emotion Recognition: Humans vs Machines." Discourse 5, no. 5 (December 18, 2019): 136–52. http://dx.doi.org/10.32603/2412-8562-2019-5-5-136-152.

Abstract:
Introduction. The study focuses on emotional speech perception and speech emotion recognition using prosodic cues alone. Theoretical problems of defining prosody, intonation, and emotion, along with the challenges of emotion classification, are discussed. An overview of acoustic and perceptual correlates of emotions found in speech is provided. Technical approaches to speech emotion recognition are also considered in the light of the latest automatic emotional speech classification experiments. Methodology and sources. The typical "big six" classification commonly used in technical applications is chosen and modified to include the emotions of disgust and shame. A database of emotional speech in Russian was created under sound-laboratory conditions. A perception experiment was run in the experimental environment of the Praat software. Results and discussion. Cross-cultural emotion recognition possibilities are revealed, as the Finnish and international participants recognised about half of the samples correctly. Nonetheless, native speakers of Russian distinguished a larger proportion of the emotions correctly. The effects of foreign-language knowledge, musical training, and gender on performance in the experiment were insufficiently prominent. The most commonly confused pairs of emotions, such as shame and sadness, surprise and fear, and anger and disgust, as well as confusions with the neutral emotion, were also given due attention. Conclusion. The work can contribute to psychological studies, clarifying emotion classification and the gender aspect of emotionality; to linguistic research, providing new evidence for prosodic and comparative language studies; and to language technology, deepening the understanding of possible challenges for SER systems.
48

Ba, Shen, David Stein, Qingtang Liu, Taotao Long, Kui Xie, and Linjing Wu. "Examining the Effects of a Pedagogical Agent With Dual-Channel Emotional Cues on Learner Emotions, Cognitive Load, and Knowledge Transfer Performance." Journal of Educational Computing Research 59, no. 6 (February 8, 2021): 1114–34. http://dx.doi.org/10.1177/0735633121992421.

Abstract:
Despite the continuous emphasis on emotion in multimedia learning, it remains unclear how a pedagogical agent's emotional cues affect learning. In the present study, a between-subjects experiment was performed to examine the effects of a pedagogical agent with dual-channel emotional cues on learners' emotions, cognitive load, and knowledge-transfer performance. Participants from a central Chinese university (mean age = 21.26, N = 66) were randomly divided into three groups. These groups received instruction from an affective pedagogical agent, a neutral pedagogical agent, or a neutral voice narration without pedagogical agent embodiment. Results showed that learners assigned to the affective pedagogical agent reported a significantly higher emotional level than learners assigned to the neutral pedagogical agent. Learners' perceived task difficulty did not differ significantly among groups, while instructional efficiency was significantly higher for learners with the affective pedagogical agent. Moreover, learners assigned to the affective pedagogical agent performed significantly better on the knowledge-transfer test than those assigned to the neutral pedagogical agent or the neutral voice.
49

Santos, Isabel M., Pedro Bem-Haja, André Silva, Catarina Rosa, Diâner F. Queiroz, Miguel F. Alves, Talles Barroso, Luíza Cerri, and Carlos F. Silva. "The Interplay between Chronotype and Emotion Regulation in the Recognition of Facial Expressions of Emotion." Behavioral Sciences 13, no. 1 (December 31, 2022): 38. http://dx.doi.org/10.3390/bs13010038.

Abstract:
Emotion regulation strategies affect the experience and processing of emotions and emotional stimuli. Chronotype has also been shown to influence the processing of emotional stimuli, with late chronotypes showing a bias towards better processing of negative stimuli. Additionally, greater eveningness has been associated with increased difficulties in emotion regulation and with a preferential use of expressive suppression strategies. The present study therefore aimed to understand the interplay between chronotype and emotion regulation in the recognition of dynamic facial expressions of emotion. To that end, 287 participants answered self-report measures and performed an online facial emotion recognition task with short video clips in which a neutral face gradually morphed into a full-emotion expression (one of the six basic emotions). Participants were asked to press the spacebar to stop each video as soon as they could recognize the emotional expression, and then to identify it from six provided labels/emotions. Greater eveningness was associated with shorter response times (RT) in the identification of sadness, disgust, and happiness. Higher scores on expressive suppression were associated with longer RT in identifying sadness, disgust, anger, and surprise. Expressive suppression significantly moderated the relationship between chronotype and the recognition of sadness and anger, with chronotype being a significant predictor of emotion recognition times only at higher levels of expressive suppression. No significant effects were observed for cognitive reappraisal. These results are consistent with a negative bias in emotion processing in late chronotypes and with an increased difficulty in recognizing anger and sadness for morning-type expressive suppressors.
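The moderation result described above is typically tested with an interaction term in a regression; the sketch below shows one plausible formulation, with variable names assumed rather than taken from the paper.

```python
# Hedged sketch of a moderation test: recognition time regressed on
# chronotype, expressive suppression, and their interaction
# (column names and the OLS formulation are assumptions).
import pandas as pd
import statsmodels.formula.api as smf

def moderation_model(df: pd.DataFrame):
    """df columns assumed: rt_sadness, chronotype, suppression."""
    model = smf.ols("rt_sadness ~ chronotype * suppression", data=df).fit()
    # A significant chronotype:suppression coefficient indicates moderation.
    return model
```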
50

Nie, Chunyan, and Huiyu Wang. "Correlation study between chaos features of physiological signals and emotions based on the information gain method." Journal of Physics: Conference Series 2395, no. 1 (December 1, 2022): 012006. http://dx.doi.org/10.1088/1742-6596/2395/1/012006.

Abstract:
Clarifying the correlation between the chaos features of physiological signals and emotions plays an important role in improving the accuracy of emotion recognition. In this paper, the correlations of chaos features of physiological signals, such as the Lyapunov exponent, complexity, box dimension, and approximate entropy, with positive, negative, and neutral emotions are analyzed using the information gain method. On this basis, a Bagging-CHAID model is used for emotion recognition validation. The results show that the approximate entropy of physiological signals has the greatest correlation with positive, negative, and neutral emotions, followed by the complexity and Lyapunov exponent features, while the box dimension features have the least correlation with the three emotions. Approximate entropy can thus be chosen to improve emotion recognition; validation with the Bagging-CHAID model supported the accuracy of the correlation analysis.
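The feature-ranking step can be approximated with scikit-learn's mutual information estimator as a stand-in for the paper's information gain computation; the feature names mirror the abstract, and the data layout is assumed.

```python
# Sketch of the feature-ranking step: estimate the information shared
# between each chaos feature and the emotion label (data layout assumed).
import numpy as np
from sklearn.feature_selection import mutual_info_classif

FEATURES = ["lyapunov", "complexity", "box_dimension", "approx_entropy"]

def rank_features(X, y):
    """X: (n_trials, 4) chaos features; y: emotion labels (pos/neg/neutral)."""
    gains = mutual_info_classif(X, y, random_state=0)
    order = np.argsort(gains)[::-1]  # highest information gain first
    return [(FEATURES[i], float(gains[i])) for i in order]
```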