
Dissertations / Theses on the topic 'Emotion perception'


Consult the top 50 dissertations / theses for your research on the topic 'Emotion perception.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Taylor, Richard James. "Affective perception." Thesis, University of Oxford, 2010. http://ora.ox.ac.uk/objects/uuid:a5fe8467-c5e5-4cda-9875-ab46d67c4a62.

Full text
Abstract:
This thesis aims to present and defend an account of affective perception. The central argument seeks to establish three claims. 1) Certain emotional bodily feelings (and not just psychic feelings) are world-directed intentional states. 2) Their intentionality is to be understood in perceptual terms: such feelings are affective perceptions of emotional properties of a certain kind. 3) These ‘emotion-proper properties’ are response-dependent in a way that entails that appropriate affective responses to their token instances qualify, ipso facto, as perceptions of those instances. The arguments for (1) and (2) appeal directly to the phenomenology of emotional experience and draw heavily from recent research by Peter Goldie and Matthew Ratcliffe. By applying Goldie’s insights into the intentional structure of psychic feelings to the case of emotional bodily feelings, it is shown that certain of the latter—particularly those pertaining to the so-called ‘standard’ emotions—exemplify world-directed intentionality analogous to the perceptual intentionality of tactile feelings. Adapting Ratcliffe’s account of the analogy between tactile feelings and what he terms ‘existential feelings’, it is argued that standard emotional bodily feelings are at the same time intrinsically intentional world-directed perceptual states (affective perceptions) through which the defining properties of emotional objects (emotion-proper properties) are apprehended. The subsequent account of these properties endorses a response-dependence thesis similar to that defended by John McDowell and David Wiggins and argues that tokening an appropriate emotional affective state in response to a token emotion-proper property is both a necessary and a sufficient condition for perception of that property (Claim (3)). 
The central claim is thus secured by appeal both to the nature of the relevant feelings and the nature of the relevant properties (the former being intrinsically intentional representational states and the latter being response-dependent in a way that guarantees the perceptual status of the former).
APA, Harvard, Vancouver, ISO, and other styles
2

Gay, R. "Morality : Emotion, perception and belief." Thesis, University of Oxford, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.371649.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lawrie, Louisa. "Adult ageing and emotion perception." Thesis, University of Aberdeen, 2018. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=239235.

Full text
Abstract:
Older adults are worse than young adults at perceiving emotions in others. However, it is unclear why these age-related differences in emotion perception exist. The studies presented in this thesis investigated the cognitive, emotional and motivational factors influencing age differences in emotion perception. Study 1 revealed no age differences in mood congruence effects: sad faces were rated as more sad when participants experienced negative mood. In contrast, Study 2 demonstrated that sad mood impaired recognition accuracy for sad faces. Together, findings suggested that different methods of assessing emotion perception engage the use of discrete processing strategies. These mood influences on emotion perception are similar in young and older adults. Studies 3 and 4 investigated age differences in emotion perception tasks which are more realistic and contextualised than still photographs of facial expressions. Older adults were worse than young at recognising emotions from silent dynamic displays; however, older adults outperformed young in a film task that displayed emotional information in multiple modalities (Study 3). Study 4 suggested that the provision of vocal information was particularly beneficial to older adults. Furthermore, vocabulary mediated the relationship between age and performance on the contextual film task. However, age-related deficits in decoding basic emotions were established in a separate multi-modal video-based task. In addition, age differences in the perception of neutral expressions were also examined. Neutral expressions were interpreted as displaying positive emotions by older adults. Using a dual-task paradigm, Study 5 suggested that working memory processes are involved in decoding emotions. However, age-related declines in working memory were not driving age effects in emotion perception. Neuropsychological, motivational and cognitive explanations for these results are evaluated. 
Implications of these findings for older adults' social functioning are discussed.
APA, Harvard, Vancouver, ISO, and other styles
4

Araya, Jose Manuel. "Emotion and predictive processing : emotions as perceptions?" Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/33156.

Full text
Abstract:
In this thesis, I systematize, clarify, and expand the current theory of emotion based on the principles of predictive processing, the interoceptive inference view of emotion, so as to show the following: (1) as it stands, this view is problematic; (2) once expanded, the view in question can deal with its most pressing problems, and it compares favourably to competing accounts. Thus, the interoceptive inference view of emotion stands out as a plausible theory of emotion. According to the predictive processing (PP) framework, everything the brain does, in all its functions, is minimize its precision-weighted prediction error (PE) (Clark, 2013, 2016; Hohwy, 2013). Roughly, PE consists of the difference between the sensory signals expected (and generated) from the top down and the actual, incoming sensory signals. Now, in the PP framework, visual percepts are formed by minimizing visual PE in a specific manner: via visual perceptual inference. That is, the brain forms visual percepts in a top-down fashion by predicting its incoming lower-level sensory signals from higher-level models of the likely (hidden) causes of those visual signals. Such models can be seen as putting forward content-specifying hypotheses about the object or event responsible for triggering incoming sensory activity. A contentful percept is formed once a certain hypothesis successfully matches, and thus suppresses, the current lower-level sensory signals. In the interoceptive inference approach to interoception (Seth, 2013, 2015), the principles of PP have been extended to account for interoception, i.e., the perception of our homeostatic, physiological condition. Just as perception in the visual domain arises via visual perceptual inference, the interoceptive inference approach holds that perception of the inner, physiological milieu arises via interoceptive perceptual inference.
Now, what might be called the interoceptive inference theory of valence (ITV) holds that the interoceptive inference approach can be used to account for subjective feeling states in general, i.e., mental states that feel good or bad (valenced mental states). According to ITV, affective valence arises by way of interoceptive perceptual inference. On the other hand, what might be called the interoceptive inference view of emotion (IIE) holds that the interoceptive inference approach can be used to account for emotions per se (e.g., fear, anger, joy). More precisely, IIE holds that, in direct analogy to the way in which visual percepts are formed, emotions arise from interoceptive predictions of the causes of current interoceptive afferents. In other words, emotions per se amount to interoceptive percepts formed via higher-level, content-specifying emotion hypotheses. In this thesis, I aim to systematize, clarify, and expand the interoceptive inference approach to interoception in order to show that: (1) contrary to non-sensory theories of affective valence, valence is indeed constituted by interoceptive perceptions, and interoceptive percepts do arise via interoceptive perceptual inference; therefore, ITV holds. (2) Considering that IIE exhibits problematic assumptions, it should be amended. In this respect, I will argue that emotions do not arise via interoceptive perceptual inference (as IIE claims), since this assumes that there must be regularities pertaining to emotion in the physiological domain. I will suggest that emotions arise instead by minimizing interoceptive PE in another fashion. That is, emotions arise via external interoceptive active inference: by sampling and modifying the external environment in order to change an already formed interoceptive percept (which has itself been formed via interoceptive perceptual inference). That is, emotions are specific strategies for regulating affective valence.
More precisely, I will defend the view that a certain emotion E amounts to a specific strategy for minimizing interoceptive PE by way of a specific set of stored knowledge of the counterfactual relations that obtain between (possible) actions and their prospective interoceptive, sensory consequences ("if I act in this manner, interoceptive signals should evolve in such-and-such a way"). An emotion arises when such knowledge is applied in order to regulate valence.
APA, Harvard, Vancouver, ISO, and other styles
5

CARRETERO, MIGUEL RAMOS. "Expression of Emotion in Virtual Crowds: Investigating Emotion Contagion and Perception of Emotional Behaviour in Crowd Simulation." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-153966.

Full text
Abstract:
Emotional behaviour in the context of crowd simulation is a topic of growing interest in the area of artificial intelligence. Recent efforts in this domain have looked at modelling emotional emergence and social interaction inside a crowd of virtual agents, but further investigation is still needed in aspects such as the simulation of emotional awareness and emotion contagion. Also, many questions remain about the perception of emotional behaviour in the context of virtual crowds. This thesis investigates the current state of the art of emotional characters in virtual crowds and presents the implementation of a computational model able to generate expressive full-body motion behaviour and emotion contagion in a crowd of virtual agents. As a second part of the thesis, this project presents a perceptual study in which the perception of emotional behaviour is investigated in the context of virtual crowds. The results of this thesis reveal some interesting findings in relation to the perception and modelling of virtual crowds, including some relevant effects of emotional crowd behaviour on viewers, especially when virtual crowds are not the main focus of a particular scene. These results aim to contribute to the further development of this interdisciplinary area of computer graphics, artificial intelligence and psychology.
APA, Harvard, Vancouver, ISO, and other styles
6

Kosti, Ronak. "Visual scene context in emotion perception." Doctoral thesis, Universitat Oberta de Catalunya, 2019. http://hdl.handle.net/10803/667808.

Full text
Abstract:
Psychological studies show that scene context, in addition to facial expression and body language, lends important information that conditions our perception of people's emotions. However, the processing of context for automatic emotion recognition has not been explored in depth, partly due to the lack of sufficient data. In this thesis we present EMOTIC, a dataset of images of people in various natural scenarios annotated with their apparent emotion. The EMOTIC database combines two different types of emotion representation: (1) a set of 26 emotion categories, and (2) the continuous dimensions of valence, arousal and dominance. We also present a detailed statistical and algorithmic analysis of the dataset, along with an analysis of annotator agreement. CNN models are trained on EMOTIC, combining features of the person with features of the scene (context). Our results not only show how scene context contributes important information for automatically recognizing emotional states but also motivate further research in this direction.
APA, Harvard, Vancouver, ISO, and other styles
7

Bodnar, Andor L. "Sensory and Emotion Perception of Music." Thesis, University of Louisiana at Lafayette, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10268431.

Full text
Abstract:

The purpose of this study was to examine whether isolated musical chords and chord progressions are capable of communicating basic emotions (happiness, sadness, and fear) and sensory perceptions of tension and dissonance to eighty-two university students differing in musical expertise. Participants were recruited from ULL’s psychology and music department, and were divided into three different groups based on their formal training in music. Participants listened to forty-six music excerpts and were asked to identify and rate the emotions they felt each stimulus was attempting to convey. Participants were also asked to rate how much tension and dissonance they experienced after each excerpt.

The results demonstrated that major chord progressions played in fast tempo more readily expressed happiness than minor and chromatic chord progressions. Minor chord progressions played in slow tempo were associated with sadness and were rated higher in tension and dissonance than major chord progressions. Chromatic chord progressions, regardless of tempo, expressed fear most reliably, and received higher tension and dissonance ratings than major and minor chord progressions. Furthermore, results showed that isolated major chords were perceived as the least tense, the least dissonant, and the happiest sounding. Isolated minor chords readily conveyed sadness, and were perceived as more tense and dissonant than major chords. Additionally, isolated augmented and diminished chords were the most likely to express fear and were rated highest in tension and dissonance. Contrary to previous research findings, participants' level of musical expertise influenced sensory and emotion perception ratings: participants with three to four years of formal training outperformed experts and novices. Future research directions and possible applied implications of these findings are also discussed.

APA, Harvard, Vancouver, ISO, and other styles
8

Cheng, Linda. "Ethnic and Racial Differences in Emotion Perception." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/psych_hontheses/6.

Full text
Abstract:
This study analyzed racial differences in the way African Americans and Caucasians perceive emotion from facial expressions and tone of voice. Participants were African American (n=25) and Caucasian (n=26) college students. The study utilized 56 images of African American and Caucasian faces, balanced for race and sex, from the NimStim stimulus set (Tottenham, 2006), as well as visual and auditory stimuli from the DANVA2. Participants were asked to judge the emotion of each stimulus in the tasks. The BFRT, the WASI, and the Seashore Rhythm test were used as exclusionary criteria. In general, the study found few differences in the way African Americans and Caucasians perceived emotion, though racial differences emerged as an interaction with other factors. The results supported the theory of the universality of emotion perception and expression, though social influences affecting emotion perception remain a possibility. Areas of future research are discussed.
APA, Harvard, Vancouver, ISO, and other styles
9

Lim, Seung-Lark. "The role of emotion on visual perception." [Bloomington, Ind.] : Indiana University, 2009. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3358931.

Full text
Abstract:
Thesis (Ph.D.)--Indiana University, Dept. of Psychological and Brain Sciences, 2009.
Title from PDF t.p. (viewed on Feb. 10, 2010). Source: Dissertation Abstracts International, Volume: 70-05, Section: B, page: 3196. Adviser: Luiz Pessoa.
APA, Harvard, Vancouver, ISO, and other styles
10

Recio, Guillermo. "Perception of dynamic facial expressions of emotion." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2013. http://dx.doi.org/10.18452/16697.

Full text
Abstract:
Behavioral studies have shown that facial expressions of emotion unfolding over time provide information that benefits the recognition of emotional expressions, in comparison with static images. In line with the dynamic advantage hypothesis, neuroimaging studies have shown increased and wider activation while seeing dynamic expressions. The present dissertation aims to clarify the cognitive mechanism underlying this dynamic advantage and the specificity of this effect for the six facial expressions of emotion. Study 1 compared behavioral and brain cortical responses to dynamic and static expressions, looking for psychophysiological correlates of the dynamic advantage. Study 2 dealt with methodological issues regarding the timing of the stimuli and the dynamic neutral conditions. Study 3 tested the hypothesis that increasing the amount of movement in the expressions would increase the allocation of attention, and compared effects of intensity in both emotional and non-emotional movements. Study 4 focused on the question of emotion specificity of brain activation during emotion recognition. Results confirmed a dynamic advantage in the classification of expressions, presumably due to more efficient allocation of attention that improved perceptual processing. The effect increased gradually with the amount of motion, in both emotional and neutral expressions, indicating a perceptual bias to attend to facial movements. The enhancement was somewhat larger for happiness and reduced for surprise, but overall similar across all emotional expressions.
APA, Harvard, Vancouver, ISO, and other styles
11

Gendron, Maria Therese. "Relativity in the perception of emotion across cultures." Thesis, Boston College, 2013. http://hdl.handle.net/2345/bc-ir:104063.

Full text
Abstract:
Thesis advisor: Lisa Feldman Barrett
A central question in the study of human behavior is whether or not certain categories of emotion, such as anger, fear and sadness (termed "discrete emotions"), are universally recognized in the nonverbal behaviors of others (termed the "universality of attribution hypothesis"). In this dissertation, the universality of attribution hypothesis was revisited in order to examine whether individuals from remote cultural contexts perceive the same mental states in nonverbal cues as individuals from a Western cultural context. The studies described in this dissertation removed certain features of prior universality studies that served to obscure the underlying nature of cross-cultural perceptions. In Study 1, perceptions of posed emotional vocalizations by individuals from a US cultural context were compared to those of individuals from the Himba ethnic group, who reside in remote regions of Namibia and have limited contact with individuals outside their community. In contrast to recent data claiming to support the universality hypothesis, we did not find evidence that emotions were universally perceived when participants were asked to freely label the emotion they perceived in vocalizations. In contrast, our findings did support the hypothesis that the affective dimensions of valence and arousal are perceived across cultural contexts. In Study 2, emotion perceptions based on facial expressions were compared between participants from US and Himba cultural contexts. Consistent with the results of Study 1, Himba individuals did not perceive the Western discrete emotion categories that their US counterparts did. Our data did support the hypothesis that Himba participants were routinely engaging in action perception, rather than mental state inference. Across both cultural contexts, when conceptual knowledge about emotions was made more accessible by presenting emotion words as part of the task, perception was impacted.
In US participants, perceptions conformed even more strongly to the previously assumed "universal" model. Himba participants appeared to rely more on mental state categories when exposed to concepts, but a substantial amount of cultural variation was still observed. Finally, in Study 3, perceptions of emotion were examined in a US cultural context after the focus of participants was manipulated onto mental states (broadly), emotions, or behaviors. Perceptions of emotion did not differ substantially across these three conditions, indicating that within a US cultural context the tendency to infer mental states from facial expressions is somewhat inflexible. Overall, the findings of this dissertation indicate that emotion perception is both culturally and linguistically relative, and that attempts to apply the Western cultural model of emotion as a universal one obscure important cultural variation.
Thesis (PhD) — Boston College, 2013
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Psychology
APA, Harvard, Vancouver, ISO, and other styles
12

LaBass, Eric A. "Does Teaching Parents Emotion-Coaching Strategies Change Parental Perception of Children's Negative Emotions?" Antioch University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=antioch1453835425.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Cornew, Lauren A. "Emotion processing in the auditory modality the time course and development of emotional prosody recognition /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2008. http://wwwlib.umi.com/cr/ucsd/fullcit?p3330854.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2008.
Title from first page of PDF file (viewed December 11, 2008). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
14

Lambert, Hayley M. "Emotion Discrimination in Peripheral Vision." TopSCHOLAR®, 2018. https://digitalcommons.wku.edu/theses/2087.

Full text
Abstract:
The recognition accuracy of emotion in faces varies depending on the discrete emotion being expressed and the location of the stimulus. More specifically, emotion detection performance declines as facial stimuli are presented further out in the periphery. Interestingly, this is not always true for faces depicting happy emotional expressions, which can be associated with maintained levels of detection. The current study examined neurophysiological responses to emotional face discrimination in the periphery. Two event-related potentials (ERPs) that can be sensitive to the perception of emotion in faces, P1 and N170, were examined using EEG data recorded from electrodes at occipitotemporal sites on the scalp. Participants saw a face presented at 0°, 10°, or 20° of eccentricity and responded whether the face showed a specific emotion or was neutral. Results showed that emotion detection was higher when faces were presented at the center of the display than at 10° or 20° for both happy and angry expressions. Likewise, the voltage amplitude of the N170 component was greater when faces were presented at the center of the display than at 10° or 20°. Further exploration of the data revealed that high-intensity expressions were more easily detected at each location and elicited a larger-amplitude N170 than low-intensity expressions for both emotions. For a peripheral emotion discrimination task like the one employed in the current study, emotion cues seem to enhance face processing at peripheral locations.
APA, Harvard, Vancouver, ISO, and other styles
15

Richoz, Anne-Raphaëlle. "Vers la compréhension du traitement dynamique du visage humain." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAS002.

Full text
Abstract:
The human visual system is steadily stimulated by dynamic cues. Faces provide crucial information for adapted social interactions. From an evolutionary perspective, humans have been far more extensively exposed to dynamic faces, as static face images appeared only recently with the advent of photography and the expansion of digital tools. Yet most studies investigating face perception have relied on static faces, and little is known about the mechanisms involved in dynamic face processing. To clarify this issue, this thesis used dynamic faces to investigate different aspects of face processing in different populations and age groups. In Study 1, we used dynamic faces to investigate whether the ability of infants aged 6, 9, and 12 months to match audible and visible attributes of gender is influenced by the use of adult-directed (ADS) vs. infant-directed (IDS) speech. Our results revealed that from 6 months of age, infants matched female faces and voices when presented with ADS; with IDS, this ability emerged only at 9 months of age. Altogether, these findings support the idea that the perception of multisensory gender coherence is influenced by the nature of social interactions. In Study 2, we used a novel 4D technique to reconstruct the dynamic internal representations of the six basic expressions in a pure case of acquired prosopagnosia (i.e., a brain-damaged patient severely impaired in recognizing familiar faces), in order to re-examine the debated issue of whether identity and expression are processed independently. Our results revealed that our patient used all facial features to represent basic expressions, contrasting sharply with her suboptimal use of facial information for identity recognition. These findings support the idea that different sets of representations underlie the processing of identity and expression.
We then examined our patient's ability to recognize static and dynamic expressions using her internal representations as stimuli. Our results revealed that she was selectively impaired in recognizing many of the static expressions, whereas she displayed maximum accuracy in recognizing all the dynamic emotions with the exception of fear. The latter findings support recent evidence suggesting that separate cortical pathways, originating in early visual areas rather than in the inferior occipital gyrus, are responsible for the processing of static and dynamic face information. In Study 3, we investigated whether dynamic cues offer processing benefits for the recognition of facial expressions in other populations with immature or fragile face processing systems. To this aim, we conducted a large-sample cross-sectional study with more than 400 participants aged between 5 and 96 years, investigating their ability to recognize the six basic expressions presented under different temporal conditions. Consistent with previous studies, our findings revealed the highest recognition performance for happiness, regardless of age and experimental condition, as well as marked confusions among expressions with perceptually similar facial signals (e.g., fear and surprise). Using Bayesian modelling, we further quantified, for each expression and condition individually, the steepness of the increase and decrease in recognition performance, as well as the peak efficiency, the point at which observers' performance reaches its maximum before declining.
Finally, our results offered new evidence for a dynamic advantage in facial expression recognition, stronger for some expressions than others and more important at specific points in development. Overall, the results highlighted in this thesis underscore the critical importance of research featuring dynamic stimuli in face perception and expression recognition studies, not only in the field of prosopagnosia but also in other domains of developmental and clinical neuroscience.
APA, Harvard, Vancouver, ISO, and other styles
16

Longmore, Richard. "The understanding and perception of emotion in schizophrenia." Thesis, University of Leicester, 2002. http://hdl.handle.net/2381/31335.

Full text
Abstract:
Objectives: Research is beginning to examine the links between schizophrenia and social cognition - the processes people use to make sense of their social experience (Corrigan & Penn, 2001). Identifying emotion in other people is a vital social skill. A significant body of research shows that people with schizophrenia have problems judging facial emotions. However, an intact understanding of emotion concepts is usually assumed in such research. This thesis aims to establish (a) whether affect recognition difficulties in schizophrenia reflect problems in the understanding or perception of emotion, (b) the relationship of emotional understanding to social functioning, and (c) how far general cognition accounts for differences in affect recognition. Method: The study describes the validation of a set of vignettes that reliably imply specific emotions. These were then administered to participants with a diagnosis of schizophrenia (n = 60), nonclinical controls (n = 40) and learning disabled controls (n = 20). There were two experimental conditions. In the first, vignettes were paired with emotion words. In the second, they were matched with previously validated photographs of emotional facial expressions. A measure of intelligence was administered to all participants, and a social functioning scale was completed for participants with schizophrenia. Results: Participants with schizophrenia and learning disabled controls had significantly more difficulty than nonclinical controls in the understanding and perception of emotion. Once general intellectual functioning was taken into account, however, only the group with schizophrenia showed a differential deficit in affect recognition. No differential deficit was found in the perception of facial emotion in schizophrenia, although performance on some emotions was markedly low. There was no significant correlation between affect recognition ability and social functioning in schizophrenia.
Conclusion: People with schizophrenia have a specific and differential deficit in social understanding which is not wholly accounted for by general cognitive functioning.
APA, Harvard, Vancouver, ISO, and other styles
17

Cox, A. G. "Multimodal emotion perception from facial and vocal signals." Thesis, University of Cambridge, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598105.

Full text
Abstract:
The perception of emotion in other people is a fundamental part of social communication. Emotional expressions are often multimodal in nature and like human speech both auditory and visual components are used for comprehension. Up to this date however, the majority of emotion research has focused on the perception of emotion from facial or vocal expressions in isolation. This thesis investigated the behavioural and neural consequences of perceiving emotion from facial and vocal emotional signals simultaneously. Initial experiments demonstrated that a congruent, but unattended, vocal expression produced faster emotion-categorisation decisions to facial expressions, relative to incongruent or neutral voices. Similarly, simultaneously presented facial expressions had the same effect on the categorisation of vocal expressions. Subsequent experiments showed that other pairings of emotional stimuli (vocal expressions and emotion pictures; facial expressions and emotion pictures) did not have bi-directional effects on each other, but rather asymmetric effects that were consistent with interactions between these stimuli at post-perceptual stages of processing. Facial and vocal signals are naturalistic pairings, and evidence that these signals are integrated at a ‘perceptual’ level was provided by a final experiment using functional magnetic resonance imaging. Congruent facial-vocal pairings produced enhanced activity in the superior temporal sulcus; a region implicated in cross-modal integration of sensory inputs. The data from this thesis suggest that facial and vocal signals of emotion are automatically integrated at a perceptual processing stage to create a single unified percept to facilitate social communication.
APA, Harvard, Vancouver, ISO, and other styles
18

Zieber, Nicole R. "INFANTS’ PERCEPTION OF EMOTION FROM DYNAMIC BODY MOVEMENTS." UKnowledge, 2012. http://uknowledge.uky.edu/psychology_etds/5.

Full text
Abstract:
In humans, the capacity to extract meaning from another person’s behavior is fundamental to social competency. Adults recognize emotions conveyed by body movements with comparable accuracy to when they are portrayed in facial expressions. While infancy research has examined the development of facial and vocal emotion processing extensively, no prior study has explored infants’ perception of emotion from body movements. The current studies examined the development of emotion processing from body gestures. In Experiment 1, I asked whether 6.5-month-old infants would prefer to view emotional versus neutral body movements. The results indicate that infants prefer to view a happy versus a neutral body action when the videos are presented upright, but fail to exhibit a preference when the videos are inverted. This suggests that the preference for the emotional body movement was not driven by low-level features (such as the amount or size of the movement displayed), but rather by the affective content displayed. Experiments 2A and 2B sought to extend the findings of Experiment 1 by asking whether infants are able to match affective body expressions to their corresponding vocal emotional expressions. In both experiments, infants were tested using an intermodal preference technique: Infants were exposed to a happy and an angry body expression presented side by side while hearing either a happy or angry vocalization. An inverted condition was included to investigate whether matching was based solely upon some feature redundantly specified across modalities (e.g., tempo). In Experiment 2A, 6.5-month-old infants looked longer at the emotionally congruent videos when they were presented upright, but did not display a preference when the same videos were inverted. In Experiment 2B, 3.5-month-olds tested in the same manner exhibited a preference for the incongruent video in the upright condition, but did not show a preference when the stimuli were inverted.
These results demonstrate that even young infants are sensitive to emotions conveyed by bodies, indicating that sophisticated emotion processing capabilities are present early in life.
APA, Harvard, Vancouver, ISO, and other styles
19

John, C. "Subliminal perception and the cognitive processing of emotion." Thesis, University of Reading, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.233155.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Vickhoff, Björn. "A perspective theory of music perception and emotion /." Göteborg : Göteborgs Universitet, 2008. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=016671611&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Paterson, Helena M. "The perception and cognition of emotion from motion." Thesis, Connect to e-thesis, 2002. http://theses.gla.ac.uk/1072/.

Full text
Abstract:
Thesis (Ph.D.) -- University of Glasgow, 2002.
Ph.D. thesis submitted to the Department of Psychology, University of Glasgow, 2002. Includes bibliographical references. Print version also available.
APA, Harvard, Vancouver, ISO, and other styles
22

Dalili, Michael Nader. "Investigating emotion recognition and evaluating the emotion recognition training task, a novel technique to alter emotion perception in depression." Thesis, University of Bristol, 2016. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.702458.

Full text
Abstract:
Rationale. Accurately recognising facial expressions of emotion is important in social interactions and for maintaining interpersonal relationships. While comparing evidence across studies is difficult, research suggests that depressed individuals show deficits in emotion recognition (ER). A possible explanation for these deficits is the biased perception of these expressions. Research suggests that the emotion recognition training task, a novel cognitive bias modification (CBM) technique, shows promise in improving affect in individuals with low mood. However, further work is necessary to evaluate its training effects. Finally, research in healthy individuals has been limited, with larger studies needed to determine the effects of participant and study characteristics and negative symptoms on ER performance. Methods. Using experimental methodologies such as meta-analysis and online recruitment and testing, the research conducted here reviews and contributes to ER research in healthy and depressed populations. This work also uses CBM paradigms, brain imaging, and randomised controlled trial design to evaluate the emotion recognition training task. Results. This research identifies a general ER deficit in depression, and across emotions except sadness. It also finds effects of presentation time and anxiety, but not sociodemographic characteristics or depression, on performance in healthy individuals. This work also indicates generalisation of emotion recognition training effects across identities, but only partial generalisation across emotions. Finally, it finds increased neural activity for happy faces following training in individuals with low mood. Conclusions. Overall, this thesis has contributed new evidence to understanding ER and factors influencing performance in healthy and depressed individuals. 
The work presented in this thesis has found partial generalisation of emotion recognition training effects and an increase in neural activation for happy faces following a course of training, resembling antidepressant treatment effects. These findings suggest emotion recognition training is a promising novel CBM technique that should continue being evaluated for use in treatment in conjunction with traditional methods such as cognitive behavioural therapy.
APA, Harvard, Vancouver, ISO, and other styles
23

Santorelli, Noelle Turini. "Perception of Emotion from Facial Expression and Affective Prosody." Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/psych_theses/17.

Full text
Abstract:
Real-world perception of emotion results from the integration of multiple cues, most notably facial expression and affective prosody. The use of incongruent emotional stimuli presents an opportunity to study the interaction between sensory modalities. Thirty-seven participants were exposed to audio-visual stimuli (Robins & Schultz, 2004) including angry, fearful, happy, and neutral presentations. Eighty stimuli contained matching emotions and 240 contained incongruent emotional cues. Matching emotions elicited a significant number of correct responses for all four emotions. Sign tests indicated that for most incongruent conditions, participants demonstrated a bias towards the visual modality. Despite these findings, specific incongruent conditions did show evidence of blending. Future research should explore an evolutionary model of facial expression as a means for behavioral adaptation and the possibility of an “emotional McGurk effect” in particular combinations of emotions.
APA, Harvard, Vancouver, ISO, and other styles
24

Buchanan, Joshua. "I Feel Your Pain: Social Connection and the Expression and Perception of Regret." Miami University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=miami1436928483.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Aldebot, Stephanie. "Neurocognition, Emotion Perception and Quality of Life in Schizophrenia." Scholarly Repository, 2009. http://scholarlyrepository.miami.edu/oa_theses/228.

Full text
Abstract:
Patients with schizophrenia have extremely high levels of depression and suicide (Carlborg et al., 2008); thus, a better understanding of factors associated with poor quality of life (QoL) for this population is sorely needed. A growing body of research suggests that cognitive functioning in schizophrenia may be a strong predictor of overall QoL (Green et al., 2000), but individual domains of QoL have not been examined. Indirect evidence also suggests that emotion perception may underlie the relationship between neurocognition and QoL, but this hypothesis has also yet to be tested. Using a sample of 92 clinically stable schizophrenia patients, the current study explores the relationship between neurocognition, namely attention and working memory, and the following subdomains of QoL: social, vocational, intrapsychic foundations and environmental engagement. The current study also examines whether emotion perception mediates this relationship. In partial support of hypotheses, patients with more deficits in working memory reported decreased Occupational QoL and, although only marginally significant, decreased Total QoL. There was also a trend for poorer working memory to be associated with poorer Intrapsychic Foundations QoL. Contrary to hypotheses, emotion perception was not found to mediate the relationship between working memory and QoL. Current findings suggest that interventions that specifically target working memory may also improve many other aspects of schizophrenia patients' QoL.
APA, Harvard, Vancouver, ISO, and other styles
26

Obeidi, Amer. "Emotion, Perception and Strategy in Conflict Analysis and Resolution." Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/2828.

Full text
Abstract:
Theoretical procedures are developed to account for the effect of emotion and perception in strategic conflict. The possibility principle facilitates modeling the effects of emotions on future scenarios contemplated by decision makers; perceptual graph models and a graph model system permit the decision makers (DMs) to experience and view the conflict independently; and perceptual stability analysis, which is based on individual- and meta-stability analysis techniques, is employed in analyzing graph model systems when the DMs have inconsistent perceptions. These developments improve the methodology of the Graph Model for Conflict Resolution by reconciling emotion, perception, and strategy to make predictions consistent with the actual unfolding of events.

Current research in neuroscience suggests that emotions are a necessary component of cognitive processes such as memory, attention, and reasoning. The somatic marker hypothesis, for example, holds that feelings are necessary to reasoning, especially during social interactions (Damasio, 1994, 2003). Somatic markers are memories of past emotions: we use them to predict future outcomes. To incorporate the effect of emotion in conflict, the underlying principle of Damasio's hypothesis is used in developing the possibility principle, which significantly expands the paradigm of the Graph Model for Conflict Resolution of Fang, Hipel, and Kilgour (1993).

State identification is a crucial step in determining future scenarios for DMs. The possibility principle is integrated into the modeling stage of the Graph Model by refining the method of determining feasible states. The possibility principle enables analysts and DMs to include emotion in a conflict model, without sacrificing the parsimonious design of the Graph Model methodology, by focusing attention on two subsets of the set of feasible states: hidden and potential states. Hidden states are logically valid, feasible states that are invisible because of the presence of negative emotions such as anger and fear; potential states are logically valid, feasible states that are invisible because of missing positive emotions. Dissipating negative emotions will make the hidden states visible, while expressing the appropriate positive emotions will make the potential states visible. The possibility principle has been applied to a number of real world conflicts. In all cases, eliminating logically valid states not envisioned by any DM simplifies a conflict model substantially, expedites the analysis, and makes it an intuitive and realistic description of the DMs' conceptualizations of the conflict.

A fundamental principle of the Graph Model methodology is that all DMs' directed graphs must have the same set of feasible states, which are integrated into a standard graph model. The possibility principle may modify the set of feasible states perceived by each DM according to his or her emotion, making it impossible to construct a single standard graph model. When logically valid states are no longer achievable for one or more DMs due to emotions, the apprehension of conflict becomes inconsistent, and resolution may become difficult to predict. Therefore, reconciling emotion and strategy requires that different apprehensions of the underlying decision problem be permitted, which can be accomplished using a perceptual graph model for each DM. A perceptual graph model inherits its primitive ingredients from a standard graph model, but reflects a DM's emotion and perception with no assumption of complete knowledge of other DMs' perceptions.

Each DM's perceptual graph model constitutes a complete standard graph model. Hence, conclusions drawn from a perceptual graph model provide a limited view of equilibria and predicted resolutions. A graph model system, which consists of a list of DMs' perceptual graph models, is defined to reconcile perceptions while facilitating conclusions that reflect each DM's viewpoint. However, since a DM may or may not be aware that other graph models differ from his or her own, different variants of graph model systems are required to describe conflicts. Each variant of graph model system corresponds to a configuration of awareness, which is a set of ordered combinations of DMs' viewpoints.

Perceptual stability analysis is a new procedure that applies to graph model systems. Its objective is to help an outside analyst predict possible resolutions, and gauge the robustness and sustainability of these predictions. Perceptual stability analysis takes a two-phase approach. In Phase 1, the stability of each state in each perceptual graph model is assessed from the point of view of the owner of the model, for each DM in the model, using standard or perceptual solution concepts, depending on the owner's awareness of others' perceptions. (In this research, only perceptual solution concepts for the 2-decision maker case are developed.) In Phase 2, meta-stability analysis is employed to consolidate the stability assessments of a state in all perceptual graph models and across all variants of awareness. Distinctive modes of equilibria are defined, which reflect incompatibilities in DMs' perceptions and viewpoints but nonetheless provide important insights into possible resolutions of conflict.

The possibility principle and perceptual stability analysis are integrative techniques that can be used as a basis for empathetically studying the interaction of emotion and reasoning in the context of strategic conflict. In general, these new techniques expand current modeling and analysis capabilities, thereby facilitating realistic, descriptive models without exacting too great a cost in modeling complexity. In particular, these two theoretical advances enhance the applicability of the Graph Model for Conflict Resolution to real-world disputes by integrating emotion and perception, common ingredients in almost all conflicts.

To demonstrate that the new developments are practical, two illustrative applications to real-world conflicts are presented: the US-North Korea conflict and the confrontation between Russia and Chechen Rebels. In both cases, the analysis yields new strategic insights and improved advice.
APA, Harvard, Vancouver, ISO, and other styles
27

Balkwill, Laura-Lee. "Perception of emotion in music a cross-cultural investigation /." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape15/PQDD_0035/MQ27332.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Yates, Alan J. "The role of attention and awareness in emotion perception." Thesis, University of Essex, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.496279.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Christy, Anita Marie. "The Effects of Attributed Gender on Adult Emotion Perception." Thesis, Boston College, 2004. http://hdl.handle.net/2345/446.

Full text
Abstract:
Thesis advisor: James Russell
Adults' gender stereotypes of emotion have been investigated with a variety of methods, but those methods do not provide a strong test of the stereotype: The participants were presented only with cues to the gender or to the emotion; or when both cues were available, gender was confounded with poser. This study examined the effects of attributed gender on adults' perception of emotion in facial expressions and stories when presented with clear versus ambiguous cues to both emotion and gender. College students (n = 90) were first asked to label the emotion of either a man (Timothy) or a woman (Sophia) with identical prototypical and “mixed” facial expressions and, separately, to freely label stories about emotions. The same students were then to choose from a list of ten emotion labels the one that best described the protagonist's emotion for the same stimuli. Results showed that, for ambiguous cues to emotion, participants labeled facial expressions according to gender stereotypes. However, for the stimuli with clear cues to both emotion and gender of the poser, a reverse effect of gender stereotypes was observed for anger, fear, shame, and compassion due to an expectancy violation.
Thesis (BA) — Boston College, 2004
Submitted to: Boston College. College of Arts and Sciences
Discipline: Psychology
Discipline: College Honors Program
APA, Harvard, Vancouver, ISO, and other styles
30

Mollet, Gina Alice. "Neuropsychological Effects of Hostility and Pain on Emotion Perception." Diss., Virginia Tech, 2006. http://hdl.handle.net/10919/26432.

Full text
Abstract:
Recent research on the neuropsychology of emotion and pain has indicated that emotion and pain are complex processes that may substantially influence each other. Disorders of negative emotion and pain are known to co-occur (Delgado, 2004); however, it is not clear whether negative emotional conditions lead to pain or whether increased pain experiences lead to negative emotion. Further, certain negative emotions, such as hostility or anger, may produce differential effects on the experience of pain, such that they may lead to an increase in pain or a decrease in pain. An increase or decrease in pain perception may lead to altered behavioral, cognitive, and neuropsychological effects in high hostility. In order to more clearly examine the aforementioned relationships, the current experiment examined auditory emotion perception before and after cold pressor pain in high and low hostile men. Additionally, quantitative electroencephalography (QEEG) was used to measure changes in cerebral activation as a result of auditory emotion perception and cold pressor pain. Results indicated that identification of emotion post-cold pressor differed as a function of hostility level and ear. The high hostile group increased identification of stimuli at the right ear after cold pressor exposure, while the low hostile group increased identification of stimuli at the left ear after cold pressor exposure. Primary QEEG findings indicated increased left temporal activation after cold pressor exposure and increased reactivity to cold pressor pain in the high hostile group. Low hostile men had a bilateral increase in high beta magnitude at the temporal lobes and a bilateral increase in delta magnitude at the frontal lobes after the cold pressor. Results suggest decreased cerebral laterality and left hemisphere activation for emotional and pain processing in high hostile men.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
31

Heck, Alison, Alyson Chroust, Hannah White, Rachel Jubran, and Ramesh S. Bhatt. "Development of Body Emotion Perception in Infancy: From Discrimination to Recognition." Digital Commons @ East Tennessee State University, 2018. https://dc.etsu.edu/etsu-works/2730.

Full text
Abstract:
Research suggests that infants progress from discrimination to recognition of emotions in faces during the first half year of life. It is unknown whether the perception of emotions from bodies develops in a similar manner. In the current study, when presented with happy and angry body videos and voices, 5-month-olds looked longer at the matching video when they were presented upright but not when they were inverted. In contrast, 3.5-month-olds failed to match even with upright videos. Thus, 5-month-olds but not 3.5-month-olds exhibited evidence of recognition of emotions from bodies by demonstrating intermodal matching. In a subsequent experiment, younger infants did discriminate between body emotion videos but failed to exhibit an inversion effect, suggesting that discrimination may be based on low-level stimulus features. These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months. This pattern of development is similar to face emotion knowledge development and suggests that both the face and body emotion perception systems develop rapidly during the first half year of life.
APA, Harvard, Vancouver, ISO, and other styles
32

Duval, Céline. "Pain perception in schizophrenia, and relationships between emotion and visual organization : is emotion flattened in patients, and how does it affect cognition?" Thesis, Strasbourg, 2014. http://www.theses.fr/2014STRAJ052/document.

Full text
Abstract:
Schizophrenia is a severe mental illness affecting 1% of the population and comprises positive symptoms (hallucinations), negative symptoms (blunted affect), and cognitive deficits. Here we describe two distinct studies which address the question of how emotion and cognition interact, in healthy subjects and in schizophrenia. In the first study we created a paradigm that shows how emotional stimuli distract subjects and thus interfere with the organization of visual stimuli; emotional images can divert attention to the point of reversing automatic grouping effects. The effect is the same in patients and healthy controls. In our second study we explored pain perception by taking into account the different mechanisms involved, especially emotion processing. The results show that patients are more sensitive to pain than healthy controls, as they present an elevated P50 after painful stimulation, which indicates hypersensitivity at a very early stage of processing. Both studies reveal that patients are more sensitive to emotional and painful stimuli than previously thought, which should be taken into account when caring for patients in hospitals and in everyday life.
APA, Harvard, Vancouver, ISO, and other styles
33

Bellegarde, Lucille Gabrielle Anna. "Perception of emotions in small ruminants." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/25915.

Full text
Abstract:
Animals are sentient beings, capable of experiencing emotions. Being able to assess emotional states in farm animals is crucial to improving their welfare. Although the function of emotion is not primarily for communication, the outward expression of an emotional state involves changes in posture, vocalisations, odours and facial expressions. These changes can be perceived and used as indicators of emotional state by other animals. Since emotions can be perceived between conspecifics, understanding how emotions are identified and how they can spread within a social group could have a major impact on improving the welfare of farmed species, which are mostly reared in groups. A recently developed method for the evaluation of emotions in animals is based on cognitive biases such as judgment biases, i.e. an individual in a negative emotional state will show pessimistic judgments while an individual in a positive emotional state will show optimistic judgments. The aims of this project were to (A) establish whether sheep and goats can discriminate between images of faces of familiar conspecifics taken in different positive and negative situations, (B) establish whether sheep and goats perceive the valence (positive or negative) of the emotion expressed by the animal on the image, (C) validate the use of images of faces in cognitive bias studies. The use of images of faces of conspecifics as emotional stimuli was first validated, using a discrimination task in a two-armed maze. A new methodology was then developed across a series of experiments to assess spontaneous reactions of animals exposed to video clips or to images of faces of familiar conspecifics. Detailed observations of ear postures were used as the main behavioural indicator. Individual characteristics (dominance status within the herd, dominance pairwise relationships and human-animal relationship) were also recorded during preliminary tests and included in the analyses.
The impact of a low-mood state on the perception of emotions was assessed in sheep after subjecting half of the animals to unpredictable negative housing conditions and keeping the other half in good standard housing conditions. Sheep were then presented with videos of conspecifics filmed in situations of varying valence. Reactions to ambiguous stimuli were evaluated by presenting goats with images of morphed faces. Goats were also presented with images of faces of familiar conspecifics taken in situations of varying emotional intensity. Sheep could discriminate images of faces of conspecifics taken either in a negative or in a neutral situation, and their learning process of the discrimination task was affected by the type of emotion displayed. Sheep reacted differently depending on the valence of the video clips (P < 0.05); however, there was no difference between the control and the low-mood groups (P > 0.05). Goats also showed different behavioural reactions to images of faces photographed in different situations (P < 0.05), indicating that they perceived the images as different. Responses to morphed images were not necessarily intermediate to responses to negative and positive images, nor were they gradual, which poses a major problem for the potential use of facial images in cognitive bias experiments. Overall, animals were more attentive towards images or videos of conspecifics in negative situations, i.e., presumably, in a negative emotional state. This suggests that sheep and goats are able to perceive the valence of the emotional state. The identity of the individual on the photo also affected the animals’ spontaneous reaction to the images. Social relationships such as dominance, but also affinity between the tested and photographed individual, seem to influence emotion perception.
APA, Harvard, Vancouver, ISO, and other styles
34

Heck, Alison Rae. "Effects of Motion on Infants' Negativity Bias in Emotion Perception." Thesis, Virginia Tech, 2012. http://hdl.handle.net/10919/40540.

Full text
Abstract:
The negativity bias is a phenomenon that is characterized by infants being more influenced by, attending more to, and responding more to negative emotion information from the environment than positive emotion information. This study used a Tobii© T60 eye-tracking system to examine differences in 8- to 12-month-old infants' latencies to disengage from a centrally-presented face for three different emotion conditions: happy, sad, and fear. The events also varied by motion type: static versus dynamic. Additionally, infants' locomotor experience and parental affect served as two additional measures of experience, and were assessed for their contributions to the infants' negativity bias. It was expected that infants would show longer latencies to disengage from the negative emotion events (fear or sad) compared to the positive emotion event (happy), but also that the latencies would be augmented by event type (dynamic > static), locomotion experience (high > low), and parental affect (higher negativity > lower negativity). Although infants showed more attention to dynamic than static emotion displays (especially on the speaker's mouth), and more attention to happy and sad compared to fear displays, no consistent effect of emotion type was found on infants' attention disengagement. Thus, no evidence for a negativity bias was seen. The results are interpreted with respect to possible contributions of the bimodal nature of emotion expression in the current study as well as age-related attentional differences in responding to a wide range of emotion cues.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
35

Beck, Erika D. "The perception, experience and regulation of emotion : an fMRI approach /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC IP addresses, 2001. http://wwwlib.umi.com/cr/ucsd/fullcit?p3026379.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Grice-Jackson, Thomas. "Individual differences in the vicarious perception of pain." Thesis, University of Sussex, 2018. http://sro.sussex.ac.uk/id/eprint/80918/.

Full text
Abstract:
Vicarious pain refers to the processes and experiences that arise from observations of other people in pain. Due to the interpersonal and multi-modal nature of these processes, research into the field is highly relevant for a number of key concepts in social cognitive neuroscience, such as empathy, multi-sensory processing and social cognition. The dominant approach in the field has been to focus on normative samples with little focus being given to inter-individual differences. The discovery of a subsample of the population who report conscious experiences of pain when observing it, so called 'mirror-pain responders', presents a significant opportunity for developing our understanding of the neural processes and characteristics associated with vicarious pain. The present thesis aims to extend understanding of this group who appear to lie on an extreme end of a spectrum of vicarious pain perception. Although past research has highlighted this group and made some attempts to identify their prevalence, few formal attempts have been made to stringently discover the prevalence and identify the characteristics of their qualitative experience. As such, ARTICLE I developed a questionnaire, named the Vicarious Pain Questionnaire (VPQ), which characterises mirror-pain responders based on their subjective experiences of pain. The results showed a surprisingly high prevalence rate for the condition, ~30%. In addition through the use of a cluster analysis, the VPQ identified subgroups within mirror-pain responders, which included a group who experienced sensory and localised mirror-pain, and a group that experienced affective and generalised mirror-pain. 
ARTICLE I and ARTICLE II both aimed at assessing the neural basis for these experiences and successfully highlighted the role of hyperactivity in vicarious somatosensory processing, through the use of electrophysiological (EEG) neuro-markers for somatosensory processing (mu rhythm) and functional magnetic resonance imaging (fMRI) activation in the somatosensory cortex during pain observation. Additionally, these articles highlighted the role of self-other processing regions through the use of voxel-based morphometry (VBM), which revealed reduced grey-matter volume in the right temporo-parietal junction (rTPJ), and psycho-physiological interactions (PPI) of fMRI processing, which revealed connectivity networks between pain matrix regions and self-other processing regions (rTPJ and dorsomedial prefrontal cortex (DMPFC)). Characteristics of the mirror-pain were further assessed in ARTICLE III, in which a battery of behavioural and physiological tests was administered to mirror-pain responders and controls. This study showed abnormal autonomic nervous system processing for Affective/General mirror-pain responders and confirmed the link between the condition and questionnaire measures of empathy. Finally, ARTICLE IV failed to provide a causal link between self-other processing regions (rTPJ) and somatosensory activation in response to pain observations through the use of theta-burst transcranial magnetic stimulation (TMS) in non-responders. This calls into question the direct causality of neural mechanisms associated with self-other theories of mirror-pain. This thesis demonstrates the importance of studying inter-individual differences in vicarious pain by reporting a set of questionnaire and neuroimaging results which contribute to debates in the field and raises questions for future research. This work, its implications, and contributions to the wider literature are reviewed in the DISCUSSION chapter.
APA, Harvard, Vancouver, ISO, and other styles
37

Gerber, Ora. "Die persepsie en belewenis van emosionele selfregulering by 'n groep laatadolessente / Ora Gerber." Thesis, North-West University, 2007. http://hdl.handle.net/10394/757.

Full text
Abstract:
This study aims to investigate the perception and experience of emotion self-regulation in a group of late adolescents. An exploratory, qualitative survey design was used to collect data from a group of 54 Afrikaans-speaking late adolescents by means of a semi-structured emotion self-regulation questionnaire. Data were assessed by means of directed thematic content analysis (Hsieh & Shannon, 2005). It was established that participants primarily have a positive perception of emotions, and that more participants display higher levels of emotion awareness. However, despite this, most participants experience emotions negatively. At best, therefore, a balance is struck between the constructive and unconstructive handling of emotions. Throughout, it was endeavoured to relate the results to late adolescence as a developmental stage. Study conclusions include that emotion self-regulation in late adolescents is strongly influenced by uncertainty about the handling of emotions, self-consciousness with regard to emotions in a social context, and a lack of self-control. A few recommendations are made on the basis of these conclusions.
Thesis (M.A. (Psychology))--North-West University, Potchefstroom Campus, 2008.
APA, Harvard, Vancouver, ISO, and other styles
38

Mignault, Alain. "Connectionist models of the perception of facial expressions of emotion." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0019/NQ55360.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Milligan, Karen Victoria. "Attachment and depression, communication and perception of emotion through song." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/MQ53472.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Mignault, Alain 1962. "Connectionist models of the perception of facial expressions of emotion." Thesis, McGill University, 1999. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=36039.

Full text
Abstract:
Two connectionist models are developed that predict humans' categorization of facial expressions of emotion and their judgements of similarity between two facial expressions. For each stimulus, the models predict the subjects' judgement, the entropy of the response, and the mean response time (RT). Both models involve a connectionist component which predicts the response probabilities and a response generator which predicts the mean RT. The input to the categorization model is a preprocessed picture of a facial expression, while the hidden unit representations generated by the first model for two facial expressions constitute the input of the similarity model. The data collected on 45 subjects in a single-session experiment involving a categorization and a similarity task provided the target outputs to train both models. Two response generators are tested. The first, called the threshold model , is a linear integrator with threshold inspired from Lacouture and Marley's (1991) model. The second, called the channel model, constitutes a new approach which assumes a linear relationship between entropy of the response and mean RT. It is inspired by Lachman's (1973) interpretation of Shannon's (1948) entropy equation. The categorization model explains 50% of the variance of mean RT for the training set. It yields an almost perfect categorization of the pure emotional stimuli of the training set and is about 70% correct on the generalization set. A two-dimensional representation of emotions in the hidden unit space reproduces most of the properties of emotional spaces found by multidimensional scaling in this study as well as in other studies (e.g., Alvarado, 1996). The similarity model explains 53% of the variance of mean similarity judgements; it provides a good account of subjects' mean RT; and it even predicts an interesting bow effect that was found in subjects' data.
APA, Harvard, Vancouver, ISO, and other styles
41

Owen, Ann Lesley. "Development of tests of emotion-related learning in person perception." Thesis, Keele University, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.392158.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Halvorsen, Gunn Kristin. "Gender and cerebral lateralization of audio-visual perception of emotion." Thesis, Norges teknisk-naturvitenskapelige universitet, Psykologisk institutt, 2014. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-25305.

Full text
Abstract:
Presentation of brief (120ms, 160ms, 520ms) audio prosody, video, and audio-visual clips containing congruent emotion (anger, fear, happiness, neutral, sadness / positive and negative valence) was used in a divided visual field / dichotic listening behavioral experiment with 17 males and 17 females to investigate a possible gender difference in cerebral lateralization in the perception of emotion. Clips were created from the Montréal Affective Voices and the Montréal Pain and Affective Face Clips. Accuracy percentages of correct recognition of emotion were recorded. Findings showed no support for either the right-hemisphere or the valence hypothesis. Gender as a between-subjects factor was nonsignificant. Clips containing both audio and video had the highest accuracy score of all modalities. Audio-only prosody had a significantly lower accuracy score compared to video-only and audio-visual clips. Positive valence at the shortest length may have an early accuracy advantage compared to negative valence in the audio-visual modality that dissipates in the 120-160 ms range, with the accuracy difference disappearing between the categories. The same advantage can be found for anger, while happiness, fear and neutral show no significant differences in accuracy across lengths in the audio-visual modality.
APA, Harvard, Vancouver, ISO, and other styles
43

Metcalfe, Tim. "Perception of speech, music and emotion by hearing-impaired listeners." Thesis, University of Sheffield, 2017. http://etheses.whiterose.ac.uk/19151/.

Full text
Abstract:
The everyday tasks of perceiving speech, music and emotional expression via both of these media are made much more difficult in the case of hearing impairment. Chiefly, this is because relevant acoustic cues are less clearly audible, owing to both hearing loss in itself and the limitations of available hearing prostheses. This thesis focussed specifically on two such devices, the cochlear implant (CI) and the hearing aid (HA), and asked two overarching questions: how do users approach music and speech perception tasks, and how can performance be improved? The first part of the thesis considered auditory perception of emotion by CI users. In particular, the underlying mechanisms by which this population perform such tasks are poorly understood. This topic was addressed by a series of emotion discrimination experiments, featuring both normal-hearing (CI-simulated) participants and real CI users, in which listeners heard stimuli with processing designed to systematically attenuate different acoustic features. Additionally, a computational modelling approach was utilised in order to estimate participants' listening strategies, and whether or not these were optimal. It was shown that the acoustic features attended to by participants were a compromise of those generally better-preserved by the CI, and those particularly salient for each stimulus. In the latter half of the thesis, the nature of assessment of music perception by hearing-impaired listeners was considered. Speech perception has typically taken precedence in this domain which, it is argued, has left assessment of music perception relatively underdeveloped. This problem was addressed by the creation of a novel, psychoacoustical testing procedure, similar to those typically used with speech. This paradigm was evaluated via listening experiments with both HA users and CI-simulated listeners.
In general, the results indicated that the measure produced both valid and reliable results, suggesting the suitability of the procedure as both a clinical and experimental tool. Lastly, the thesis considered the consequences of the various findings for both research and clinical practice, contextualising the results with reference to the primary research questions addressed, and thereby highlighting what there is left to discover.
APA, Harvard, Vancouver, ISO, and other styles
44

Foster, Mary Kristin. "Perception of emotion in older adults with mild cognitive impairment." University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1289235238.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Fournier, Jody Stanton. "Beliefs that emotion and need states influence perception : developmental differences /." The Ohio State University, 1999. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488191124570332.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Veltri, Thersa. "The effects of nicotine on music-induced emotion and perception." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/15819/.

Full text
Abstract:
This thesis investigates why nicotine is often consumed in the context of music. Nicotine and music both independently increase physiological and emotional indices of arousal and pleasure; however, less is known about these responses when the two occur together. Study one tests the effects of nicotine on music-induced emotion in smokers and nonsmokers (n = 125) and overall finds trends indicative of additive effects (although nonsignificant) on the physiological and emotional responses of listeners. However, nonsmokers experienced negative side effects, such as a decrease in arousal and pleasure, due to their lack of tolerance for nicotine. To dissociate the effects of nicotine (e.g. increase in arousal, increase in pleasure), study two tests the effects of caffeine on music-induced emotion in smokers and nonsmokers (n = 120). Caffeine was predicted to only increase arousal without influencing pleasure, but increased both and had additive effects on the physiological and emotional responses to music. It is proposed that these additive effects occur through nicotine and caffeine’s ability to increase the reward value of other stimuli and through excitation transfer, where increased physiological arousal from pharmacological substances amplifies the emotions experienced during music listening. Following on from the above physiological studies, study three examines how nicotine affects auditory information processing in nonsmokers (n = 36) using ERP (event-related potential) techniques. Nicotine decreases habituation, reflected by an increase in the P2 amplitude in the frontal region. Nicotine therefore reduces listeners’ disengagement from repetition in music, thereby increasing familiarity and music-induced emotion.
These results agree with Dibben (2004) who found increased physiological arousal from exercise to intensify music-induced emotions and with Domino & Kishimoto (2002) who found nicotine to decrease habituation in nonsmokers during frequently occurring tones. Overall, this thesis suggests that music-induced emotion and musical engagement are enhanced as a result of nicotine consumption.
APA, Harvard, Vancouver, ISO, and other styles
47

Pye, A. "The perception of emotion and identity in non-speech vocalisations." Thesis, Bangor University, 2015. https://research.bangor.ac.uk/portal/en/theses/the-perception-of-emotion-and-identity-in-nonspeech-vocalisations(efff271d-3c3a-4a39-9ccb-b51cadb937e8).html.

Full text
Abstract:
The voice contains a wealth of information relevant for successful and meaningful social interactions. Aside from speech, the vocal signal also contains paralinguistic information such as the emotional state and identity of the speaker. The three empirical chapters reported in this thesis research the perceptual processing of paralinguistic vocal cues. The first set of studies uses unimodal adaptation to explore the mental representation of emotion in the voice. Using a series of different adaptor stimuli (human emotional vocalisations, emotive dog calls and affective instrumental bursts), it was found that aftereffects in human vocal emotion perception were largest following adaptation to human vocalisations. There was still an aftereffect present following adaptation to dog calls; however, it was smaller in magnitude than the human vocalisation aftereffect, potentially as a result of the acoustic similarities between adaptor and test stimuli. Taken together, these studies suggest that the mental representation of emotion in the voice is not species specific but is specific to vocalisations as opposed to all affective auditory stimuli. The second empirical chapter examines the supramodal relationship between identity and emotion in face-voice adaptation. It was found that emotional faces have the ability to produce aftereffects in vocal emotion perception, irrespective of whether the identities of the adaptor and test stimuli were congruent. However, this effect was found to be dependent upon the adapting stimuli being dynamic as opposed to static in nature. The final experimental chapter looks at the mechanisms underlying the perception of vocal identity. A voice matching test was developed and standardised, finding large individual differences in voice matching ability.
Furthermore, in an identity adaptation experiment, absolute difference in aftereffect size demonstrated a trend towards significance when correlated with voice matching ability, suggesting that there is a relationship between perceptual abilities and the degree of plasticity observed in response adaptation.
APA, Harvard, Vancouver, ISO, and other styles
48

Symons, Ashley. "Examining the role of temporal prediction in multisensory emotion perception." Thesis, University of Manchester, 2018. https://www.research.manchester.ac.uk/portal/en/theses/examining-the-role-of-temporal-prediction-in-multisensory-emotion-perception(a61f046f-df72-4469-9c4b-71705eb77c6a).html.

Full text
Abstract:
In naturalistic environments, emotion perception depends upon the integration of dynamic facial, body, and vocal expressions. Previous research suggests that emotional expressions are integrated more efficiently than neutral expressions. One possible mechanism facilitating multisensory integration of emotional expressions is cross-modal prediction, which can relate to the formal, spatial, or temporal structure of events. The aim of this thesis is to examine the role of temporal prediction in multisensory emotion perception. More specifically, the experiments presented test the hypothesis that the temporal information provided by emotional facial and body expressions facilitates the integration of subsequent vocal expressions, and explore factors such as attention and presentation mode that may influence this effect. To test this hypothesis, five experiments were conducted in which participants were presented with audiovisual clips of emotional (anger and fear) and neutral facial, body, and vocal expressions. In each experiment, the timing of the vocal expression was manipulated such that it could occur early, on-time, or late with respect to the natural sound onset. Of these five experiments, two behavioural studies were used to determine temporal sensitivity to asynchronies in audiovisual emotional expressions. Based on the results of these experiments, three electroencephalography (EEG) experiments were designed to investigate the role of temporal prediction in multisensory emotion perception when attention was directed to the emotion (Chapter 4), synchrony (Chapter 5), and interjection (Chapter 6) of the audiovisual clip. Moreover, in the final experiment, a mixed design was used to explore whether presentation mode modulates the effect of temporal prediction on multisensory emotion perception.
In each EEG experiment, both event-related potential (ERP) and time-frequency analyses were used to determine the effects of temporal prediction on neural indices of multisensory integration (ERP) and the role of oscillatory activity in the generation of multisensory temporal predictions and prediction errors. This complementary approach provided a more comprehensive perspective on the neural dynamics underpinning multisensory emotion perception. The results of these studies yielded several novel findings that provide a basis for future research investigating the role of temporal prediction in multisensory emotion perception. In Chapter 4, results from a temporal order judgment task show that individuals can reliably detect asynchronies of +/- 360 milliseconds (ms) in dynamic facial, body, and vocal expressions, but that this threshold may be smaller (indicating higher temporal sensitivity) for emotional compared to neutral stimuli. Results of the EEG experiment that follows shows that temporal prediction may facilitate the integration of fearful expressions, as reflected by modulation of the auditory N1 ERP component and pre-stimulus beta band activity. In Chapter 5, the results of a synchrony judgment task suggest that emotion improves temporal sensitivity for auditory-leading (early) asynchronies, and that this detection of auditory-leading asynchronies is accompanied by a decrease in alpha power for early compared to on-time and late conditions. In terms of ERPs, main effects of emotion and temporal prediction were observed, but no interaction. Finally, Chapter 6 showed an effect of emotion in the N1 time window and an effect of timing in the P2 time window but no significant interaction and no significant effects in the time-frequency domain. Collectively, the findings from these studies suggest that temporal prediction facilitates the integration of fearful expressions, but only when attention is directed to the emotional quality of the stimulus.
APA, Harvard, Vancouver, ISO, and other styles
49

Laukka, Petri. "Vocal Expression of Emotion : Discrete-emotions and Dimensional Accounts." Doctoral thesis, Uppsala : Acta Universitatis Upsaliensis, Uppsala universitet, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-4666.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Molina, Mariana V. "The Role of Contingency and Gaze Direction in the Emergence of Social Referencing." FIU Digital Commons, 2011. http://digitalcommons.fiu.edu/etd/504.

Full text
Abstract:
The current study assessed the importance of infant detection of contingency and head and eye gaze direction in the emergence of social referencing. Five- to six-month-old infants’ detection of affect-object relations and subsequent manual preferences for objects paired with positive expressions were assessed. In particular, the role of contingency between toys’ movements and an actress’s emotional expressions as well as the role of gaze direction toward the toys’ location were examined. Infants were habituated to alternating films of two toys each paired with an actress’s affective expression (happy and fearful) under contingent or noncontingent and gaze congruent or gaze incongruent conditions. Results indicated that gaze congruence and contingency between toys’ movements and a person’s affective expressions were important for infant perception of affect-object relations. Furthermore, infant perception of the relation between affective expressions and toys translated to their manual preferences for the 3-dimensional toys. Infants who received contingent affective responses to the movements of the toys spent more time touching the toy that was previously paired with the positive expression. These findings demonstrate the role of contingency and gaze direction in the emergence of social referencing in the first half year of life.
APA, Harvard, Vancouver, ISO, and other styles