Dissertations / Theses on the topic 'Emotional face'

To see the other types of publications on this topic, follow the link: Emotional face.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Emotional face.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses across a wide variety of disciplines and organise your bibliography correctly.

1

Neth, Donald C. "Facial configuration and the perception of facial expression." Columbus, Ohio : Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1189090729.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

MATTAVELLI, GIULIA CAMILLA. "Neural correlates of face evaluation: emotional expressions and social traits." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2013. http://hdl.handle.net/10281/43782.

Full text
Abstract:
Face processing is a crucial skill for human interaction. Accordingly, it is supported by a widely distributed fronto-temporo-occipital neural circuit (Haxby et al., 2000). The present work investigates the neural correlates of face expression processing by means of different neuroimaging and electrophysiological techniques. Using fMRI, I investigated amygdala responses to basic emotions and activations in face-selective regions in response to social cues detected in faces (Study 1 and Study 2). These studies showed that the amygdala is highly responsive to fear expressions but also has a critical role in appraising socially relevant stimuli; together with the posterior face-selective regions, it is sensitive to face distinctiveness as well as to the social meaning of face features. In Study 3, I demonstrated by means of TMS that the medial prefrontal cortex (mPFC) contains different neural representations for angry and happy expressions linked to lexical knowledge of emotions. Finally, the combined TMS-EEG experiment reported in Study 4 revealed interconnections between activity in the core and the extended system of face processing, and these interactions proved to be modulated by the type of behavioural task. Taken together, the present results help to clarify the role of different regions as part of the face perception system and suggest that the coupling between cortical areas and the coordinated activity of different regions in the distributed network are crucial for recognizing the multiplicity of information that faces convey.
APA, Harvard, Vancouver, ISO, and other styles
3

Shostak, Lisa. "Social information processing, emotional face recognition and emotional response style in offending and non-offending adolescents." Thesis, King's College London (University of London), 2007. https://kclpure.kcl.ac.uk/portal/en/theses/social-information-processing-emotional-face-recognition-and-emotional-response-style-in-offending-and-nonoffending-adolescents(15ff1b2d-1e52-46b7-be1a-736098263ce1).html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Siino, Rosanne Marie. "Emotional engagement on geographically distributed teams : exploring interaction challenges in mediated versus face-to-face meetings /." May be available electronically, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Palumbo, Letizia. "Beyond face value : involuntary emotional anticipation in typical development and Asperger's syndrome." Thesis, University of Hull, 2012. http://hydra.hull.ac.uk/resources/hull:6229.

Full text
Abstract:
Understanding and anticipating the behavior and associated mental and emotional states of others is crucial for successful social interactions. Typically developed (TD) humans rely on the processing and integration of the social cues that accompany others' actions to make, either implicitly or explicitly, inferences about others' mental states. Interestingly, the attribution of affective or mental states to the agent can in turn (top-down) induce distortions in the visual perception of those actions (Hudson, Liu, & Jellema, 2009; Hudson & Jellema, 2011; Jellema, Pecchinenda, Palumbo, & Tan, 2011). The aim of this thesis was to investigate bottom-up and top-down influences on distortions in the perception of dynamic facial expressions and to explore the role those biases may play in action/emotion understanding.
APA, Harvard, Vancouver, ISO, and other styles
6

Merz, Sabine (Psychology, Faculty of Science, UNSW). "Face emotion recognition in children and adolescents; effects of puberty and callous unemotional traits in a community sample." Publisher: University of New South Wales. Psychology, 2008. http://handle.unsw.edu.au/1959.4/41247.

Full text
Abstract:
Previous research suggests that as well as behavioural difficulties, a small subset of aggressive and antisocial children show callous unemotional (CU) personality traits (i.e., lack of remorse and absence of empathy) that set them apart from their low-CU peers. These children have been identified as being most at risk of following a path of severe and persistent antisocial behaviour, showing distinct behavioural patterns, and have been found to respond less to traditional treatment programs. One particular focus of this thesis arises from emerging findings of emotion recognition deficits within both groups. Whereas children who only show behavioural difficulties (in the absence of CU traits) have been found to misclassify vague and neutral expressions as anger, the presence of CU traits has been associated with an inability to correctly identify fear and, to a lesser extent, sadness. Furthermore, emotion recognition competence varies with age and development. In general, emotion recognition improves with age, but, interestingly, there is some evidence that it may become less efficient during puberty. No research could be located, however, that assessed emotion recognition through childhood and adolescence for children high and low on CU traits and antisocial behaviour. The primary focus of this study was to investigate the impact of these personality traits and pubertal development on emotion recognition competence in isolation and in combination. A specific aim was to assess if puberty would exacerbate these deficits in children with pre-existing deficits in emotion recognition. The effect of gender, emotion type and measure characteristics, in particular the age of the target face, was also examined.
A community sample of 703 children and adolescents aged 7-17 was administered the Strengths and Difficulties Questionnaire to assess adjustment, the Antisocial Process Screening Device to assess antisocial traits, and the Pubertal Development Scale to evaluate pubertal stage. Empathy was assessed using the Bryant Index of Empathy for Children and Adolescents. Parents or caregivers completed parent versions of these measures for their children. Emotion recognition ability was measured using the newly developed UNSW FACES task (Dadds, Hawes & Merz, 2004). A description of the development and validation of this measure is included. Contrary to expectations, emotion recognition accuracy was not negatively affected by puberty. In addition, no overall differences in emotion recognition ability were found due to participants' gender or target face age group characteristics. The hypothesis that participants would be better at recognising emotions expressed by their own age group was therefore not supported. In line with expectations, significant negative associations between CU traits and fear recognition were found. However, these were small and, contrary to expectations, were found for girls rather than boys. Also, puberty did not exacerbate emotion recognition deficits in high-CU children. However, the relationship between CU traits and emotion recognition was affected differently by pubertal status. The implications of these results are discussed in relation to future research into emotion recognition deficits within this population. In addition, theoretical and practical implications of these findings for the development of antisocial behaviour and the treatment of children showing CU traits are explored.
APA, Harvard, Vancouver, ISO, and other styles
7

Garrod, Oliver G. B. "Mapping multivariate measures of brain response onto stimulus information during emotional face classification." Thesis, University of Glasgow, 2010. http://theses.gla.ac.uk/1662/.

Full text
Abstract:
The relationship between feature processing and visual classification in the brain has been explored through a combination of reverse correlation methods (i.e., “Bubbles” [22]) and electrophysiological measurements (EEG) taken during a facial emotion categorization task [63]. However, in the absence of any specific model of the brain response measurements, this and other [60] attempts to parametrically relate stimulus properties to measurements of brain activation are difficult to interpret. In this thesis I consider a blind, data-driven model of brain response. Statistically independent model parameters are found to minimize the expectation of an objective likelihood function over time [55], and a novel combination of methods is proposed for separating the signal from the noise. The model's estimated signal parameters are then objectively rated by their ability to explain the subject's performance during a facial emotion classification task, and also by their ability to explain the stimulus features, as revealed in a Bubbles experiment.
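The abstract does not name its decomposition algorithm, but fitting statistically independent parameters to multichannel brain recordings is the territory of independent component analysis. As an illustration only, and not the thesis's actual model, here is a minimal symmetric FastICA iteration in NumPy that unmixes two synthetic signals; the signals, mixing matrix, and sizes are all invented for the demo:

```python
# Illustration only: a tiny symmetric FastICA (tanh contrast), the classic way
# to estimate statistically independent components from multichannel data.
# Nothing here reproduces the thesis's EEG model.
import numpy as np

def fastica_2d(X, iters=200, seed=0):
    """Unmix a 2-channel signal into 2 statistically independent components."""
    X = X - X.mean(axis=0)                                # center
    d, E = np.linalg.eigh(np.cov(X, rowvar=False))
    Z = X @ (E @ np.diag(1.0 / np.sqrt(d)) @ E.T)         # whiten: cov(Z) ~ I
    W = np.random.default_rng(seed).standard_normal((2, 2))
    for _ in range(iters):
        G = np.tanh(Z @ W.T)                              # contrast nonlinearity
        # Fixed-point update: w <- E[z g(w.z)] - E[g'(w.z)] w, per row of W
        W = (G.T @ Z) / len(Z) - np.diag((1.0 - G**2).mean(axis=0)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt                                        # symmetric decorrelation
    return Z @ W.T                                        # estimated sources

t = np.linspace(0.0, 8.0, 2000)
sources = np.c_[np.sin(2 * np.pi * 3 * t),                # smooth oscillation
                np.sign(np.sin(2 * np.pi * 7 * t))]       # square wave
mixing = np.array([[1.0, 0.6], [0.5, 1.1]])               # arbitrary channel mixing
estimated = fastica_2d(sources @ mixing.T)                # shape (2000, 2)
```

Each estimated column should match one source up to sign, scale, and ordering; in a pipeline like the one the abstract describes, such recovered components would then be rated against behavioural performance and stimulus features.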
APA, Harvard, Vancouver, ISO, and other styles
8

Porto, Juliana Antola. "Neural bases of emotional face processing in infancy : a functional near-infrared spectroscopy study." Pontifícia Universidade Católica do Rio Grande do Sul, 2017. http://tede2.pucrs.br/tede2/handle/tede/7867.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
The neural bases of facial emotion processing in infancy are largely unknown. The environmental factors that may impact facial processing and emotion recognition along the developmental course are also not clearly understood. However, early experiences, particularly involving consistent exposure to familiar caregiver faces, are believed to influence this course. The aim of this study was to investigate the neural correlates of infants' emotional face processing using functional near-infrared spectroscopy (fNIRS), and to examine the potential influence of infants' early emotional experiences, indirectly measured by investigating maternal anxiety symptoms. Participants were 29 typically developing 5-month-old infants and their mothers, recruited from a community sample from the greater Boston area, MA, USA. Maternal anxiety was assessed using the trait component of the State-Trait Anxiety Inventory. Infants observed static visual images of a female model portraying happy and fearful expressions, while hemodynamic brain responses were measured using fNIRS. The oxyhemoglobin (oxyHb) and deoxyhemoglobin (deoxyHb) responses over frontal, parietal and temporal areas were compared for the emotional expressions in infants of mothers reporting low and high levels of anxiety symptoms. Results revealed a significant main effect of emotion (p=.022), driven by greater oxyHb concentration responses for happy compared to fearful faces. There was also a main effect of region (p=.013), driven by a significantly greater oxyHb concentration in temporal compared to frontal cortical regions (p=.031). Additionally, a significant three-way interaction between emotion, hemisphere and anxiety was observed (p=.037). Planned comparisons revealed that infants of high-anxious mothers showed significantly greater left hemispheric activation of oxyHb to happy faces when compared with right (p=.040) and left (p=.033) hemispheric activation of oxyHb to fearful faces. 
These findings possibly indicate that 5-month-olds can discriminate happy from fearful faces, evinced by the greater activation for the former. The greater activation in temporal as compared to frontal areas was discussed in relation to the ontogenesis of face processing and emotion recognition neural networks. The enhanced response to happy versus fearful faces observed in infants of high-anxious mothers may be related to the presumed altered emotional environment experienced by these infants, compared to that of infants of low-anxious mothers. Therefore, maternal anxiety levels appeared to moderate infants' hemodynamic brain responses to emotional faces.
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Xu. "A new method for generic three dimensional human face modelling for emotional bio-robots." Thesis, University of Gloucestershire, 2012. http://eprints.glos.ac.uk/4592/.

Full text
Abstract:
Existing 3D human face modelling methods are confronted with difficulties in applying flexible control over all facial features and in generating a great number of different face models. The gap between existing methods and the requirements of emotional bio-robot applications urges the creation of a generic 3D human face model. This thesis focuses on proposing and developing two new methods involved in the research of emotional bio-robots: face detection in complex background images based on a skin colour model, and establishment of a generic 3D human face model based on NURBS. The contributions of this thesis are: A new skin-colour-based face detection method has been proposed and developed. The new method consists of a skin colour model for skin region detection and geometric rules for distinguishing faces from detected regions. Compared with previous methods, the new method achieved a better detection rate of 86.15% and a detection speed of 0.4-1.2 seconds without any training datasets. A generic 3D human face modelling method has been proposed and developed. This generic parametric face model allows flexible control over all facial features and can generate various face models for different applications. It includes: the segmentation of a human face into 21 feature surfaces, bounded by 34 boundary curves; this feature-based segmentation enables the independent manipulation of different geometrical regions of the human face. The NURBS curve face model and the NURBS surface face model: these two models are built up based on cubic NURBS reverse computation, and the elements of the curve and surface models can be manipulated to change the appearance of the models through parameters obtained by NURBS reverse computation. A new 3D human face modelling method has been proposed and implemented based on bi-cubic NURBS through analysing the characteristic features and boundary conditions of NURBS techniques. 
This model can be manipulated through control points on the NURBS facial features to build specific face models for any kind of appearance and to simulate dynamic facial expressions for various applications such as emotional bio-robots, aesthetic surgery, films and games, and crime investigation and prevention.
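The thesis's face model is not reproduced here, but the basic operation it rests on, evaluating a cubic NURBS curve so that control points and weights parametrically reshape the geometry, can be sketched. A minimal illustration with invented control points, weights, and a clamped knot vector (none of these values come from the thesis; the surface case adds a second parameter direction):

```python
# Illustration only: evaluating a cubic NURBS curve from control points,
# weights, and a knot vector. All numeric values are invented for the demo.

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: value of the i-th degree-p B-spline basis at u.
    Uses half-open intervals, so u must lie strictly below the last knot."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    value = 0.0
    denom = knots[i + p] - knots[i]
    if denom > 0:
        value += (u - knots[i]) / denom * bspline_basis(i, p - 1, u, knots)
    denom = knots[i + p + 1] - knots[i + 1]
    if denom > 0:
        value += (knots[i + p + 1] - u) / denom * bspline_basis(i + 1, p - 1, u, knots)
    return value

def nurbs_point(u, ctrl, weights, knots, p=3):
    """Evaluate a degree-p NURBS curve: rational (weighted) basis combination."""
    num = [0.0] * len(ctrl[0])
    den = 0.0
    for i, point in enumerate(ctrl):
        wb = weights[i] * bspline_basis(i, p, u, knots)
        den += wb
        num = [acc + wb * coord for acc, coord in zip(num, point)]
    return [coord / den for coord in num]

# A clamped cubic segment: 4 control points, heavier interior weights.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
weights = [1.0, 2.0, 2.0, 1.0]
knots = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]

start = nurbs_point(0.0, ctrl, weights, knots)  # lies on the first control point
mid = nurbs_point(0.5, ctrl, weights, knots)    # pulled up by the interior weights
```

Raising an interior weight pulls the curve toward that control point, which is the kind of parametric manipulation the abstract describes for reshaping facial feature geometry.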
APA, Harvard, Vancouver, ISO, and other styles
10

Ridout, Nathan. "Processing of emotional material in major depression : cognitive and neuropsychological investigations." Thesis, University of St Andrews, 2005. http://hdl.handle.net/10023/13141.

Full text
Abstract:
The aim of this thesis was to expand the existing knowledge base concerning the profile of emotional processing that is associated with major depression, particularly in terms of socially important non-verbal stimuli (e.g. emotional facial expressions). Experiment one utilised a face-word variant of the emotional Stroop task and demonstrated that depressed patients (DP) did not exhibit a selective attention bias for sad faces. Conversely, the healthy controls (HC) were shown to selectively attend to happy faces. At recognition memory testing, DP did not exhibit a memory bias for depression-relevant words, but did demonstrate a tendency to falsely recognise depression-relevant words that had not been presented at encoding. Experiment two examined the pattern of autobiographical memory (ABM) retrieval exhibited by DP and HC in response to verbal (words) and non-verbal (images & faces) affective cues. DP were slower than HC to retrieve positive ABMs, but did not differ from HC in their retrieval times for negative ABMs. Overall, DP retrieved fewer specific ABMs than did the HC. Participants retrieved more specific ABMs to image cues than to words or faces, but this pattern was only demonstrated by the HC. Reduced retrieval of specific ABMs by DP was a consequence of increased retrieval of categorical ABMs; this tendency was particularly marked when the participants were cued with faces. During experiment three, DP and HC were presented with a series of faces and were asked to identify the gender of the person featured in each photograph. Overall, gender identification times were not affected by the emotion portrayed by the faces. Furthermore at subsequent recognition memory testing, DP did not exhibit MCM bias for sad faces. 
During experiment four, DP and HC were presented with videotaped depictions of 'realistic' social interactions and were asked to identify the emotion portrayed by the characters and to make inferences about the thoughts, intentions and beliefs of these individuals. Overall, DP were impaired in their recognition of happiness and in understanding social interactions involving sarcasm and deception. Correct social inference was significantly related to both executive function and depression severity. Experiment five involved assessing a group of eight patients who had undergone neurosurgery for chronic, treatment-refractory depression on the identical emotion recognition and social perception tasks that were utilised in experiment four. Relative to HC, surgery patients (SP) exhibited general deficits on all emotion recognition and social processing tasks. Notably, depression status did not appear to interact with surgery status to worsen these observed deficits. These findings suggest that the anterior cingulate region of the prefrontal cortex may play a role in correct social inference. Summary: Taken together, the findings of the five experimental studies of the thesis demonstrate that, in general, biases that have been observed in DP processing of affective verbal material generalise to non-verbal emotional material (e.g. emotional faces). However, there are a number of marked differences that have been highlighted throughout the thesis. There is also evidence that biased emotional processing in DP requires explicit processing of the emotional content of the stimuli. Furthermore, a central theme of the thesis is that deficits in executive function in DP appear to be implicated in the impairments of emotional processing that are exhibited by these patients.
APA, Harvard, Vancouver, ISO, and other styles
11

Fasola, Christiana. "The Effects of Emotive Faces and Emotional Intelligence on Task Performance." Xavier University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=xavier1500252906257976.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Riehle, Marcel [Verfasser], and Tania Lincoln [Akademischer Betreuer]. "Interpersonal consequences of diminished emotional expressiveness in schizophrenia : an investigation of facial expressions within face-to-face interactions / Marcel Riehle ; Betreuer: Tania Lincoln." Hamburg : Staats- und Universitätsbibliothek Hamburg, 2016. http://d-nb.info/1116604744/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Beall, Paula M. "Automaticity and Hemispheric Specialization in Emotional Expression Recognition: Examined using a modified Stroop Task." Thesis, University of North Texas, 2002. https://digital.library.unt.edu/ark:/67531/metadc3267/.

Full text
Abstract:
The main focus of this investigation was to examine the automaticity of facial expression recognition through valence judgments in a modified photo-word Stroop paradigm. Positive and negative words were superimposed across male and female faces expressing positive (happy) and negative (angry, sad) emotions. Subjects categorized the valence of each stimulus. Gender biases in judgments of expressions (better recognition for male angry and female sad expressions) and the valence hypothesis of hemispheric advantages for emotions (left hemisphere: positive; right hemisphere: negative) were also examined. Four major findings emerged. First, the valence of expressions was processed automatically (robust interference effects). Second, male faces interfered with processing the valence of words. Third, no poser-gender biases were found. Finally, the emotionality of facial expressions and words was processed similarly by both hemispheres.
APA, Harvard, Vancouver, ISO, and other styles
14

Sergerie, Karine. "A face to remember : an fMRI study of the effects of emotional expression on recognition memory." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=82422.

Full text
Abstract:
Emotion can exert a modulatory role on declarative memory. Several studies have shown that emotional stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We conducted an event-related fMRI study in 18 healthy individuals (9 men) to investigate the effects of expression on recognition memory for faces. During the encoding phase, participants viewed 84 faces of different individuals depicting happy, fearful or neutral expressions. Subjects were asked to perform a gender discrimination task and to remember the faces for later. In the recognition phase, subjects performed an old/new decision task on 168 faces (84 new). Both runs were scanned. Our findings highlight the importance of the amygdala, hippocampus and prefrontal cortex in the formation and retrieval of memories with emotional content.
APA, Harvard, Vancouver, ISO, and other styles
15

Hadden, Alexis A. "Face Threat Mitigation in Feedback: An Examination of Student Apprehension, Self-Efficacy, and Perceived Emotional Support." UKnowledge, 2017. http://uknowledge.uky.edu/comm_etds/59.

Full text
Abstract:
This experimental study examined the effects of an instructor’s face threat mitigation tactics on student self-efficacy for learning and perceived emotional support from the instructor in a written feedback setting. Participants (N = 401) were randomly assigned to one of four feedback scenarios in which level of face threat mitigation and instructor age and status were manipulated. Student grade orientation and state feedback apprehension were measured prior to being exposed to the feedback scenario. Results indicate that high face threat mitigation is positively associated with student self-efficacy for learning and perceived emotional support from the instructor. Results also revealed that state feedback apprehension predicts self-efficacy for learning and perceived emotional support from the instructor. Grade orientation predicted self-efficacy for learning but did not significantly predict perceived emotional support from the instructor providing feedback. Finally, scenarios manipulated for instructor age and status did not significantly differ in self-efficacy for learning or perceived emotional support from the instructor. Implications regarding theory, the measurement of feedback apprehension, and student-instructor communication are discussed.
APA, Harvard, Vancouver, ISO, and other styles
16

MERMIER, JULIA. "PROCESSING EMOTIONAL FACES WITHIN CONTEXT: EVIDENCE FROM INFANCY AND CHILDHOOD." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2022. http://hdl.handle.net/10281/370570.

Full text
Abstract:
Facial expressions, by conveying information on individuals’ internal state and intentions, play an important role in social interactions. The idea that faces alone convey all the necessary information about the expresser’s emotional state in an unambiguous manner and independently of contextual factors was prevalent in the past decades (Calder et al., 1996; Smith et al., 2005) and drove the majority of literature on emotion perception to examine faces in isolation. Nonetheless, facial expressions are very rarely encountered in isolation in real life, and many recent adult studies indicate that the context in which they occur plays an essential part in their perception (for a review, see Aviezer et al., 2017; Wieser et al., 2014). Specifically, various forms of emotional and social context (e.g., emotional bodies or visual scenes, intrinsic social factors or past social experiences) were shown to have a significant influence on adults’ recognition, evaluation, and neural processing of facial expressions (Aviezer et al., 2017; Iidaka et al., 2010; Jack et al., 2012; Pickett et al., 2004; Righart & De Gelder, 2006). However, research investigating the influence of context on the processing of emotional faces in developmental populations is extremely scarce, and although it suggests that contextual effects are also present in infancy and childhood, only a small subset of contextual cues have been examined so far. Therefore, this doctoral dissertation aimed at providing a more comprehensive view of the influence of context on the processing of facial emotions at different developmental stages, by examining the effects of different contextual cues on the perception, neural processing and recognition of facial expressions in infants and children. The first part of this thesis focused on contextual emotional signals. 
Results indicated that the surrounding facial emotional context (Chapter 1) as well as emotional kinematic cues (Chapter 2) influenced 12-month-olds' attention and neural processing of emotional faces. The second part focused on contextual effects elicited by social cues in infants and children. The results showed that contextual cues of social inclusion and exclusion affected 13-month-old infants' neural processing of emotional faces (Chapter 3) as well as 5-year-olds', but not 7- or 10-year-olds', recognition of facial expressions (Chapter 4). Altogether, this thesis provides evidence that contextual effects can be elicited by various types of emotional and social cues (i.e., surrounding emotional faces, emotional kinematics, social inclusion and exclusion) in infants and children, and affect different levels of the processing of emotional faces (i.e., neural and behavioral). In addition, it suggests that these contextual effects vary as a function of the developmental stage of the perceiver (e.g., contextual effects were present only in 5-year-olds in Chapter 4). In sum, context seems to play an essential role in the processing of facial expressions in infancy and childhood, and should be granted particular attention in future developmental studies.
APA, Harvard, Vancouver, ISO, and other styles
17

Van Fossen, Laurel. "The communicative use of iconic face drawings to express emotional and evaluative statements in persons with aphasia." Thesis, California State University, Long Beach, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1603546.

Full text
Abstract:

The purpose of this study was to explore (1) whether persons with aphasia (PWA) might be able to easily extract emotional meaning from iconic facial drawings, (2) whether they are able and willing to use those drawings as a communicative tool to express emotion and evaluative statements with their communication partners, and (3) whether their responses differed from individuals with right hemisphere dysfunction (RHD). Ten persons with aphasia and seven persons with RHD participated in the study, along with two control groups of 34 neurotypical adults. The first phase of the study required 24 neurotypical adults to match twelve words describing various emotional states with the facial drawing most closely representing the word. Then, they were asked to copy six of the drawings as a baseline for drawing accuracy. The six drawings determined by the control group to have the least ambiguity of meaning were selected as stimuli for the experimental groups. In the second phase of the study, PWAs and persons with RHD were asked to match each drawing with a labeled photograph of a person with a similar facial expression. Next, to test their ability to produce these drawings, both stroke groups were asked to copy six of the facial drawings. Lastly, the two experimental groups completed a short, anonymous survey about the nature of their communication difficulties and their willingness to use drawing as a communicative tool. The resultant data were compared to a second control group of ten neurotypical adults, and then, to determine the best candidates for this proposed strategy, the two stroke groups were compared with each other. The results demonstrated that both persons with nonfluent aphasia and RHD were able to identify and copy the drawings with moderate success, although only the PWAs were willing to use drawing to communicate.

APA, Harvard, Vancouver, ISO, and other styles
18

Juth, Pernilla. "Finding the emotional face in the crowd and the role for threat-biased attention in social anxiety." Stockholm : Department of Clinical Neuroscience, Karolinska Institutet, 2010. http://diss.kib.ki.se/2010/978-91-7409-746-7/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

BRENNA, VIOLA. "Positive and negative facial emotional expressions: the effect on infants' and children's facial identity recognition." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2013. http://hdl.handle.net/10281/46845.

Full text
Abstract:
The aim of the present study was to investigate the origin and development of the interdependence between identity recognition and facial emotional expression processing, suggested by recent models of face processing (Calder & Young, 2005) and supported by findings in adults (e.g., Baudouin, Gilibert, Sansone, & Tiberghien, 2000; Schweinberger & Soukup, 1998). In particular, the effect of facial emotional expressions on infants' and children's ability to recognize the identity of a face was explored. Studies on adults describe different roles of positive and negative emotional expressions in identity recognition (e.g., Lander & Metcalfe, 2007): positive expressions have a catalytic effect, increasing ratings of a face's familiarity, whereas negative expressions reduce familiarity judgments, producing an interference effect. Using, respectively, a familiarization paradigm and a delayed two-alternative forced-choice matching-to-sample task, 3-month-old infants (Experiments 1, 2, 3) and 4- and 5-year-old children (Experiments 4, 5) were tested. Results of Experiments 1 and 2 suggested an adult-like pattern at 3 months of age. Infants familiarized with a smiling face recognized the new identity in the test phase, but when they were shown a woman's face conveying a negative expression, either anger or fear, they were not able to discriminate between the new and familiar face stimulus during the test. Moreover, evidence from Experiment 3 demonstrated that a single feature of a happy face (i.e., a smiling mouth or "happy eyes") is sufficient to drive the observed facilitative effect on identity recognition. Conversely, outcomes obtained in the experiments with pre-school-aged children suggested that both positive and negative emotions have a distracting effect on children's identity recognition. A decrement in children's performance was observed when faces displayed an emotional expression (i.e., happiness, anger, or fear) rather than a neutral expression (Experiment 4). 
This detrimental effect of a happy expression on face identity recognition emerged independently of the processing stage (i.e., encoding, recognition, or both encoding and recognition) at which emotional information was provided (Experiment 5). Overall, these findings suggest that, both in infancy and in childhood, facial emotional processing interacts with identity recognition. Moreover, the observed outcomes seem to describe a U-shaped developmental trend in the relation between identity recognition and facial emotional expression processing. The results are discussed with reference to Karmiloff-Smith's Representational Redescription model (1992).
APA, Harvard, Vancouver, ISO, and other styles
20

Cooper, Robbie Mathew. "The effects of eye gaze and emotional facial expression on the allocation of visual attention." Thesis, University of Stirling, 2006. http://hdl.handle.net/1893/128.

Full text
Abstract:
This thesis examines the way in which meaningful facial signals (i.e., eye gaze and emotional facial expressions) influence the allocation of visual attention. These signals convey information about the likely imminent behaviour of the sender and are, in turn, potentially relevant to the behaviour of the viewer. It is already well established that different signals influence the allocation of attention in different ways that are consistent with their meaning. For example, direct gaze (i.e., gaze directed at the viewer) is considered both to draw attention to its location and hold attention when it arrives, whereas observing averted gaze is known to create corresponding shifts in the observer’s attention. However, the circumstances under which these effects occur are not yet understood fully. The first two sets of experiments in this thesis tested directly whether direct gaze is particularly difficult to ignore when the task is to ignore it, and whether averted gaze will shift attention when it is not relevant to the task. Results suggest that direct gaze is no more difficult to ignore than closed eyes, and the shifts in attention associated with viewing averted gaze are not evident when the gaze cues are task-irrelevant. This challenges the existing understanding of these effects. The remaining set of experiments investigated the role of gaze direction in the allocation of attention to emotional facial expressions. Without exception, previous work looking at this issue has measured the allocation of attention to such expressions when gaze is directed at the viewer. Results suggest that while the type of emotional expression (i.e., angry or happy) does influence the allocation of attention, the associated gaze direction does not, even when the participants are divided in terms of anxiety level (a variable known to influence the allocation of attention to emotional expressions). 
These findings are discussed in terms of how the social meaning of the stimulus can influence preattentive processing. This work also serves to highlight the need for general theories of visual attention to incorporate such data. Not to do so fundamentally risks misrepresenting the nature of attention as it operates outside the laboratory setting.
APA, Harvard, Vancouver, ISO, and other styles
21

Mileva, Viktoria. "Social status in humans : differentiating the cues to dominance and prestige in men and women." Thesis, University of Stirling, 2016. http://hdl.handle.net/1893/23269.

Full text
Abstract:
Human social status has long been of interest to evolutionary and social psychologists. The question of who gets to control resources and be a leader has garnered a lot of attention from these and other fields, and this thesis examines evidence for there being two different mechanisms of achieving high status, and their correlates. The mechanisms are 1) Dominance: being aggressive, manipulative and forcing others to follow you, and 2) Prestige: possessing qualities which make others freely follow you. Chapter 1 is an introductory chapter in which I explain selection pressures, group formation, and the need for social hierarchies; I then describe the two proposed methods of attaining social status and how facial characteristics can give clues as to an individual’s social status. In Chapter 2, my first experimental chapter, I examined how faces created to appear either high in dominance or high in prestige were judged with respect to those traits as well as personality characteristics. Taking this further, in Chapter 3, I looked at how natural variation in real faces would reflect differences in other- and self-perceived ratings of dominance and prestige. Chapter 4 served to examine whether, given a set of words related to social status, I would find differences in what words were placed into dominant or prestige categories. Findings within these chapters are consistent with dominance and prestige being separable methods of attaining high status, from differences in facial appearance (Chapter 2 and 3), to personality characteristics (Chapter 2), to word usage (Chapter 4). Once I had established that these were two distinct routes to achieving high status, I chose to focus on dominance in Chapter 5 and explored the conceptual relationships between dominance and facial expressions. I found that manipulating perceptions of dominance affected how intense expressions of anger, sadness, and fear were perceived (Chapter 5). 
As there has been a paucity of research in the area of women’s social status, in Chapter 6, I went on to explore what effects cosmetics use in women would have on their perceived social status. I found differences in how men and women perceived women wearing cosmetics, which again points to a distinction between dominance and prestige. My thesis then presents a broad view of the two different mechanisms for attaining high status. Using new methods not otherwise used in exploring dominance and prestige I was able to explore correlates and indicators, as well as perceptions of both strategies. These findings will allow us to determine who might be capable of attaining social status, which of the two methods they might use, as well as what implicit associations we hold about each. They will also open doors for future research into the two strategies, and even help interpret previous research, as many previous studies simply relate to high status and do not distinguish between dominance and prestige.
APA, Harvard, Vancouver, ISO, and other styles
22

Hunter, Kirsten. "Affective Empathy in Children: Measurement and Correlates." Griffith University. School of Applied Psychology, 2004. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20040610.135822.

Full text
Abstract:
Empathy is a construct that plays a pivotal role in the development of interpersonal relationships, and thus one's ability to function socially and often professionally. The development of empathy in children is therefore of particular interest to allow for further understanding of normative and atypical developmental trajectories. This thesis investigated the assessment of affective empathy in children aged 5-12, through the development and comparison of a multimethod assessment approach. Furthermore, this thesis evaluated the differential relationships between affective empathy and global behavioural problems in children versus the presence of early psychopathic traits, such as callous-unemotional traits. The first component of this study incorporated: a measure of facial expression of affective empathy and self-reported experience of affective empathy, as measured by the newly designed Griffith Empathy Measure - Video Observation (GEM-VO) and the Griffith Empathy Measure - Self Report (GEM-SR); Bryant's Index of Empathy for Children and Adolescents (1982), which is a traditional child self-report measure; and a newly designed parent report of child affective empathy (Griffith Empathy Measure - Parent Report; GEM-PR). Using a normative community sample of 211 children from grades 1, 3, 5, and 7 (aged 5-6, 7-8, 9-10, & 11-12, respectively), the GEM-PR and the Bryant were found to have moderate to strong internal consistency. As a measure of concurrent validity, strong positive correlations were found between the mother and father reports (GEM-PR) of their child's affective empathy for grades 5 and 7, and for girls of all age groups. Using a convenience sample of 31 parents and children aged 5 to 12, the GEM-PR and the Bryant demonstrated strong test-retest reliability. The reliability of the GEM-VO and the GEM-SR was assessed using a convenience sample of 20 children aged 5 to 12. 
These measures involve the assessment of children's facial and verbal responses to emotionally evocative videotape vignettes. Children were unobtrusively videotaped while they watched the vignettes and their facial expressions were coded. Children were then interviewed to determine the emotions they attributed to stimulus persons and to themselves whilst viewing the material. Adequate to strong test-retest reliability was found for both measures. Using 30% from the larger sample of 211 participants (N=60), the GEM-VO also demonstrated robust inter-rater reliability. This multimethod approach to assessing child affective empathy produced differing age and gender trends. Facial affect as reported by the GEM-VO decreased with age. Similarly, the matching of child facial emotion to the vignette protagonist's facial emotion was higher in the younger grades. These findings suggest that measures that assess the matching of facial affect (i.e., GEM-VO) may be more appropriate for younger age groups who have not yet learnt to conceal their facial expression of emotion. Data from the GEM-SR suggest that older children are more verbally expressive of negative emotions than younger children, with older girls found to be the most verbally expressive of feeling the same emotion as the vignette character, a role more complementary to female gender socialization pressures. These findings are also indicative of the increase in emotional vocabulary and self-awareness in older children, supporting the validity of child self-report measures (based on observational stimuli) with older children. In comparing data from the GEM-VO and GEM-SR, this study found that for negative emotions the consistency between facial emotions coded and emotions verbally reported increased with age. 
This consistency across gender and amongst the older age groups provides encouraging concurrent validity, suggesting the results of one measure could be inferred through the exclusive use of the alternate measurement approach. In contrast, the two indices of affective empathy, the accurate matching of the participant's and vignette character's facial expression (GEM-VO) and the accurate matching of the self-reported and vignette character's emotion (GEM-SR), were not found to converge. This finding is consistent with prior research and questions the assumption that facially expressed and self-appraised indexes of affective empathy are different aspects of a complex unified process. When evaluating the convergence of all four measures of affective empathy, negative correlations were found between the Bryant and the GEM-PR; these two measures were also found not to converge with the GEM-VO and GEM-SR in a consistent and predictable way. These findings pose the question of whether different aspects of the complex phenomenon of affective empathy are being assessed. Furthermore, the validity of the exclusive use of a child self-report measure such as the Bryant, which is the standard assessment in the literature, is questioned. The possibility that callous-unemotional traits (CU; a unique subgroup identified in the child psychopathy literature) may account for the mixed findings throughout research regarding the assumption that deficiencies in empathy underlie conduct problems in children was examined using regression analysis. Using the previous sample of 211 children aged 5-12, conduct problems (CP) were measured using the Strengths and Difficulties Questionnaire (SDQ; Goodman, 1999), and the CU subscale was used from the Antisocial Process Screening Device (APSD; Caputo, Frick, & Brodsky, 1999). Affective empathy when measured by the GEM-PR and the Bryant showed differing patterns in the relationship between affective empathy, CU traits and CP. 
While the GEM-Father report indicated that neither age, CU traits nor CP accounted for affective empathy variance, the GEM-Mother report supported that affective empathy was no longer associated with CP once CU traits had been partialled out. In contrast, the Bryant data for girls showed no underlying correlational relationship with CU traits. It can be argued from the GEM-Mother data only that it was the unmeasured variance of CU traits that was accounting for the relationship between CP and affective empathy found in the literature. Furthermore, the comparison of an altered CU subscale with all possible empathy items removed suggests that the constructs of CU traits and affective empathy are not synonymous or overlapping in nature, but rather are two independent constructs. This multimethod approach highlights the complexity of this research area, exemplifying the significant influence of the source of the reports, and suggesting that affective empathy consists of multiple components that are assessed to differing degrees by the different measurement approaches.
APA, Harvard, Vancouver, ISO, and other styles
23

Ruivo, João Pedro Prospero. "Um modelo para inferência do estado emocional baseado em superfícies emocionais dinâmicas planares." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/3/3152/tde-28022018-110833/.

Full text
Abstract:
Emotions exert a direct influence on human life, mediating how individuals interact and relate to one another, whether in a personal or a social context. For these reasons, developing human-machine interfaces capable of sustaining more natural and friendly interactions with humans becomes important. In the development of social robots, the subject of this work, correctly interpreting the emotional state of the individuals interacting with the robot is indispensable. This work therefore concerns the development of a mathematical model for recognizing the human emotional state from facial expressions. First, the human face is detected and tracked by an algorithm; descriptive features are then extracted from it and fed into the developed emotional-state recognition model, which consists of an instantaneous emotion classifier, a Kalman filter, and a dynamic emotion classifier responsible for producing the model's final output. The model is optimized with a simulated annealing algorithm and tested on several relevant databases, with its performance measured for each emotional state considered.
Emotions have direct influence on the human life and are of great importance in relationships and in the way interactions between individuals develop. Because of this, they are also important for the development of human-machine interfaces that aim to maintain natural and friendly interactions with its users. In the development of social robots, which this work aims for, a suitable interpretation of the emotional state of the person interacting with the social robot is indispensable. The focus of this work is the development of a mathematical model for recognizing emotional facial expressions in a sequence of frames. Firstly, a face tracker algorithm is used to find and keep track of a human face in images; then relevant information is extracted from this face and fed into the emotional state recognition model developed in this work, which consists of an instantaneous emotional expression classifier, a Kalman filter and a dynamic classifier, which gives the final output of the model. The model is optimized via a simulated annealing algorithm and is experimented on relevant datasets, having its performance measured for each of the considered emotional states.
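The pipeline described in this abstract (an instantaneous per-frame classifier, a Kalman filter, and a dynamic classifier producing the final output) can be sketched in a few lines. The sketch below is illustrative only, not the thesis's implementation: the emotion labels, the random-walk state model, the noise parameters `q` and `r`, and the mean-of-smoothed-scores decision rule are all assumptions introduced here.

```python
# Illustrative sketch: per-frame emotion scores -> per-class Kalman smoothing
# -> sequence-level decision. All names and parameters are hypothetical.

EMOTIONS = ["happiness", "sadness", "anger", "neutral"]

def kalman_smooth(series, q=0.01, r=0.5):
    """1-D Kalman filter with a random-walk state model over a score series.
    q: process noise, r: measurement noise (both assumed values)."""
    x, p = series[0], 1.0          # initial state estimate and variance
    out = []
    for z in series:
        p += q                     # predict: variance grows by process noise
        k = p / (p + r)            # Kalman gain
        x += k * (z - x)           # update toward the measurement z
        p *= (1.0 - k)             # shrink posterior variance
        out.append(x)
    return out

def classify_sequence(frame_scores):
    """frame_scores: list of dicts mapping each emotion to its instantaneous
    score for one frame. Smooths each emotion's score trajectory, then picks
    the emotion with the highest mean smoothed score (a simple stand-in for
    the dynamic classifier described in the abstract)."""
    smoothed = {e: kalman_smooth([f[e] for f in frame_scores])
                for e in EMOTIONS}
    means = {e: sum(s) / len(s) for e, s in smoothed.items()}
    return max(means, key=means.get)
```

The design intuition matches the abstract: smoothing each class's score over time suppresses frame-level classifier noise, so a transient misclassification on a single frame does not flip the sequence-level decision.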
APA, Harvard, Vancouver, ISO, and other styles
24

Hunter, Kirsten. "Affective Empathy in Children: Measurement and Correlates." Thesis, Griffith University, 2004. http://hdl.handle.net/10072/366612.

Full text
Abstract:
Empathy is a construct that plays a pivotal role in the development of interpersonal relationships, and thus one's ability to function socially and often professionally. The development of empathy in children is therefore of particular interest to allow for further understanding of normative and atypical developmental trajectories. This thesis investigated the assessment of affective empathy in children aged 5-12, through the development and comparison of a multimethod assessment approach. Furthermore, this thesis evaluated the differential relationships between affective empathy and global behavioural problems in children versus the presence of early psychopathic traits, such as callous-unemotional traits. The first component of this study incorporated: a measure of facial expression of affective empathy and self-reported experience of affective empathy, as measured by the newly designed Griffith Empathy Measure - Video Observation (GEM-VO) and the Griffith Empathy Measure - Self Report (GEM-SR); Bryant's Index of Empathy for Children and Adolescents (1982), which is a traditional child self-report measure; and a newly designed parent report of child affective empathy (Griffith Empathy Measure - Parent Report; GEM-PR). Using a normative community sample of 211 children from grades 1, 3, 5, and 7 (aged 5-6, 7-8, 9-10, & 11-12, respectively), the GEM-PR and the Bryant were found to have moderate to strong internal consistency. As a measure of concurrent validity, strong positive correlations were found between the mother and father reports (GEM-PR) of their child's affective empathy for grades 5 and 7, and for girls of all age groups. Using a convenience sample of 31 parents and children aged 5 to 12, the GEM-PR and the Bryant demonstrated strong test-retest reliability. The reliability of the GEM-VO and the GEM-SR was assessed using a convenience sample of 20 children aged 5 to 12. 
These measures involve the assessment of children's facial and verbal responses to emotionally evocative videotape vignettes. Children were unobtrusively videotaped while they watched the vignettes and their facial expressions were coded. Children were then interviewed to determine the emotions they attributed to stimulus persons and to themselves whilst viewing the material. Adequate to strong test-retest reliability was found for both measures. Using 30% from the larger sample of 211 participants (N=60), the GEM-VO also demonstrated robust inter-rater reliability. This multimethod approach to assessing child affective empathy produced differing age and gender trends. Facial affect as reported by the GEM-VO decreased with age. Similarly, the matching of child facial emotion to the vignette protagonist's facial emotion was higher in the younger grades. These findings suggest that measures that assess the matching of facial affect (i.e., GEM-VO) may be more appropriate for younger age groups who have not yet learnt to conceal their facial expression of emotion. Data from the GEM-SR suggest that older children are more verbally expressive of negative emotions than younger children, with older girls found to be the most verbally expressive of feeling the same emotion as the vignette character, a role more complementary to female gender socialization pressures. These findings are also indicative of the increase in emotional vocabulary and self-awareness in older children, supporting the validity of child self-report measures (based on observational stimuli) with older children. In comparing data from the GEM-VO and GEM-SR, this study found that for negative emotions the consistency between facial emotions coded and emotions verbally reported increased with age. 
This consistency across gender and amongst the older age groups provides encouraging concurrent validity, suggesting the results of one measure could be inferred through the exclusive use of the alternate measurement approach. In contrast, the two indices of affective empathy, the accurate matching of the participant's and vignette character's facial expression (GEM-VO) and the accurate matching of the self-reported and vignette character's emotion (GEM-SR), were not found to converge. This finding is consistent with prior research and questions the assumption that facially expressed and self-appraised indexes of affective empathy are different aspects of a complex unified process. When evaluating the convergence of all four measures of affective empathy, negative correlations were found between the Bryant and the GEM-PR; these two measures were also found not to converge with the GEM-VO and GEM-SR in a consistent and predictable way. These findings pose the question of whether different aspects of the complex phenomenon of affective empathy are being assessed. Furthermore, the validity of the exclusive use of a child self-report measure such as the Bryant, which is the standard assessment in the literature, is questioned. The possibility that callous-unemotional traits (CU; a unique subgroup identified in the child psychopathy literature) may account for the mixed findings throughout research regarding the assumption that deficiencies in empathy underlie conduct problems in children was examined using regression analysis. Using the previous sample of 211 children aged 5-12, conduct problems (CP) were measured using the Strengths and Difficulties Questionnaire (SDQ; Goodman, 1999), and the CU subscale was used from the Antisocial Process Screening Device (APSD; Caputo, Frick, & Brodsky, 1999). Affective empathy when measured by the GEM-PR and the Bryant showed differing patterns in the relationship between affective empathy, CU traits and CP. 
While the GEM-Father report indicated that neither age, CU traits nor CP accounted for affective empathy variance, the GEM-Mother report supported that affective empathy was no longer associated with CP once CU traits had been partialled out. In contrast, the Bryant data for girls showed no underlying correlational relationship with CU traits. It can be argued from the GEM-Mother data only that it was the unmeasured variance of CU traits that was accounting for the relationship between CP and affective empathy found in the literature. Furthermore, the comparison of an altered CU subscale with all possible empathy items removed suggests that the constructs of CU traits and affective empathy are not synonymous or overlapping in nature, but rather are two independent constructs. This multimethod approach highlights the complexity of this research area, exemplifying the significant influence of the source of the reports, and suggesting that affective empathy consists of multiple components that are assessed to differing degrees by the different measurement approaches.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Applied Psychology (Health)
Full Text
APA, Harvard, Vancouver, ISO, and other styles
25

Küster, Dennis [Verfasser]. "The relationship between emotional experience, social context, and the face : an investigation of process underlying facial activity / Dennis Küster." Bremen : IRC-Library, Information Resource Center der Jacobs University Bremen, 2008. http://d-nb.info/103472200X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Hammerschmidt, Wiebke [Verfasser], Annekathrin [Akademischer Betreuer] Schacht, Annekathrin [Gutachter] Schacht, and Igor [Gutachter] Kagan. "Dissociating Inherent Emotional and Associated Motivational Salience in Human Face Processing / Wiebke Hammerschmidt ; Gutachter: Annekathrin Schacht, Igor Kagan ; Betreuer: Annekathrin Schacht." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2018. http://d-nb.info/1160442355/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Stein, Jan-Philipp, and Peter Ohler. "Saving Face in Front of the Computer? Culture and Attributions of Human Likeness Influence Users' Experience of Automatic Facial Emotion Recognition." Frontiers Media S.A, 2018. https://monarch.qucosa.de/id/qucosa%3A31524.

Full text
Abstract:
In human-to-human contexts, display rules provide an empirically sound construct to explain intercultural differences in emotional expressivity. A very prominent finding in this regard is that cultures rooted in collectivism—such as China, South Korea, or Japan—uphold norms of emotional suppression, contrasting with ideals of unfiltered self-expression found in several Western societies. However, other studies have shown that collectivistic cultures do not actually disregard the whole spectrum of emotional expression, but simply prefer displays of socially engaging emotions (e.g., trust, shame) over the more disengaging expressions favored by the West (e.g., pride, anger). Inspired by the constant advancement of affective technology, this study investigates if such cultural factors also influence how people experience being read by emotion-sensitive computers. In a laboratory experiment, we introduce 47 Chinese and 42 German participants to emotion recognition software, claiming that it would analyze their facial micro-expressions during a brief cognitive task. As we actually present standardized results (reporting either socially engaging or disengaging emotions), we manipulate participants' impression of having matched or violated culturally established display rules in a between-subject design. First, we observe a main effect of culture on the cardiovascular response to the digital recognition procedure: Whereas Chinese participants quickly return to their initial heart rate, German participants remain longer in an agitated state. A potential explanation for this—East Asians might be less stressed by sophisticated technology than people with a Western socialization—concurs with recent literature, highlighting different human uniqueness concepts across cultural borders. 
Indeed, while we find no cultural difference in subjective evaluations of the emotion-sensitive computer, a mediation analysis reveals a significant indirect effect from culture over perceived human likeness of the technology to its attractiveness. At the same time, violations of cultural display rules remain mostly irrelevant for participants' reaction; thus, we argue that inter-human norms for appropriate facial expressions might be loosened if faces are read by computers, at least in settings that are not associated with any social consequence.
APA, Harvard, Vancouver, ISO, and other styles
28

Ilich, Andrey, and Gabrielle Voilley. "Staying committed in the face of clientelism : A case study on the Serbian educational sector." Thesis, Uppsala universitet, Företagsekonomiska institutionen, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-414986.

Full text
Abstract:
Clientelism is a form of informal relations between political parties and other agents, based on the exchange of benefits and favors in return for political support and loyalty. It is a legacy from the socialist era that has been shaping Serbian economic and political life for decades. These clientelistic practices could have a harmful impact on organizations if they affect employees’ organizational commitment negatively. Teachers are especially important for the economy of a country, since the quality of teaching has an impact on future generations of individuals who will work, live, and contribute to local communities and society. The aim of this study is therefore to find out if and how teachers in the public sector perceive clientelism and how this affects their organizational commitment. A case study was made of one prominent school in Serbia, where interviews were held with eight teachers. The findings suggest that clientelism affects teachers’ organizational commitment negatively in some cases, indicating that the impact of clientelism on teachers’ organizational commitment is person specific. Clientelism does not have a direct impact on teachers’ organizational commitment but impacts commitment through proxies, i.e. antecedents to organizational commitment.
APA, Harvard, Vancouver, ISO, and other styles
29

Tanganho, Carla Sofia Moleirinho. "Inteligência emocional, atitudes face à escola e sucesso escolar: estudo exploratório em alunos do 8º e 9º anos com diferentes percursos formativos." Master's thesis, Universidade de Évora, 2015. http://hdl.handle.net/10174/15907.

Full text
Abstract:
Emotional Intelligence, Attitudes Toward School and Academic Success: An exploratory study involving 8th and 9th grade students with different formative paths. ABSTRACT: The purpose of this study was to contribute to the understanding of the relationship between emotional intelligence (assessed by the EQ-i:YV), attitudes toward school (assessed by QAFE) and academic success, using a sample of 289 adolescent students with different formative paths (regular education and alternative education programs). The results show a positive and statistically significant relationship between the three variables. 
Statistically significant differences in emotional intelligence (EI) and attitudes toward school (ATS) in relation to sociodemographic and academic variables were also demonstrated, emphasizing the role of age and students’ formative path (both younger students and students attending a regular education program presented higher levels of perceived EI and more positive ATS). Hence the importance of EI in educational contexts as a key ingredient to achieve success, good interpersonal relationships, healthiness and well-being, as well as ATS as a determining factor in students’ academic achievement, school adaptation and the motivation to pursue further education.
APA, Harvard, Vancouver, ISO, and other styles
30

Gray, Katie L. H. "Unconscious processing of emotional faces." Thesis, University of Southampton, 2011. https://eprints.soton.ac.uk/341583/.

Full text
Abstract:
Due to capacity limits, the brain must select important information for further processing. Evolutionary-based theories suggest that emotional (and specifically threat-relevant) information is prioritised in the competition for attention and awareness (e.g. Ohman & Mineka, 2001). A range of experimental paradigms have been used to investigate whether emotional visual stimuli (relative to neutral stimuli) are selectively processed without awareness, and attract visual attention (e.g. Yang et al., 2007). However, very few studies have used appropriate control conditions that help clarify the extent to which observed effects are driven by the extraction of emotional meaning from these stimuli, or their low-level visual characteristics (such as contrast, or luminance). The experiments in this thesis investigated whether emotional faces are granted preferential access to awareness and which properties of face stimuli drive these effects. A control stimulus was developed to help dissociate between the extraction of emotional information and low-level accounts of the data. It was shown that preferential processing of emotional information is better accounted for by low-level characteristics of the stimuli, rather than the extraction of emotional meaning per se. Additionally, a robust ‘face’ effect was found across several experiments. Investigation of this effect suggested that it may not be driven by the meaningfulness of the stimuli, as it was also apparent in an individual who finds it difficult to extract information from faces. Together these findings suggest that high-level information can be extracted from visual stimuli outside of awareness, but the prioritisation afforded to emotional faces is driven by low-level characteristics. These results are particularly timely given continued high-profile debate surrounding the origins of emotion prioritisation (e.g. Tamietto & de Gelder, 2010; Pessoa & Adolphs, 2010).
APA, Harvard, Vancouver, ISO, and other styles
31

Al-Dahoud, Ahmad. "The computational face for facial emotion analysis: Computer based emotion analysis from the face." Thesis, University of Bradford, 2018. http://hdl.handle.net/10454/17384.

Full text
Abstract:
Facial expressions are considered to be the most revealing way of understanding the human psychological state during face-to-face communication. It is believed that a more natural interaction between humans and machines can be achieved through a detailed understanding of the different facial expressions, imitating the manner by which humans communicate with each other. In this research, we study different aspects of facial emotion detection and analysis, and investigate possible hidden identity clues within facial expressions. We study a deeper aspect of facial expressions whereby we try to identify gender and human identity - which can be considered a form of emotional biometric - using only the dynamic characteristics of smile expressions. Further, we present a statistical model for analysing the relationship between facial features and Duchenne (real) and non-Duchenne (posed) smiles, identifying that the expressions in the eyes contain features discriminating between Duchenne and non-Duchenne smiles. Our results indicate that facial expressions can be identified through facial movement analysis models, with an accuracy rate of 86% for classifying the six universal facial expressions and 94% for classifying the common 18 facial action units. Further, we successfully identify gender using only the dynamic characteristics of the smile expression, obtaining an 86% classification rate. Likewise, we present a framework to study the possibility of using the smile as a biometric, showing that the human smile is unique and stable.
Al-Zaytoonah University
APA, Harvard, Vancouver, ISO, and other styles
32

Přinosil, Jiří. "Analýza emocionálních stavů na základě obrazových předloh." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-233488.

Full text
Abstract:
This dissertation thesis deals with an automatic system for recognizing basic emotional facial expressions from static images. The system is divided into three independent parts, which are linked together. The first part deals with automatic face detection in color images: a skin-color-based face detector is proposed, together with methods for localizing eye and lip positions in detected faces using color maps. A modified Viola-Jones face detector is also part of this stage and was experimentally used for eye detection as well. Both face detectors were tested on the Georgia Tech Face Database. The second part of the automatic system is the feature extraction process, which consists of two statistical methods and one method based on image filtering with a set of Gabor filters; several combinations of features extracted by these methods were used experimentally for the purposes of this thesis. The last part of the automatic system is a mathematical classifier, represented by a feed-forward neural network. The automatic system is further improved by adding accurate localization of particular facial features using an active shape model. The whole automatic system was benchmarked on recognition of basic emotional facial expressions using the Japanese Female Facial Expression database.
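The Gabor-filtering stage of the pipeline described in this abstract (detection, feature extraction, feed-forward classification) can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the author's code; the kernel size, orientations and wavelengths below are arbitrary choices:

```python
import numpy as np

def gabor_kernel(size, theta, lam, sigma, gamma=0.5):
    """Real part of a Gabor filter: a Gaussian envelope times an oriented cosine."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / lam)

def correlate2d_valid(image, kern):
    """'Valid' 2-D cross-correlation via a sliding-window view (no SciPy needed)."""
    kh, kw = kern.shape
    windows = np.lib.stride_tricks.sliding_window_view(image, (kh, kw))
    return np.einsum('ijkl,kl->ij', windows, kern)

def gabor_features(image, size=9, n_thetas=4, lams=(4.0, 8.0)):
    """Apply a small Gabor bank; summarise each response by its mean and std."""
    feats = []
    for k in range(n_thetas):
        for lam in lams:
            kern = gabor_kernel(size, theta=k * np.pi / n_thetas,
                                lam=lam, sigma=lam / 2)
            resp = correlate2d_valid(image, kern)
            feats.extend([resp.mean(), resp.std()])
    return np.array(feats)
```

Each kernel responds to edges at one orientation and spatial frequency; the summary statistics of the eight responses form a compact feature vector that a feed-forward network could then classify.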
APA, Harvard, Vancouver, ISO, and other styles
33

Hardee, Jillian E. "The fearful face and beyond fMRI studies of the human amygdala /." Morgantown, W. Va. : [West Virginia University Libraries], 2009. http://hdl.handle.net/10450/10653.

Full text
Abstract:
Thesis (Ph. D.)--West Virginia University, 2009.
Title from document title page. Document formatted into pages; contains ix, 192 p. : ill. (some col.). Vita. Includes abstract. Includes bibliographical references (p. 171-190).
APA, Harvard, Vancouver, ISO, and other styles
34

Blagrove, Elisabeth. "Time-based visual selection with emotional faces." Thesis, University of Warwick, 2009. http://wrap.warwick.ac.uk/3623/.

Full text
Abstract:
The biological and behavioural importance of the face has led to the proposition of several mechanisms dedicated to highly efficient specialized processing (e.g., M.H. Johnson, 2005). This is reflected in the attentional properties attributed to facial stimuli, especially when they contain affective information (e.g., R. Palermo & G. Rhodes, 2007). This thesis examines those attentional properties via a modified version of the visual search paradigm (i.e. the preview search task; D.G. Watson & G.W. Humphreys, 1997), which proposes that observers can intentionally suppress items seen prior to a full search array, for effective search performance (i.e. the preview benefit; D.G. Watson & G.W. Humphreys, 1997, 1998). The findings from this thesis show that it is possible to deprioritize previewed facial stimuli from search, although only a partial preview benefit was shown. Emotional valence of previewed faces had little impact on this effect, even when preview duration was extended from 1000 to 3000 ms. However, when duration was reduced to 250-750 ms, negatively valenced faces were more difficult to suppress than positively valenced faces. In addition, when previewed faces changed expression concurrently with the onset of the full search array, the preview benefit was abolished, irrespective of the direction of the expression change (i.e. neutral to positive, or neutral to negative). A search advantage for negative face targets was demonstrated throughout all of the investigations in this thesis. These findings are consistent with previous work establishing preferential detection of, and selectively impaired disengagement from, negative faces (e.g., J.D. Eastwood, D. Smilek, & P.M. Merikle, 2001; E. Fox, R. Russo, R.J. Bowles, & K. Dutton, 2001). However, they also suggest the sensitivity of the visual marking mechanism to ecological considerations (such as the nature of the stimulus), and the overall relevance of emotional face stimuli to the visual system.
APA, Harvard, Vancouver, ISO, and other styles
35

Bate, Sarah. "The role of emotion in face recognition." Thesis, University of Exeter, 2008. http://hdl.handle.net/10036/51993.

Full text
Abstract:
This thesis examines the role of emotion in face recognition, using measures of the visual scanpath as indicators of recognition. There are two key influences of emotion in face recognition: the emotional expression displayed upon a face, and the emotional feelings evoked within a perceiver in response to a familiar person. An initial set of studies examined these processes in healthy participants. First, positive emotional expressions were found to facilitate the processing of famous faces, and negative expressions facilitated the processing of novel faces. A second set of studies examined the role of emotional feelings in recognition. Positive feelings towards a face were also found to facilitate processing, in both an experimental study using newly learned faces and in the recognition of famous faces. A third set of studies using healthy participants examined the relative influences of emotional expression and emotional feelings in face recognition. For newly learned faces, positive expressions and positive feelings had a similar influence in recognition, with no presiding role of either dimension. However, emotional feelings had an influence over and above that of expression in the recognition of famous faces. A final study examined whether emotional valence could influence covert recognition in developmental prosopagnosia, and results suggested the patients process faces according to emotional valence rather than familiarity per se. Specifically, processing was facilitated for studied-positive faces compared to studied-neutral and novel faces, but impeded for studied-negative faces. This pattern of findings extends existing reports of a positive-facilitation effect in face recognition, and suggests there may be a closer relationship between facial familiarity and emotional valence than previously envisaged. The implications of these findings are discussed in relation to models of normal face recognition and theories of covert recognition in prosopagnosia.
APA, Harvard, Vancouver, ISO, and other styles
36

Wolf, Claudia. "Genetic influences on emotion/cognition interactions-from synaptic regulation to individual differences in working memory for emotional faces." Thesis, Bangor University, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.516118.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Tomlinson, Eleanor Katharine. "Face-processing and emotion recognition in schizophrenia." Thesis, University of Birmingham, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.433700.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Zlatar, Katherine, and Oleksandra Lysak. "Fake it till you make it: The emotional labour of project managers." Thesis, Umeå universitet, Företagsekonomi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-86937.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

BOSSI, FRANCESCO. "Investigating face and body perception." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2018. http://hdl.handle.net/10281/199061.

Full text
Abstract:
Human face and body convey the most important non-verbal cues for social interactions. Face and body provide numerous cues essential for recognition of other people’s identity, gender, age, intentions and emotional state. All faces and bodies are symmetrical and share a common 3D structure, but humans are able to easily identify hundreds of different people, just relying on facial and bodily information. Face and body processing have been widely studied and several cognitive and neuroanatomical models of these processes were hypothesized. Despite many critical differences, all these models recognized different stages of processing from early coarse stimulus encoding (occipital visual cortices) to higher-level processes aimed to identify invariant (e.g., identity) and changeable features (e.g., gaze, emotional expressions) (broad fronto-temporo-parietal network). It was demonstrated that these processes involve configural processing. Moreover, emotional expressions seem to influence the encoding of these stimuli. Processing of emotional expressions occurs at very early latencies and seems to involve the activation of a subcortical pathway. The studies presented in this thesis are aimed to investigate the visual perception of faces and bodies, and how it can be modulated or manipulated. EEG was used in some of the studies presented in this thesis to investigate the psychophysiological processes involved in face and body perception. While the first Chapter is aimed to present the theoretical background of the studies reported in the thesis, the second Chapter presents the first study (composed of two experiments), aimed to investigate how the perception of social cues can be modulated by social exclusion. The process investigated is the perception of two different, but interacting, facial cues: emotional expression and gaze direction. 
In this study, we found that the identification of gaze direction was specifically impaired by social exclusion, while no impairment was found for emotional expression recognition. The results of this study brought important insights concerning the relevance of gaze as a signal of potential re-inclusion, and how the impaired processing of gaze direction may reiterate social exclusion. The third Chapter presents a meta-analytic review on the body inversion effect, a manipulation aimed to demonstrate configural processing of bodies. This meta-analysis was aimed to investigate consistency and size of this effect, fundamental in studying structural encoding of body shapes. In the fourth Chapter, a study on the neural oscillations involved in face and body inversion effects is presented. Neural oscillations in theta and gamma bands were measured by means of the EEG since they are a very influential measure to investigate the psychophysiological activity involved in different processes. The results of this study showed that configural processing of faces and bodies involve different perceptual mechanisms. In the fifth Chapter, a study investigating the influence of inversion and emotional expression on the visual encoding of faces and bodies is presented. The neural correlates of these processes were investigated by means of event-related potentials (ERPs). Both inversion and emotional expressions were shown to influence the processing of these stimuli, during different stages and through different perceptual mechanisms, but results revealed that these two manipulations were not interacting. Therefore, configural information and emotional expressions seem to be processed through independent and non-interacting perceptual processes.
APA, Harvard, Vancouver, ISO, and other styles
40

Meinel, Nicole A. "Recognising emotions : is it all in the face?" Thesis, University of Nottingham, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.437021.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Bellegarde, Lucille Gabrielle Anna. "Perception of emotions in small ruminants." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/25915.

Full text
Abstract:
Animals are sentient beings, capable of experiencing emotions. Being able to assess emotional states in farm animals is crucial to improving their welfare. Although the function of emotion is not primarily for communication, the outward expression of an emotional state involves changes in posture, vocalisations, odours and facial expressions. These changes can be perceived and used as indicators of emotional state by other animals. Since emotions can be perceived between conspecifics, understanding how emotions are identified and how they can spread within a social group could have a major impact on improving the welfare of farmed species, which are mostly reared in groups. A recently developed method for the evaluation of emotions in animals is based on cognitive biases such as judgment biases, i.e. an individual in a negative emotional state will show pessimistic judgments while an individual in a positive emotional state will show optimistic judgments. The aims of this project were to (A) establish whether sheep and goats can discriminate between images of faces of familiar conspecifics taken in different positive and negative situations, (B) establish whether sheep and goats perceive the valence (positive or negative) of the emotion expressed by the animal on the image, (C) validate the use of images of faces in cognitive bias studies. The use of images of faces of conspecifics as emotional stimuli was first validated, using a discrimination task in a two-armed maze. A new methodology was then developed across a series of experiments to assess spontaneous reactions of animals exposed to video clips or to images of faces of familiar conspecifics. Detailed observations of ear postures were used as the main behavioural indicator. Individual characteristics (dominance status within the herd, dominance pairwise relationships and human-animal relationship) were also recorded during preliminary tests and included in the analyses. 
The impact of a low-mood state on the perception of emotions was assessed in sheep after subjecting half of the animals to unpredictable negative housing conditions and keeping the other half in good standard housing conditions. Sheep were then presented with videos of conspecifics filmed in situations of varying valence. Reactions to ambiguous stimuli were evaluated by presenting goats with images of morphed faces. Goats were also presented with images of faces of familiar conspecifics taken in situations of varying emotional intensity. Sheep could discriminate images of faces of conspecifics taken either in a negative or in a neutral situation, and their learning of the discrimination task was affected by the type of emotion displayed. Sheep reacted differently depending on the valence of the video clips (P < 0.05); however, there was no difference between the control and the low-mood groups (P > 0.05). Goats also showed different behavioural reactions to images of faces photographed in different situations (P < 0.05), indicating that they perceived the images as different. Responses to morphed images were not necessarily intermediate to responses to negative and positive images, and not gradual either, which poses a major problem for the potential use of facial images in cognitive bias experiments. Overall, animals were more attentive towards images or videos of conspecifics in negative situations, i.e., presumably, in a negative emotional state. This suggests that sheep and goats are able to perceive the valence of the emotional state. The identity of the individual on the photo also affected the animals’ spontaneous reaction to the images. Social relationships such as dominance, but also affinity between the tested and the photographed individual, seem to influence emotion perception.
APA, Harvard, Vancouver, ISO, and other styles
42

Bui, Kim-Kim. "Face Processing in Schizophrenia : Deficit in Face Perception or in Recognition of Facial Emotions?" Thesis, University of Skövde, School of Humanities and Informatics, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-3349.

Full text
Abstract:

Schizophrenia is a psychiatric disorder characterized by social dysfunction. People with schizophrenia misinterpret social information, and it is suggested that this difficulty may result from visual processing deficits. As faces are one of the most important sources of social information, it is hypothesized that people suffering from the disorder have impairments in the visual face processing system. It is unclear which mechanism of the face processing system is impaired, but two types of deficits are most often proposed: a deficit in face perception in general (i.e., processing of facial features as such) and a deficit in facial emotion processing (i.e., recognition of emotional facial expressions). Given the contradictory evidence from behavioural, electrophysiological and neuroimaging studies offering support for one or the other deficit in schizophrenia, it is too early to make any conclusive statements as to the nature and level of impairment. Further studies are needed for a better understanding of the key mechanisms and abnormalities underlying social dysfunction in schizophrenia.

APA, Harvard, Vancouver, ISO, and other styles
43

Duret, Marie-laetitia. "Les composantes socio perceptives et socio cognitives de la cognition sociale chez les enfants sourds." Thesis, Aix-Marseille, 2012. http://www.theses.fr/2012AIXM4078.

Full text
Abstract:
In this thesis, we aimed to study the socio-perceptive and socio-cognitive components of social cognition (Tager-Flusberg & Sullivan, 2000) in deaf children. Deafness gives us the possibility to assess the influence of environmental factors on the development of these components. To do so, we focus our studies on deaf children born to hearing parents, equipped with hearing aids, and educated in mainstream schools. The first main issue of the current studies is to assess whether the lack of communication with the family during the first months of life, in the particular context of deafness in a hearing environment, has a significant impact on the socio-perceptive component. Experiments 1 and 2 were designed to assess this issue through two tasks involving face and emotion perception; these tests allow us to examine both performance and the processing strategies used. On the one hand, we look for the inversion effect and the effect of attentional focus on the eye region during a face similarity judgment task; on the other hand, we look for the automatic processing of anger in a visual search task. The second question studied relates to the development of the socio-cognitive component, and especially to theory-of-mind abilities. Could the growing possibilities of internalizing speech, thanks to hearing aids, permit the development of the capacities needed to understand another person's mental states? Or, on the contrary, are early exchanges about one's own mental states needed to develop the socio-cognitive component?
APA, Harvard, Vancouver, ISO, and other styles
44

Kuhn, Lisa Katharina. "Emotion recognition in the human face and voice." Thesis, Brunel University, 2015. http://bura.brunel.ac.uk/handle/2438/11216.

Full text
Abstract:
At a perceptual level, faces and voices consist of very different sensory inputs, and therefore information processing in one modality can be independent of information processing in another (Adolphs & Tranel, 1999). However, there may also be a shared neural emotion network that processes stimuli independently of modality (Peelen, Atkinson, & Vuilleumier, 2010), or emotions may be processed at a more abstract cognitive level, based on meaning rather than on perceptual signals. This thesis therefore examined emotion recognition across two separate modalities in a within-subject design, comprising a cognitive Chapter 1 with 45 British adults, a developmental Chapter 2 with 54 British children, and a cross-cultural Chapter 3 with 98 German and British children and 78 German and British adults. Intensity ratings, choice reaction times, and correlations of confusion analyses of emotions across modalities were analysed throughout. Further, an ERP chapter investigated the time course of emotion recognition across the two modalities. Highly correlated rating profiles of emotions in faces and voices were found, suggesting a similarity in emotion recognition across modalities. Emotion recognition in primary-school children improved with age for both modalities, although young children relied mainly on faces. British and German participants showed comparable patterns when rating basic emotions, but subtle differences were also noted, and German participants perceived emotions as less intense than British participants did. Overall, the behavioural results reported in the present thesis are consistent with the idea of a general, more abstract level of emotion processing which may act independently of modality. This could be based, for example, on a shared emotion brain network or on more general, higher-level cognitive processes that are activated across a range of modalities.
Although emotion recognition abilities are already evident during childhood, this thesis argued for a contribution of 'nurture' to emotion mechanisms, as recognition was influenced by external factors such as development and culture.
APA, Harvard, Vancouver, ISO, and other styles
45

Rellecke, Julian. "Automaticity in affective face processing." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2012. http://dx.doi.org/10.18452/16626.

Full text
Abstract:
Emotional facial expressions are highly relevant stimuli for humans. It has thus been suggested that they are processed automatically by evolutionarily in-built mechanisms. However, to what extent such processing in fact arises automatically is still controversial. The current work feeds into this debate by showing a tendency to spontaneously allocate increased processing capacity to emotional, especially threat-related, expressions, even when faces are processed merely superficially and emotionality is irrelevant to the task at hand (Study 1 and 2). This bias was further tested against two key criteria of automaticity, namely the intentionality criterion (Study 3) and the load-insensitivity criterion (Study 4 and 5), which assume that automatic processing arises irrespective of the individual's intention and of concurrent task demands, respectively. Event-related brain potentials (ERPs) revealed that enhanced perceptual encoding of threat-related expressions remained largely unaffected by intention, whereas at the higher cognitive level enhanced encoding depended on whether stimuli were voluntarily processed more deeply (Study 3). When control over face processing was impaired by a concurrent task while emotionality was relevant, emotion effects were enhanced at both the perceptual and the early higher cognitive level (Study 4). A similar pattern was observed for the perceptual encoding of attractive faces (Study 5). In contrast, during late higher cognitive stages of in-depth face processing, the enhanced encoding of threat was eliminated when control was reduced (Study 4). The present results speak against full automaticity in affective face processing and instead suggest that biologically prepared processing biases are modulated by task-oriented control mechanisms and their interplay with intention.
APA, Harvard, Vancouver, ISO, and other styles
46

Joshua, Nicole R. "Face processing in schizophrenia : an investigation of configural processing and the relationship with facial emotion processing and neurocognition /." Connect to thesis, 2010. http://repository.unimelb.edu.au/10187/7040.

Full text
Abstract:
Cognitive impairment is a key characteristic of schizophrenia and a clear predictor of functional outcome. This thesis explores the relationship between social and non-social cognitive abilities. Schizophrenia patients demonstrate an impaired ability to recognise, label, and discriminate emotional expressions in the face. The mechanisms underlying this social cognitive impairment are not yet fully understood. This thesis explores the notion that a basic perceptual impairment in processing facial information adversely affects the perception of more complex information derived from faces, such as emotional expression. Face perception relies on processing the featural characteristics of a face as well as the relationships between these features. Information pertaining to the spatial distances between features is referred to as configural information.
A group of schizophrenia patients and healthy control participants completed a battery of tasks assessing basic neurocognition, facial emotion processing, and configural face processing. A model of face processing was proposed and used to systematically pinpoint specific deficits that may contribute to impaired face processing in schizophrenia. The results indicated that schizophrenia patients show impairments on three broad constructs: basic neurocognition, facial emotion processing, and, most pertinently, configural processing. Although neurocognition and face processing both explained a significant proportion of the variance in facial emotion processing, the effect of neurocognition was indirect and mediated by face processing.
To investigate the diagnostic specificity of these findings, a group of bipolar disorder patients was also tested on the task battery. The results indicated that bipolar disorder patients also show social and non-social cognitive impairments, although not as severe as those demonstrated by the schizophrenia patients. Furthermore, the effect of neurocognitive performance on facial emotion processing appeared more direct for bipolar disorder patients than for schizophrenia patients. Although deficits in face processing were observable in bipolar disorder, they were not specific to configural processing. Thus, deficits in emotion processing were more associated with neurocognitive ability in bipolar disorder patients, and more associated with configural face processing in schizophrenia patients. The configural processing deficits in schizophrenia are discussed as a lower-order perceptual problem. In conclusion, the results of this thesis are discussed in terms of their implications for treatment.
APA, Harvard, Vancouver, ISO, and other styles
47

Beer, Jenay Michelle. "Recognizing facial expression of virtual agents, synthetic faces, and human faces: the effects of age and character type on emotion recognition." Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33984.

Full text
Abstract:
An agent's facial expression may communicate emotive state to users both young and old. The ability to recognize emotions has been shown to differ with age, with older adults more commonly misidentifying the facial emotions of anger, fear, and sadness. This study examined whether emotion recognition of facial expressions differed between types of on-screen agents and between age groups. Three on-screen characters were compared: a human, a synthetic human, and a virtual agent. In this study, 42 younger (age 18-28) and 42 older (age 65-85) adults completed an emotion recognition task with static pictures of the characters demonstrating four basic emotions (anger, fear, happiness, and sadness) and neutral. The human face resulted in the highest proportion match, followed by the synthetic human, then the virtual agent with the lowest proportion match. Both the human and synthetic human faces showed age-related differences for the emotions anger, fear, sadness, and neutral, with younger adults showing a higher proportion match. The virtual agent showed age-related differences for the emotions anger, fear, happiness, and neutral, again with younger adults showing a higher proportion match. The data analysis and interpretation of the present study differed from previous work in two ways. First, the misattributions participants made when identifying emotions were investigated. Second, a similarity index of the feature placement between any two virtual agent emotions was calculated, suggesting that emotions were commonly misattributed as other emotions similar in appearance. Overall, these results suggest that age-related differences extend beyond human faces to other types of on-screen characters, and that differences between older and younger adults in emotion recognition may be further explained by perceptual discrimination between two emotions of similar feature appearance.
APA, Harvard, Vancouver, ISO, and other styles
48

Bel-Bahar, Tarik Stanislaw. "Cortico-limbic mechanisms of meaning making : judgments of personality and emotion from faces /." Connect to title online (ProQuest), 2008. http://proquest.umi.com/pqdweb?did=1678701141&sid=3&Fmt=2&clientId=11238&RQT=309&VName=PQD.

Full text
Abstract:
Thesis (Ph. D.)--University of Oregon, 2008.
Typescript. Includes vita and abstract. Includes bibliographical references (leaves 188-219). Also available online in ProQuest, free to University of Oregon users.
APA, Harvard, Vancouver, ISO, and other styles
49

Modig, Pia. "Upplevelser av emotioner : En kvalitativ studie om upplevelsen av emotioner ser annorlunda ut på sociala media i jämförelse med face to face i verkligheten." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-69815.

Full text
Abstract:
This essay investigates whether the same experience can be had on social media as in reality, using Randall Collins's theory of interaction rituals. An interaction ritual must satisfy four conditions to count as a successful one. Social media has brought about a revolution in its social and cultural effects on users; technically, it has been an evolution. The essay was conducted with a qualitative research method: a focus-group interview that opened with a discussion of the mobile phone as technology, focusing on areas of use and time spent, as well as the informants' general views of social media. Previous research reveals a lack of sociological and social-psychological work in this area; the research drawn on here shows that overuse leads to consequences for the user which society should be able to meet, which is the problem at hand. G. H. Mead's interaction theories are linked to interaction on social media: people interact with one another on social media and connect to it automatically, without thinking beforehand; it has become a natural act, the 'I' coming before the 'Me'. The results show that the informants use the mobile phone extensively to connect to social media. They consider what is presented on social media to be impression-managed. The informants have experienced interaction rituals in real life that gave them emotional energy, and they indicate that successful interaction rituals can be experienced to some extent via social media, but that this presupposes earlier experiences of events in real life.
APA, Harvard, Vancouver, ISO, and other styles
50

Mignault, Alain 1962. "Connectionist models of the perception of facial expressions of emotion." Thesis, McGill University, 1999. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=36039.

Full text
Abstract:
Two connectionist models are developed that predict humans' categorization of facial expressions of emotion and their judgements of similarity between two facial expressions. For each stimulus, the models predict the subjects' judgement, the entropy of the response, and the mean response time (RT). Both models involve a connectionist component, which predicts the response probabilities, and a response generator, which predicts the mean RT. The input to the categorization model is a preprocessed picture of a facial expression, while the hidden-unit representations generated by the first model for two facial expressions constitute the input to the similarity model. The data collected from 45 subjects in a single-session experiment involving a categorization and a similarity task provided the target outputs to train both models. Two response generators are tested. The first, called the threshold model, is a linear integrator with threshold inspired by Lacouture and Marley's (1991) model. The second, called the channel model, constitutes a new approach which assumes a linear relationship between response entropy and mean RT; it is inspired by Lachman's (1973) interpretation of Shannon's (1948) entropy equation. The categorization model explains 50% of the variance in mean RT for the training set. It yields an almost perfect categorization of the pure emotional stimuli of the training set and is about 70% correct on the generalization set. A two-dimensional representation of emotions in the hidden-unit space reproduces most of the properties of emotional spaces found by multidimensional scaling in this study as well as in others (e.g., Alvarado, 1996). The similarity model explains 53% of the variance in mean similarity judgements; it provides a good account of subjects' mean RT; and it even predicts an interesting bow effect that was found in the subjects' data.
APA, Harvard, Vancouver, ISO, and other styles
