Academic literature on the topic 'Facial cue'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Facial cue.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Facial cue"
Coetzee, Vinet, David I. Perrett, and Ian D. Stephen. "Facial Adiposity: A Cue to Health?" Perception 38, no. 11 (January 2009): 1700–1711. http://dx.doi.org/10.1068/p6423.
Russell, Richard, Aurélie Porcheron, Jennifer Sweda, Emmanuelle Mauger, and Frederique Morizot. "Facial contrast is a cue for health perception." Journal of Vision 15, no. 12 (September 1, 2015): 1213. http://dx.doi.org/10.1167/15.12.1213.
Watanabe, Noriya, Masahiko Haruno, and Masamichi Sakagami. "Emotional facial expression accelerates cue-reward association learning." Neuroscience Research 68 (January 2010): e291. http://dx.doi.org/10.1016/j.neures.2010.07.1292.
Quist, Michelle C., Christopher D. Watkins, Finlay G. Smith, Lisa M. DeBruine, and Benedict C. Jones. "Facial masculinity is a cue to women’s dominance." Personality and Individual Differences 50, no. 7 (May 2011): 1089–93. http://dx.doi.org/10.1016/j.paid.2011.01.032.
Law Smith, M. J., D. I. Perrett, B. C. Jones, R. E. Cornwell, F. R. Moore, D. R. Feinberg, L. G. Boothroyd, et al. "Facial appearance is a cue to oestrogen levels in women." Proceedings of the Royal Society B: Biological Sciences 273, no. 1583 (November 2005): 135–40. http://dx.doi.org/10.1098/rspb.2005.3296.
Boothroyd, Lynda G., Isabel Scott, Alan W. Gray, Claire I. Coombes, and Nicholas Pound. "Male Facial Masculinity as a Cue to Health Outcomes." Evolutionary Psychology 11, no. 5 (December 2013): 147470491301100. http://dx.doi.org/10.1177/147470491301100508.
Carré, Justin M., Cheryl M. McCormick, and Catherine J. Mondloch. "Facial Structure Is a Reliable Cue of Aggressive Behavior." Psychological Science 20, no. 10 (October 2009): 1194–98. http://dx.doi.org/10.1111/j.1467-9280.2009.02423.x.
Roberts, S. Craig, Tamsin K. Saxton, Alice K. Murray, Robert P. Burriss, Hannah M. Rowland, and Anthony C. Little. "Static and Dynamic Facial Images Cue Similar Attractiveness Judgements." Ethology 115, no. 6 (June 2009): 588–95. http://dx.doi.org/10.1111/j.1439-0310.2009.01640.x.
Porcheron, Aurélie, Emmanuelle Mauger, Frédérique Soppelsa, Richard Russell, and Frédérique Morizot. "Facial contrast is a universal cue for perceiving age." Journal of Vision 15, no. 12 (September 1, 2015): 1222. http://dx.doi.org/10.1167/15.12.1222.
DeBruine, Lisa M. "Trustworthy but not lust-worthy: context-specific effects of facial resemblance." Proceedings of the Royal Society B: Biological Sciences 272, no. 1566 (May 7, 2005): 919–22. http://dx.doi.org/10.1098/rspb.2004.3003.
Dissertations / Theses on the topic "Facial cue"
Horowitz, Erin J. "Facial Information as a Minimal Cue of Animacy." Thesis, University of California, Santa Barbara, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10784374.
The tendency for humans to give preferential attention to animate agents in their immediate surroundings has been well documented and likely reflects an evolved specialization for a persistent adaptive problem. In uncertain or ambiguous cases, this tendency can result in an over-detection of animacy, as the potential costs of failing to detect an animate agent far outweigh those of mistaken identification. In line with this, it seems likely that humans have evolved a sensitivity to specific cues indicative of animacy, such that the mere presence of these cues will lead to detection, regardless of the objective category membership of the entity in question. A wealth of research speaks to this effect with regard to motion cues, specifically the capacity for self-propulsion and goal-directed action. Morphological cues have also been implicated, most especially the presence of facial features, as they specify a capacity for perceptual feedback from the environment, which is essential for goal-directed motion. However, it remains an open question whether the capacity for animacy detection is similarly sensitive to facial information in the absence of motion cues.
The experiments reported here attempted to address this question by implementing a novel task in which participants were asked to judge the animacy or inanimacy (or membership in animal or object categories) of different images: animals with and without visible facial features, and objects with and without visible facial features. Beyond replicating a general advantage for detecting animate agents over inanimate objects, the primary predictions for these experiments were that facial features would have a differential effect on performance, such that they would improve performance when visible in animals, and would hinder performance when visible in objects. Experiments 1a and 1b provided a preliminary confirmation of this pattern of responses using images of familiar and unfamiliar animals (e.g., dogs versus jellyfish), and unaltered images of objects with and without faces. Experiment 2 improved on the design of this task by more closely matching the sets of images (the same animals facing toward or away from the camera, and objects with faces which had been digitally altered to disrupt the facial features), and by changing the prompt of the task from yes/no judgments of animacy to categorization into animal or object groups. Experiment 3 examined the face inversion effect, or the failure to recognize familiar faces when their orientation is inverted, on animal-object categorization. Lastly, experiments 4 and 5 attempted to extend the findings from experiment 2 to preschool-aged children, by implementing a card sorting task (experiment 4) and a computerized animal detection task (experiment 5). The results of this series of experiments highlight the prominent role of facial features in detecting animate agents in one’s surroundings.
Han, Chengyang. "Facial appearance as a cue of physical condition." Thesis, University of Glasgow, 2018. http://theses.gla.ac.uk/8788/.
Eldblom, Hans. "Facial width-to-height ratio as a cue of threat: An ERP study." Thesis, Högskolan i Skövde, Institutionen för biovetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15570.
Baccolo, Elisa. "It’s written all over your face. The ontogeny of sensitivity to facial cues to trustworthiness." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2020. http://hdl.handle.net/10281/277385.
Human beings are hypersensitive to facial properties that convey social signals. The ability to form trustworthiness judgements from facial cues, i.e. the cues we use to decide whether a person can be safely approached or is better avoided, is known to be fast, automatic, and based on very little information. This doctoral dissertation investigates: (1) whether sensitivity to facial cues to trustworthiness is modulated by individual variation in social personality characteristics; (2) the developmental trajectory of this sensitivity; and (3) whether sensitivity to subtle variations in facial cues to trustworthiness is a universal phenomenon or is modulated by culture and/or face ethnicity. Chapter 1 investigated whether individual differences in fine-grained perceptual sensitivity and in the mental representation of facial features related to trustworthiness judgements are associated with individual differences in social motivation. Results showed that individual differences in social motivation can affect the amount of social experience and thus the level of sensitivity to facial cues to trustworthiness that develops. Chapter 2 focused on the developmental trajectory of this sensitivity. Study 2 investigated how perceptual sensitivity to, and mental representation of, fine-grained differences in the facial information underlying social perception of trustworthiness develop over time, taking into account individual differences in emotional development. Results showed that sensitivity to facial cues to trustworthiness, and the ability to use these cues to generate trustworthiness judgements, is present in the preschool years but matures to adult-like levels around the age of 7, developing together with emotion-understanding abilities. Studies 3 and 4 used two different EEG paradigms with 6-month-old infants to ask whether this sensitivity is already present in the first year of life. The combined data from Studies 3 and 4 show that 6-month-old infants are sensitive to the facial cues that are later used to generate trustworthiness judgements. Finally, Chapter 3 presents a validation of stimuli that will be used to explore developmental cross-cultural differences in the perception of face trustworthiness. Overall, the presented studies suggest that sensitivity to facial cues to trustworthiness emerges in the very first years of life and is then refined by experience over the course of development. Moreover, they suggest that trustworthiness perception may be cross-cultural, as it is not influenced by the experience an individual gains with a particular face category.
Mazefsky, Carla Ann. "Emotion Perception in Asperger's Syndrome and High-functioning Autism: The Importance of Diagnostic Criteria and Cue Intensity." VCU Scholars Compass, 2004. http://scholarscompass.vcu.edu/etd/1449.
Boraston, Zillah Louise. "Emotion recognition from facial and non-facial cues." Thesis, University College London (University of London), 2008. http://discovery.ucl.ac.uk/1445207/.
Full textVIEIRA, Tiago Figueiredo. "Identifying Kinship Cues from Facial Images." Universidade Federal de Pernambuco, 2013. https://repositorio.ufpe.br/handle/123456789/13315.
The investigation of human face images is ubiquitous in pattern analysis and image processing research. Traditional approaches concern face identification and verification, but several other areas are emerging, such as age and expression estimation, analysis of facial similarity and attractiveness, and automatic kinship recognition. Although the latter could have applications in fields such as image retrieval and annotation, little work in this area has been presented so far. This thesis presents an algorithm able to discriminate between siblings and unrelated individuals based on their face images. A great challenge in this context was the lack of a benchmark for kinship analysis, and for this reason a high-quality dataset of images of sibling pairs was collected. This is a relevant contribution to the research community and is particularly useful for avoiding the potential problems caused by low-quality pictures and uncontrolled imaging conditions in the heterogeneous datasets used in previous studies. The database includes frontal, profile, expressionless, and smiling faces of sibling pairs. Based on these images, various classifiers were constructed using feature-based and holistic techniques to investigate which data are most effective for discriminating siblings from non-siblings. The features were first tested individually, and then the most significant face data were supplied to a single combined classifier. The sibling classifier was found to outperform human raters on all datasets.
The discrimination capability of the algorithm was also tested by applying the classifiers to a low-quality database of images collected from the Internet in a cross-database experiment. The knowledge acquired from the analysis of siblings led to a similar algorithm able to discriminate parent-child pairs from unrelated individuals. The results obtained in this thesis have implications for image retrieval and annotation, forensics, genealogical research, and the search for missing family members.
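As an illustration of the kind of pair-classification pipeline this abstract describes, the following minimal sketch trains a classifier to separate sibling pairs from unrelated pairs. It is not the thesis's actual implementation: the feature vectors, the pair_representation helper, and the random placeholder data are all hypothetical, and an SVM is used only as a generic stand-in for whatever feature-based or holistic classifier is chosen.

```python
# Minimal sketch (assumed setup, not the thesis's pipeline): classify face-image
# pairs as "siblings" vs "unrelated" from per-face feature vectors using an SVM.
# Feature extraction is abstracted away; in practice it could be geometric
# landmark distances (feature-based) or whole-face descriptors (holistic).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def pair_representation(feat_a: np.ndarray, feat_b: np.ndarray) -> np.ndarray:
    """Combine two face feature vectors into a symmetric pair descriptor."""
    return np.concatenate([np.abs(feat_a - feat_b), (feat_a + feat_b) / 2.0])

# Hypothetical data: 200 face pairs, each face described by a 128-d feature vector.
rng = np.random.default_rng(0)
faces_a = rng.normal(size=(200, 128))
faces_b = rng.normal(size=(200, 128))
labels = rng.integers(0, 2, size=200)  # 1 = siblings, 0 = unrelated

X = np.array([pair_representation(a, b) for a, b in zip(faces_a, faces_b)])
clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, labels, cv=5)
print(scores.mean())  # ~chance level here, since the placeholder data is random
```

The symmetric pair descriptor is used so that swapping the two faces in a pair does not change the classifier's input, which matches the fact that siblinghood is an unordered relation.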
Scott, Naomi. "Facial cues to mental health symptoms." Thesis, Bangor University, 2015. https://research.bangor.ac.uk/portal/en/theses/facial-cues-to-mental-health-symptoms(1f1fa702-18f7-435c-ad59-05c59dccaec2).html.
Stoyanova, Raliza. "Contextual influences on perception of facial cues." Thesis, University of Cambridge, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.608041.
Fisher, Claire. "Social perception of facial cues of adiposity." Thesis, University of Glasgow, 2017. http://theses.gla.ac.uk/8334/.
Full textBooks on the topic "Facial cue"
Che faccia fare. Milano: Feltrinelli, 1998.
Prima che faccia buio. Venezia: Marsilio, 2005.
Find full textCapitani, Paola, ed. Il controllo terminologico delle risorse elettroniche in rete. Florence: Firenze University Press, 2001. http://dx.doi.org/10.36253/88-8453-008-3.
Vígh, Eva. "Il costume che appare nella faccia": Fisiognomica e letteratura italiana. Roma: Aracne, 2014.
Birattari, Massimo. È più facile scrivere bene che scrivere male: Corso di sopravvivenza. Milano: Ponte alle Grazie, 2011.
Manzini, Gianna. Lettere a Giuseppe Dessí e a Luisa. Edited by Alberto Baldi. Florence: Firenze University Press, 2020. http://dx.doi.org/10.36253/978-88-6453-923-2.
"Ne la faccia che a Cristo/più si somiglia": La poesia mariana di Dante. Cosenza - Italy: Luigi Pellegrini editore, 2017.
Tabacco, Giovanni. La relazione fra i concetti di potere temporale e di potere spirituale nella tradizione cristiana fino al secolo XIV. Edited by Laura Gaffuri. Florence: Firenze University Press, 2011. http://dx.doi.org/10.36253/978-88-8453-995-3.
Gensini, Gian Franco, and Augusto Zaninelli, eds. Progetto RIARTE. Florence: Firenze University Press, 2015. http://dx.doi.org/10.36253/978-88-6655-906-1.
Caproni, Giorgio. Il mondo ha bisogno dei poeti. Edited by Melissa Rota. Florence: Firenze University Press, 2014. http://dx.doi.org/10.36253/978-88-6655-677-0.
Book chapters on the topic "Facial cue"
Chen, Jingying, and Bernard Tiddeman. "Multi-cue Facial Feature Detection and Tracking." In Lecture Notes in Computer Science, 356–67. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-69905-7_41.
Su, Congyong, Hong Zhou, and Li Huang. "Multiple Facial Feature Tracking Using Multi-cue Based Prediction Model." In Articulated Motion and Deformable Objects, 214–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-30074-8_21.
Klien, Michael, and Maria Salvetti. "CEE." In Facing the Challenges of Water Governance, 259–89. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-98515-2_10.
Anchlia, Sonal. "Temporomandibular Joint Ankylosis." In Oral and Maxillofacial Surgery for the Clinician, 1401–34. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-1346-6_65.
Mani, Varghese. "Orthognathic Surgery for Mandible." In Oral and Maxillofacial Surgery for the Clinician, 1477–512. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-1346-6_68.
Fang, Xianming. "Chapter 4. Multimodality in refusals in English as a lingua franca." In Multimodal Im/politeness, 101–29. Amsterdam: John Benjamins Publishing Company, 2023. http://dx.doi.org/10.1075/pbns.333.04fan.
Ulz, Thomas, Jakob Ludwiger, and Gerald Steinbauer. "A Robust and Flexible System Architecture for Facing the RoboCup Logistics League Challenge." In RoboCup 2018: Robot World Cup XXII, 488–99. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-27544-0_40.
Andrée, Alexander. "Caue ne facias uim in tempore! Peter Comestor and the Truth of History." In Instrumenta Patristica et Mediaevalia, 515–50. Turnhout: Brepols Publishers, 2017. http://dx.doi.org/10.1484/m.ipm-eb.5.112016.
Gualco, Carlo, Marco Grattarola, Alberto Federici, Francesco Mataloni, Karol Iždinský, F. Simančik, Bernhard Schwarz, C. García-Rosales, and I. López-Galilea. "Brazing Technology for Plasma Facing Components in Nuclear Fusion Applications Using Low and Graded CTE Interlayers." In Advanced Materials Research, 192–97. Stafa: Trans Tech Publications Ltd., 2008. http://dx.doi.org/10.4028/3-908454-01-8.192.
Tóth, Judit, Éva Szirmai, Norbert Merkovity, and Tamás Pongó. "Promising or Compelling Future in Hungary?" In Young Adults and Active Citizenship, 121–38. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-65002-5_7.
Conference papers on the topic "Facial cue"
Guan, Yepeng. "Robust Eye Detection from Facial Image based on Multi-cue Facial Information." In 2007 IEEE International Conference on Control and Automation. IEEE, 2007. http://dx.doi.org/10.1109/icca.2007.4376666.
Ali, Tauseef, and Intaek Kim. "An improved eye localization algorithm using multi-cue facial information." In 2009 2nd International Conference on Computer, Control and Communication (IC4). IEEE, 2009. http://dx.doi.org/10.1109/ic4.2009.4909266.
Li, Songjiang, Wen Cui, Jinshi Cui, Li Wang, Ming Li, and Hongbin Zha. "Improving Children's Gaze Prediction via Separate Facial Areas and Attention Shift Cue." In 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017). IEEE, 2017. http://dx.doi.org/10.1109/fg.2017.92.
Rupasinghe, A. T., N. L. Gunawardena, S. Shujan, and D. A. S. Atukorale. "Scaling personality traits of interviewees in an online job interview by vocal spectrum and facial cue analysis." In 2016 Sixteenth International Conference on Advances in ICT for Emerging Regions (ICTer). IEEE, 2016. http://dx.doi.org/10.1109/icter.2016.7829933.
Lang, Christian, Sven Wachsmuth, Heiko Wersing, and Marc Hanheide. "Facial expressions as feedback cue in human-robot interaction—a comparison between human and automatic recognition performances." In 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR Workshops). IEEE, 2010. http://dx.doi.org/10.1109/cvprw.2010.5543264.
Yap, Moi Hoon, Bashar Rajoub, Hassan Ugail, and Reyer Zwiggelaar. "Visual cues of facial behaviour in deception detection." In 2011 IEEE International Conference on Computer Applications and Industrial Electronics (ICCAIE). IEEE, 2011. http://dx.doi.org/10.1109/iccaie.2011.6162148.
Suzuki, K., Y. Takeuchi, and J. Heo. "THE EFFECT OF LIGHTING ENVIRONMENT ON FACIAL EXPRESSION PERCEPTION IN VIDEO TELECONFERENCING." In CIE 2021 Conference. International Commission on Illumination, CIE, 2021. http://dx.doi.org/10.25039/x48.2021.op51.
Li, Tianyu, and Biao Yang. "NEW EMPIRICAL DATA FOR PEDESTRIAN LIGHTING EFFECT ON RECOGNITION ABILITY ON REAL 3D FACIAL EXPRESSION." In CIE 2018. International Commission on Illumination, CIE, 2018. http://dx.doi.org/10.25039/x45.2018.op18.
Thomas, Chinchu, and Dinesh Babu Jayagopi. "Predicting student engagement in classrooms using facial behavioral cues." In ICMI '17: INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3139513.3139514.
Williamson, James R., Elizabeth Godoy, Miriam Cha, Adrianne Schwarzentruber, Pooya Khorrami, Youngjune Gwon, Hsiang-Tsung Kung, Charlie Dagli, and Thomas F. Quatieri. "Detecting Depression using Vocal, Facial and Semantic Communication Cues." In MM '16: ACM Multimedia Conference. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2988257.2988263.
Reports on the topic "Facial cue"
Qiao, Baoyun, Xiaoqin Fan, Hanif Rahemtulla, Hans van Rijn, and Lina Li. Critical Issues for Fiscal Reform in the People’s Republic of China Part 1: Revenue and Expenditure Management. Asian Development Bank, December 2022. http://dx.doi.org/10.22617/wps220575-2.
Henkin, Samuel. Dynamic Dimensions of Radicalization and Violent Extremism in Sabah, Malaysia. RESOLVE Network, December 2021. http://dx.doi.org/10.37805/pn2021.25.sea.
Zuccarelli, N., C. M. Lesher, M. G. Houlé, and S. J. Barnes. Variations in the textural facies of sulphide minerals in the Eagle's Nest Ni-Cu-(PGE) deposit, McFaulds Lake greenstone belt, Superior Province, Ontario: insights from microbeam scanning energy-dispersive X-ray fluorescence spectrometry. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 2020. http://dx.doi.org/10.4095/326895.
Kuster, K., C. M. Lesher, and M. G. Houlé. Geology and geochemistry of mafic and ultramafic bodies in the Shebandowan mine area, Wawa-Abitibi terrane: implications for Ni-Cu-(PGE) and Cr-(PGE) mineralization, Ontario and Quebec. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/329394.
Savosko, V., I. Komarova, Yu Lykholat, E. Yevtushenko, and T. Lykholat. Predictive model of heavy metals inputs to soil at Kryvyi Rih District and its use in the training for specialists in the field of Biology. IOP Publishing, 2021. http://dx.doi.org/10.31812/123456789/4511.
Савосько, Василь Миколайович, Ірина Олександрівна Комарова, Юрій Васильович Лихолат, Едуард Олексійович Євтушенко, and Тетяна Юріївна Лихолат. Predictive Model of Heavy Metals Inputs to Soil at Kryvyi Rih District and its Use in the Training for Specialists in the Field of Biology. IOP Publishing, 2021. http://dx.doi.org/10.31812/123456789/4266.
Corriveau, L., J. F. Montreuil, O. Blein, E. Potter, M. Ansari, J. Craven, R. Enkin, et al. Metasomatic iron and alkali calcic (MIAC) system frameworks: a TGI-6 task force to help de-risk exploration for IOCG, IOA and affiliated primary critical metal deposits. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329093.
Rousseau, Henri-Paul. Gutenberg, L’université et le défi numérique. CIRANO, December 2022. http://dx.doi.org/10.54932/wodt6646.
Norelli, John L., Moshe Flaishman, Herb Aldwinckle, and David Gidoni. Regulated expression of site-specific DNA recombination for precision genetic engineering of apple. United States Department of Agriculture, March 2005. http://dx.doi.org/10.32747/2005.7587214.bard.
Full textFacial Cues: Can We Judge Who Looks Like a Leader? IEDP Ideas for Leaders, January 2015. http://dx.doi.org/10.13007/479.