Academic literature on the topic 'Facial expression – Photographic measurements'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Facial expression – Photographic measurements.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Facial expression – Photographic measurements"

1

Rhodes, Gillian. "Looking at Faces: First-Order and Second-Order Features as Determinants of Facial Appearance." Perception 17, no. 1 (February 1988): 43–63. http://dx.doi.org/10.1068/p170043.

Full text
Abstract:
The encoding and relative importance of first-order (discrete) and second-order (configural) features in mental representations of unfamiliar faces have been investigated. Nonmetric multidimensional scaling (KYST) was carried out on similarity judgments of forty-one photographs of faces (homogeneous with respect to sex, race, facial expression, and, to a lesser extent, age). A large set of ratings, measurements, and ratios of measurements of the faces was regressed against the three-dimensional KYST solution in order to determine the first-order and second-order features used to judge similarity. Parameters characterizing both first-order and second-order features emerged as important determinants of facial similarity. First-order feature parameters characterizing the appearance of the eyes, eyebrows, and mouth, and second-order feature parameters characterizing the position of the eyes, spatial relations between the internal features, and chin shape correlated with the dimensions of the KYST solution. There was little difference in the extent to which first-order and second-order features were encoded. Two higher-level parameters, age and weight, were also used to judge similarity. The implications of these results for mental representations of faces are discussed.
APA, Harvard, Vancouver, ISO, and other styles
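As an illustration of the analysis pipeline this abstract describes (nonmetric MDS on similarity judgments, then regression of facial measurements onto the recovered dimensions), the sketch below substitutes scikit-learn's SMACOF-based nonmetric MDS for the original KYST program; the dissimilarity matrix and the measurement vector are random placeholders, not data from the study.

```python
# Illustrative sketch (not the original KYST pipeline): nonmetric MDS on a
# face dissimilarity matrix, then regression of one facial measurement onto
# the recovered three-dimensional configuration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_faces = 41  # number of photographs in the study

# Placeholder symmetric dissimilarity matrix (real input: averaged similarity judgments).
d = rng.random((n_faces, n_faces))
dissim = (d + d.T) / 2
np.fill_diagonal(dissim, 0.0)

# Nonmetric (ordinal) MDS into three dimensions, analogous to the KYST solution.
mds = MDS(n_components=3, metric=False, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)  # shape (41, 3)

# Placeholder facial measurement (e.g. an eye-position ratio) regressed
# against the three dimensions to gauge how well it accounts for them.
measurement = rng.random(n_faces)
reg = LinearRegression().fit(coords, measurement)
print("R^2 of measurement against MDS dimensions:", reg.score(coords, measurement))
```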
2

Lewczuk, Joanna. "Change of social value orientation affected by the observed mimical expression of the interaction partner." Studia z Teorii Wychowania X, no. 4 (29) (December 25, 2019): 85–106. http://dx.doi.org/10.5604/01.3001.0014.1075.

Full text
Abstract:
The issues addressed in this paper concern a possible change in the observer's social value orientation under the influence of a specific emotional expression perceived on another individual's face. The paper fits into the line of research on the link between social value orientations and the perception of facial emotional expressions. An „omnibus"-type representative survey was carried out according to an experimental scheme, entirely via the Internet (N = 972). The following tools were used: for the measurement of social value orientations, a modified version of the Ring Measure of Social Values; for the experimental manipulation, photographs of facial expressions (happiness, anger, neutrality). In the light of the data obtained, one may, for the first time, speak of social value orientations as a dimension susceptible to change under the influence of a facial expression. The indicators of orientation on others, and the distribution of the dominant social value orientation groups before and after the experimental manipulation, varied depending on the type of basic facial emotional expression presented (happiness vs anger). Directional predictions were confirmed for the negative manipulation (expression of anger), which was followed by a reduction in orientation on others and in the total number of altruists, while the positive manipulation (expression of happiness) resulted in a general increase in the number of altruists. This is in line with the prediction that observing a positive facial expression triggers prosocial tendencies, while observing a negative facial expression undermines them. Keywords: social value orientations; prosociality; orientation on the self/orientation on the others; variability of social value orientations; Ring Measure of Social Values; facial emotional expressions
APA, Harvard, Vancouver, ISO, and other styles
3

Benton, Christopher P. "Effect of Photographic Negation on Face Expression Aftereffects." Perception 38, no. 9 (January 1, 2009): 1267–74. http://dx.doi.org/10.1068/p6468.

Full text
Abstract:
Our visual representation of facial expression is examined in this study: is this representation built from edge information, or does it incorporate surface-based information? To answer this question, photographic negation of grey-scale images is used. Negation preserves edge information whilst disrupting the surface-based information. In two experiments visual aftereffects produced by prolonged viewing of images of facial expressions were measured. This adaptation-based technique allows a behavioural assessment of the characteristics encoded by the neural systems underlying our representation of facial expression. The experiments show that photographic negation of the adapting images results in a profound decrease of expression aftereffect. Our visual representation of facial expression therefore appears to not just be built from edge information, but to also incorporate surface information. The latter allows an appreciation of the 3-D structure of the expressing face that, it is argued, may underpin the subtlety and range of our non-verbal facial communication.
APA, Harvard, Vancouver, ISO, and other styles
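Photographic negation, the manipulation used here to disrupt surface-based information while preserving edges, is simply an inversion of grey-scale pixel intensities. A minimal sketch, assuming an 8-bit grey-scale image loaded with Pillow (file names are hypothetical):

```python
# Minimal sketch: photographic negation of an 8-bit grey-scale image.
# Intensities (surface/shading cues) are inverted while the locations of
# luminance edges remain unchanged.
import numpy as np
from PIL import Image

def negate(path_in: str, path_out: str) -> None:
    img = np.asarray(Image.open(path_in).convert("L"), dtype=np.uint8)
    Image.fromarray(255 - img).save(path_out)  # invert 8-bit intensities

# Example usage with hypothetical file names:
# negate("adapting_face.png", "adapting_face_negated.png")
```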
4

Kramer, Robin S. S. "Within-person variability in men’s facial width-to-height ratio." PeerJ 4 (March 10, 2016): e1801. http://dx.doi.org/10.7717/peerj.1801.

Full text
Abstract:
Background. In recent years, researchers have investigated the relationship between facial width-to-height ratio (FWHR) and a variety of threat and dominance behaviours. The majority of methods involved measuring FWHR from 2D photographs of faces. However, individuals can vary dramatically in their appearance across images, which poses an obvious problem for reliable FWHR measurement. Methods. I compared the effect sizes due to the differences between images taken with unconstrained camera parameters (Studies 1 and 2) or varied facial expressions (Study 3) to the effect size due to identity, i.e., the differences between people. In Study 1, images of Hollywood actors were collected from film screenshots, providing the least amount of experimental control. In Study 2, controlled photographs, which only varied in focal length and distance to camera, were analysed. In Study 3, images of different facial expressions, taken in controlled conditions, were measured. Results. Analyses revealed that simply varying the focal length and distance between the camera and face had a relatively small effect on FWHR, and therefore may prove less of a problem if uncontrolled in study designs. In contrast, when all camera parameters (including the camera itself) are allowed to vary, the effect size due to identity was greater than the effect of image selection, but the ranking of the identities was significantly altered by the particular image used. Finally, I found significant changes to FWHR when people posed with four of seven emotional expressions in comparison with neutral, and the effect size due to expression was larger than differences due to identity. Discussion. The results of these three studies demonstrate that even when head pose is limited to forward facing, changes to the camera parameters and a person's facial expression have sizable effects on FWHR measurement. Therefore, analysing images that fail to constrain some of these variables can lead to noisy and unreliable results, but also to relationships caused by previously unconsidered confounds.
APA, Harvard, Vancouver, ISO, and other styles
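Facial width-to-height ratio is conventionally computed as bizygomatic width divided by upper-face height (brow to upper lip), so the image-dependent variability reported here comes down to how those two landmark distances shift across photographs. A minimal sketch, assuming 2D landmark coordinates in pixels have already been obtained; the landmark names and values are hypothetical, not Kramer's measurement protocol:

```python
# Minimal sketch of an FWHR computation from 2D landmark coordinates (pixels).
# The landmark dictionary is a hypothetical example, not data from the study.
def fwhr(landmarks: dict[str, tuple[float, float]]) -> float:
    """Bizygomatic width divided by upper-face height (brow to upper lip)."""
    (lx, _), (rx, _) = landmarks["left_zygion"], landmarks["right_zygion"]
    (_, brow_y), (_, lip_y) = landmarks["mid_brow"], landmarks["upper_lip"]
    return abs(rx - lx) / abs(lip_y - brow_y)

example = {
    "left_zygion": (112.0, 240.0),
    "right_zygion": (388.0, 242.0),
    "mid_brow": (250.0, 180.0),
    "upper_lip": (250.0, 330.0),
}
print(f"FWHR = {fwhr(example):.2f}")  # 1.84 for these made-up coordinates
```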
5

de Carvalho Rosas Gomes, Liliane, Karla Orfelina Carpio Horta, Luiz Gonzaga Gandini, Marcelo Gonçalves, and João Roberto Gonçalves. "Photographic assessment of cephalometric measurements." Angle Orthodontist 83, no. 6 (April 18, 2013): 1049–58. http://dx.doi.org/10.2319/120712-925.1.

Full text
Abstract:
Objective: To investigate the relationship between craniofacial measurements obtained from cephalometric radiographs and analogous measurements from profile photographs. Materials and Methods: Lateral cephalograms and standardized facial profile photographs were obtained from a sample of 123 subjects (65 girls, 58 boys; age 7–12 years). Intraclass correlation coefficients (ICCs) were calculated from repeated photographic measurements to evaluate method reliability. Analogous cephalometric and photographic measurements were compared to assess Pearson correlation coefficients. Linear regression analyses were conducted between the measurements that achieved correlation coefficients greater than r = 0.7. Results: The reliability of the photographic technique was satisfactory. Most measurements showed ICCs above 0.80 and highly significant correlations (P ≤ .001) with cephalometric variables. Among all measurements used, the A'N'B' angle was the most effective in explaining the variability of its analogous cephalometric measurement, mainly for female subjects (r² = 0.80). The FMA' angle showed the best results for vertical assessment (r² = 0.65). Conclusions: The photographic method has proven to be a repeatable and reproducible tool provided that a standardized protocol is followed. Therefore, it may be considered a feasible and practical diagnostic alternative, particularly if there is a need for a low-cost and noninvasive method.
APA, Harvard, Vancouver, ISO, and other styles
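The statistical core of this study is simple to reproduce in outline: Pearson correlation between each photographic measurement and its cephalometric analogue, followed by linear regression for pairs with r > 0.7. The sketch below illustrates that comparison with SciPy on synthetic ANB / A'N'B' values; the numbers are placeholders, not the study's data.

```python
# Illustrative sketch: correlate an angle measured on a cephalometric
# radiograph with its analogue measured on a profile photograph, then fit a
# linear regression when the correlation is strong. Values are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
anb_ceph = rng.normal(4.0, 2.0, size=60)                    # cephalometric ANB (degrees)
anb_photo = 0.9 * anb_ceph + rng.normal(0.0, 0.8, size=60)  # photographic A'N'B' analogue

r, p = stats.pearsonr(anb_ceph, anb_photo)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")

if r > 0.7:  # regression only for strongly correlated analogues
    fit = stats.linregress(anb_photo, anb_ceph)
    print(f"ANB ~ {fit.slope:.2f} * A'N'B' + {fit.intercept:.2f}, r^2 = {fit.rvalue**2:.2f}")
```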
6

Hausken, Liv. "Photographic Passport Biometry." Public 30, no. 60 (March 1, 2020): 50–59. http://dx.doi.org/10.1386/public_00005_7.

Full text
Abstract:
When ICAO approved a new standard for international passports, they recommended including a high-resolution facial image on a chip in addition to the visual portrait on the identity page. Accordingly, there are, in a certain sense, two images in the current passport, one on the chip, the other visually displayed. In this article I relate the functional distribution between these two images to the nineteenth-century mugshot and argue that the photographically generated images in the current passport represent a subdued tension between two parallel paths in the history of photography: depiction and measurement. By looking at the arguments for facial recognition technologies in today's passport as specified by ICAO, I argue that the current regulation of international mobility downplays the importance of physical measurements and hides this behind photography's more familiar function, namely depiction. This contributes to concealing biometrics as a tool for power and control in today's society.
APA, Harvard, Vancouver, ISO, and other styles
7

Zhang, Xingzhong, Mark G. Hans, Greg Graham, H. Lester Kirchner, and Susan Redline. "Correlations between cephalometric and facial photographic measurements of craniofacial form." American Journal of Orthodontics and Dentofacial Orthopedics 131, no. 1 (January 2007): 67–71. http://dx.doi.org/10.1016/j.ajodo.2005.02.033.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Calder, A. J., A. W. Young, D. Rowland, D. R. Gibbenson, B. M. Hayes, and D. I. Perrett. "Perception of Photographic-Quality Caricatures of Emotional Facial Expressions." Perception 25, no. 1_suppl (August 1996): 28. http://dx.doi.org/10.1068/v96l1004.

Full text
Abstract:
G Rhodes, S E Brennan, S Carey (1987 Cognitive Psychology 19 473–497) and P J Benson and D I Perrett (1991 European Journal of Cognitive Psychology 3 105–135) have shown that computer-enhanced (caricatured) representations of familiar faces are named faster and rated as better likenesses than veridical (undistorted) representations. Here we have applied Benson and Perrett's graphic technique to examine subjects' perception of enhanced representations of photographic-quality facial expressions of basic emotions. To enhance a facial expression the target face is compared to a norm or prototype face, and, by exaggerating the differences between the two, a caricatured image is produced; reducing the differences results in an anticaricatured image. In experiment 1 we examined the effect of degree of caricature and types of norm on subjects' ratings for ‘intensity of expression’. Three facial expressions (fear, anger, and sadness) were caricatured at seven levels (−50%, −30%, −15%, 0%, +15%, +30%, and +50%) relative to three different norms: (1) an average norm prepared by blending pictures of six different emotional expressions; (2) a neutral expression norm; and (3) a different expression norm (eg anger caricatured relative to a happy expression). Irrespective of norm, the caricatured expressions were rated as significantly more intense than the veridical images. Furthermore, for the average and neutral norm sets, the anticaricatures were rated as significantly less intense. We also examined subjects' reaction times to recognise caricatured (−50%, 0%, and +50%) representations of six emotional facial expressions. The results showed that the caricatured images were identified fastest, followed by the veridical, and then anticaricatured images. Hence the perception of facial expression and identity is facilitated by caricaturing; this has important implications for the mental representation of facial expressions.
APA, Harvard, Vancouver, ISO, and other styles
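The caricaturing manipulation described here moves each facial landmark away from (or toward) the corresponding landmark of a norm face by a fixed percentage: +50% exaggerates the expression, −50% anticaricatures it. A minimal sketch of that geometric step, assuming corresponding landmark points for the target and norm faces; the full Benson and Perrett technique also re-renders the image texture, which is omitted here, and the coordinates below are invented.

```python
# Minimal sketch of expression caricaturing on landmark geometry: scale each
# landmark's offset from the norm face by (1 + level).
# level = +0.5 -> +50% caricature; level = -0.5 -> -50% anticaricature.
import numpy as np

def caricature(target: np.ndarray, norm: np.ndarray, level: float) -> np.ndarray:
    """target, norm: (n_landmarks, 2) arrays of corresponding (x, y) points."""
    return norm + (1.0 + level) * (target - norm)

# Hypothetical example with three landmarks (placeholder coordinates).
norm_face = np.array([[100.0, 120.0], [150.0, 118.0], [125.0, 200.0]])
fear_face = np.array([[ 98.0, 110.0], [152.0, 109.0], [125.0, 214.0]])

for level in (-0.50, -0.30, -0.15, 0.0, 0.15, 0.30, 0.50):
    print(f"{level:+.2f}: first landmark ->", caricature(fear_face, norm_face, level)[0])
```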
9

Guan, Haoming, Hongxu Wei, Richard J. Hauer, and Ping Liu. "Facial expressions of Asian people exposed to constructed urban forests: Accuracy validation and variation assessment." PLOS ONE 16, no. 6 (June 17, 2021): e0253141. http://dx.doi.org/10.1371/journal.pone.0253141.

Full text
Abstract:
An outcome of building sustainable urban forests is that people's well-being is improved when they are exposed to trees. Facial expressions directly represent one's inner emotions, and can be used to assess real-time perception. The emergence and change of facial expressions in forest visitors are an implicit process. As such, the reserved character of Asians requires an instrument rating to accurately recognize expressions. In this study, a dataset was established with 2,886 randomly photographed faces from visitors at a constructed urban forest park and at a promenade during summertime in Shenyang City, Northeast China. Six experts were invited to choose 160 photos in total, with 20 images representing each of eight typical expressions: angry, contempt, disgusted, happy, neutral, sad, scared, and surprised. The FireFACE ver. 3.0 software was used to test hit-ratio validation as an accuracy measurement (ac.) to match machine-recognized photos with those identified by experts. According to the Kruskal-Wallis test on the difference from averaged scores in 20 recently published papers, contempt (ac. = 0.40%, P = 0.0038) and scared (ac. = 25.23%, P = 0.0018) expressions did not pass the validation test. Both happy and sad expression scores were higher in forests than in promenades, but there was no difference in net positive response (happy minus sad) between locations. Men had a higher happy score but a lower disgusted score in forests than in promenades. Men also had a higher angry score in forests. We conclude that FireFACE can be used for analyzing facial expressions in Asian people within urban forests. Women are encouraged to visit urban forests rather than promenades to elicit more positive emotions.
APA, Harvard, Vancouver, ISO, and other styles
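The hit-ratio validation described here is essentially per-category accuracy: for each expression, the proportion of expert-labelled photos whose machine-assigned label matches the expert label. A small sketch of that computation; the labels below are invented examples, not the study's data.

```python
# Illustrative sketch of per-expression hit-ratio validation: the fraction of
# expert-labelled photos whose machine label agrees with the expert label.
from collections import defaultdict

EXPRESSIONS = ["angry", "contempt", "disgusted", "happy",
               "neutral", "sad", "scared", "surprised"]

def hit_ratios(expert: list[str], machine: list[str]) -> dict[str, float]:
    totals, hits = defaultdict(int), defaultdict(int)
    for e, m in zip(expert, machine):
        totals[e] += 1
        hits[e] += int(e == m)
    return {x: hits[x] / totals[x] for x in EXPRESSIONS if totals[x]}

# Hypothetical labels for six photos.
expert_labels  = ["happy", "happy", "sad", "scared", "scared", "neutral"]
machine_labels = ["happy", "neutral", "sad", "sad", "happy", "neutral"]
print(hit_ratios(expert_labels, machine_labels))
# -> {'happy': 0.5, 'neutral': 1.0, 'sad': 1.0, 'scared': 0.0}
```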
10

Ozdemir, Senem Turan, Deniz Sigirli, Ilker Ercan, and N. Simsek Cankur. "Photographic Facial Soft Tissue Analysis of Healthy Turkish Young Adults: Anthropometric Measurements." Aesthetic Plastic Surgery 33, no. 2 (December 13, 2008): 175–84. http://dx.doi.org/10.1007/s00266-008-9274-z.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Facial expression – Photographic measurements"

1

Bennett, Troy. "Human-IntoFace.net : May 6th, 2003 /." access the artist's thesis portfolio on the Web, 2003. http://human-intoface.net/.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Facial expression – Photographic measurements"

1

Didi-Huberman, Georges. Invention of hysteria: Charcot and the photographic iconography of the Salpêtrière. Cambridge, Mass.: MIT Press, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Didi-Huberman, Georges. Invention of Hysteria: Charcot and the Photographic Iconography of the Salpetriere. The MIT Press, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Didi-Huberman, Georges. Invention of Hysteria: Charcot and the Photographic Iconography of the Salpetriere. The MIT Press, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Facial expression – Photographic measurements"

1

Lee, Will, Danielle Allessio, William Rebelsky, Sai Satish Gattupalli, Hao Yu, Ivon Arroyo, Margrit Betke, et al. "Measurements and Interventions to Improve Student Engagement Through Facial Expression Recognition." In Adaptive Instructional Systems, 286–301. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05887-5_20.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Weigel, Sigrid. "Faces: Between Trace and Image, Encoding and Measurement." In Grammatology of Images, 43–83. Fordham University Press, 2022. http://dx.doi.org/10.5422/fordham/9781531500153.003.0003.

Full text
Abstract:
The chapter analyses the 'facial expression' code in recent emotion research (neuropsychology) as a modern equivalent to the vera icon problem in the image of Christ's face, namely the image of the face representing inaccessible traces of something entirely heterogeneous: emotions on the one hand and Christ's absent body on the other. Departing from recent empirical emotion research, the data iconography of digital imaging, and the underlying 'Facial Action Coding System' (FACS), the chapter presents an archaeology of the components of the system: a limited number of facial movement patterns represented on the frontal face as a code or indicator of certain emotions. Whereas the equation of certain facial patterns with one of the emotions from the catalogue of emotions is traced back via Duchenne de Boulogne's physiognomic orthography to Le Brun, the schema of the frontal face refers to the en-face format or frontal icon in the history of the likeness. The chapter closes with examples from art and photography to demonstrate the large distance between the facial expression code and the multiplicity and partial unreadability of expressive faces in humans.
APA, Harvard, Vancouver, ISO, and other styles
3

Quatieri, Thomas F., and James R. Williamson. "Multimodal Biomarkers to Discriminate Cognitive State." In The Role of Technology in Clinical Neuropsychology. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190234737.003.0021.

Full text
Abstract:
Multimodal biomarkers based on behavioral, neurophysiological, and cognitive measurements have recently increased in popularity for the detection of cognitive stress and neurologically based disorders. Such conditions significantly and adversely affect human performance and quality of life in a large fraction of the world's population. Example modalities used in detection of these conditions include speech, facial expression, physiology, eye tracking, gait, and electroencephalography (EEG). Toward the goal of finding simple, noninvasive means to detect, predict, and monitor cognitive stress and neurological conditions, MIT Lincoln Laboratory is developing biomarkers that satisfy three criteria. First, we seek biomarkers that reflect core components of cognitive status, such as working memory capacity, processing speed, attention, and arousal. Second, and as importantly, we seek biomarkers that reflect timing and coordination relations both within components of each modality and across different modalities. This is based on the hypothesis that neural coordination across different parts of the brain is essential in cognition. An example of timing and coordination within a modality is the set of finely timed and synchronized physiological components of speech production, whereas an example of coordination across modalities is the timing and synchrony that occur between speech and facial expression during speaking. Third, we seek multimodal biomarkers that contribute in a complementary fashion under various channel and background conditions. In this chapter, as an illustration of the biomarker approach, we focus on cognitive stress and the particular case of detecting different cognitive load levels. We also briefly show how similar feature-extraction principles can be applied to a neurological condition through the example of major depressive disorder (MDD). MDD is one of several neuropsychiatric disorders where multimodal biomarkers based on principles of timing and coordination are important for detection (Cummins et al., 2015; Helfer et al., 2014; Quatieri & Malyska, 2012; Trevino, Quatieri, & Malyska, 2011; Williamson, Quatieri, Helfer, Ciccarelli, & Mehta, 2014; Williamson et al., 2013, 2015; Yu, Quatieri, Williamson, & Mundt, 2014).
APA, Harvard, Vancouver, ISO, and other styles
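One simple way to quantify the cross-modal 'timing and coordination' this chapter emphasizes is a lagged correlation between two synchronized feature streams, for example a vocal feature and a facial-movement feature. The sketch below is only a schematic illustration of that idea on synthetic signals; it is not the Lincoln Laboratory feature set or model.

```python
# Schematic sketch: lagged correlation between two synchronized feature
# streams (e.g. a speech feature and a facial-movement feature) as a crude
# index of cross-modal coordination. Both signals are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n = 500
speech_feat = rng.normal(size=n)
# The facial feature loosely follows the speech feature with a 5-sample delay.
face_feat = np.roll(speech_feat, 5) + 0.5 * rng.normal(size=n)

def lagged_corr(x: np.ndarray, y: np.ndarray, lag: int) -> float:
    """Pearson correlation of x[t] with y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return float(np.corrcoef(x, y)[0, 1])

best_lag = max(range(-20, 21), key=lambda k: lagged_corr(speech_feat, face_feat, k))
print("lag with strongest speech-face coupling:", best_lag)  # expected: 5
```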

Conference papers on the topic "Facial expression – Photographic measurements"

1

Santos, Patrick, Erbene de Castro Maia, Miriam Goubran, and Emil Petriu. "Facial expression communication for healthcare androids." In 2013 IEEE International Symposium on Medical Measurements and Applications (MeMeA). IEEE, 2013. http://dx.doi.org/10.1109/memea.2013.6549703.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Song, Kai-Tai, Meng-Ju Han, and Shuo-Hung Chang. "Pose-variant facial expression recognition using an embedded image system." In Fourth International Symposium on Precision Mechanical Measurements, edited by Yetai Fei, Kuang-Chao Fan, and Rongsheng Lu. SPIE, 2008. http://dx.doi.org/10.1117/12.819570.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Oz, I. A., and M. M. Khan. "Efficacy of biophysiological measurements at FTFPs for facial expression classification: A validation." In 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI). IEEE, 2012. http://dx.doi.org/10.1109/bhi.2012.6211519.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Chen, Qing, Marius D. Cordea, Emil M. Petriu, Thomas E. Whalen, Imre J. Rudas, and Annamaria Varkonyi-Koczy. "Hand-Gesture and Facial-Expression Human-Computer Interfaces for Intelligent Space Applications." In 2008 IEEE International Workshop on Medical Measurements and Applications (MeMeA). IEEE, 2008. http://dx.doi.org/10.1109/memea.2008.4542987.

Full text
APA, Harvard, Vancouver, ISO, and other styles