A selection of scholarly literature on the topic "Facial expression manipulation"

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Browse lists of current articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "Facial expression manipulation".

Next to each work in the list of references you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style of your choice: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, if these details are available in the source metadata.

Journal articles on the topic "Facial expression manipulation"

1

Kobai, Ryota, and Hiroki Murakami. "Effects of interactions between facial expressions and self-focused attention on emotion." PLOS ONE 16, no. 12 (December 23, 2021): e0261666. http://dx.doi.org/10.1371/journal.pone.0261666.

Full text of the source
Abstract:
Self-focus is a type of cognitive processing that maintains negative emotions. Moreover, bodily feedback is also essential for maintaining emotions. This study investigated the effect of interactions between self-focused attention and facial expressions on emotions. The results indicated that the control facial expression manipulation after self-focus reduced happiness scores. In contrast, the smiling facial expression manipulation after self-focus marginally increased happiness scores. However, facial expressions did not affect positive emotions after the other-focus manipulation. These findings suggest that self-focus plays a pivotal role in the effect of facial expressions on positive emotions. However, self-focus alone is insufficient for decreasing positive emotions; the interaction between self-focus and facial expressions is crucial for developing positive emotions.
APA, Harvard, Vancouver, ISO, and other styles
2

Geng, Zhenglin, Chen Cao, and Sergey Tulyakov. "Towards Photo-Realistic Facial Expression Manipulation." International Journal of Computer Vision 128, no. 10-11 (August 28, 2020): 2744–61. http://dx.doi.org/10.1007/s11263-020-01361-8.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Kuehne, Maria, Isabelle Siwy, Tino Zaehle, Hans-Jochen Heinze, and Janek S. Lobmaier. "Out of Focus: Facial Feedback Manipulation Modulates Automatic Processing of Unattended Emotional Faces." Journal of Cognitive Neuroscience 31, no. 11 (November 2019): 1631–40. http://dx.doi.org/10.1162/jocn_a_01445.

Full text of the source
Abstract:
Facial expressions provide information about an individual's intentions and emotions and are thus an important medium for nonverbal communication. Theories of embodied cognition assume that facial mimicry and the resulting facial feedback play an important role in the perception of facial emotional expressions. Although behavioral and electrophysiological studies have confirmed the influence of facial feedback on the perception of facial emotional expressions, its influence on the automatic processing of such stimuli is largely unexplored. The automatic processing of unattended facial expressions can be investigated by means of the visual expression-related mismatch negativity (MMN). The expression-related MMN reflects a differential ERP response to the automatic detection of emotional changes elicited by rarely presented facial expressions (deviants) among frequently presented facial expressions (standards). In this study, we investigated the impact of facial feedback on the automatic processing of facial expressions. For this purpose, participants (n = 19) performed a centrally presented visual detection task while neutral (standard), happy, and sad faces (deviants) were presented peripherally. During the task, facial feedback was manipulated by different pen-holding conditions (holding the pen with the teeth, lips, or nondominant hand). Our results indicate that the automatic processing of facial expressions is influenced by, and thus dependent on, one's own facial feedback.
APA, Harvard, Vancouver, ISO, and other styles
4

Lewczuk, Joanna. "Change of social value orientation affected by the observed mimical expression of the interaction partner." Studia z Teorii Wychowania X, no. 4 (29) (December 25, 2019): 85–106. http://dx.doi.org/10.5604/01.3001.0014.1075.

Full text of the source
Abstract:
The issues addressed in this paper relate to a possible change in the observer's social value orientation under the influence of a specific emotional expression perceived on another individual's face. The paper fits into the line of research into the link between social value orientations and the perception of facial emotional expressions. An "omnibus"-type representative survey was carried out according to the experimental scheme, entirely via the Internet (N = 972). The following tools were used: for the measurement of social value orientations, a modified version of the Ring Measure of Social Values; for the experimental manipulation, photographs of facial expressions (happiness, anger, neutrality). In the light of the data obtained, one may, for the first time, speak of social value orientations as a dimension susceptible to change under the influence of a facial expression. The indicators of orientation on others, and the distribution of the groups of dominant social value orientations before and after the experimental manipulation, varied depending on the type of basic facial emotional expression presented (happiness vs. anger). Directional predictions were confirmed with regard to the negative manipulation (expression of anger), which was followed by a reduction in the orientation on others and in the total number of altruists, while the positive manipulation (expression of happiness) resulted in a general increase in the number of altruists. This is in line with the prediction that observation of a positive facial expression triggers prosocial tendencies, while observation of a negative facial expression undermines them.
Keywords: social value orientations; prosociality; orientation on the self/orientation on the others; variability of social value orientations; Ring Measure of Social Values; facial emotional expressions
APA, Harvard, Vancouver, ISO, and other styles
5

Dethier, Marie, Sylvie Blairy, Hannah Rosenberg, and Skye McDonald. "Emotional Regulation Impairments Following Severe Traumatic Brain Injury: An Investigation of the Body and Facial Feedback Effects." Journal of the International Neuropsychological Society 19, no. 4 (January 28, 2013): 367–79. http://dx.doi.org/10.1017/s1355617712001555.

Full text of the source
Abstract:
The object of this study was to evaluate the combined effect of body and facial feedback in adults who had suffered a severe traumatic brain injury (TBI), in order to gain some understanding of their difficulties in the regulation of negative emotions. Twenty-four participants with TBI and 28 control participants adopted facial expressions and body postures according to specific instructions and maintained these positions for 10 s. Expressions and postures entailed anger, sadness, and happiness, as well as a neutral (baseline) condition. After each expression/posture manipulation, participants evaluated their subjective emotional state (including cheerfulness, sadness, and irritation). TBI participants were globally less responsive to the effects of body and facial feedback than control participants, F(1,50) = 5.89, p = .02, η2 = .11. More interestingly, the TBI group differed from the control group across emotions, F(8,400) = 2.51, p = .01, η2 = .05. Specifically, participants with TBI were responsive to happy but not to negative expression/posture manipulations, whereas control participants were responsive to happy, angry, and sad expression/posture manipulations. In conclusion, TBI appears to impair the ability to recognize both the physical configuration of a negative emotion and its associated subjective feeling. (JINS, 2013, 19, 1–13)
APA, Harvard, Vancouver, ISO, and other styles
6

Pollick, Frank E., Harold Hill, Andrew Calder, and Helena Paterson. "Recognising Facial Expression from Spatially and Temporally Modified Movements." Perception 32, no. 7 (July 2003): 813–26. http://dx.doi.org/10.1068/p3319.

Full text of the source
Abstract:
We examined how the recognition of facial emotion was influenced by manipulation of both spatial and temporal properties of 3-D point-light displays of facial motion. We started with the measurement of 3-D position of multiple locations on the face during posed expressions of anger, happiness, sadness, and surprise, and then manipulated the spatial and temporal properties of the measurements to obtain new versions of the movements. In two experiments, we examined recognition of these original and modified facial expressions: in experiment 1, we manipulated the spatial properties of the facial movement, and in experiment 2 we manipulated the temporal properties. The results of experiment 1 showed that exaggeration of facial expressions relative to a fixed neutral expression resulted in enhanced ratings of the intensity of that emotion. The results of experiment 2 showed that changing the duration of an expression had a small effect on ratings of emotional intensity, with a trend for expressions with shorter durations to have lower ratings of intensity. The results are discussed within the context of theories of encoding as related to caricature and emotion.
APA, Harvard, Vancouver, ISO, and other styles
7

Lin, Wenyi, Jing Hu, and Yanfei Gong. "Is it Helpful for Individuals with Minor Depression to Keep Smiling? An Event-Related Potentials Analysis." Social Behavior and Personality: an international journal 43, no. 3 (April 23, 2015): 383–96. http://dx.doi.org/10.2224/sbp.2015.43.3.383.

Full text of the source
Abstract:
We used event-related potentials (ERPs) to explore the influence of manipulating facial expression on error monitoring in individuals. The participants were 11 undergraduate students who had been diagnosed with minor depression (MinD). We recorded error-related negativity (ERN) as the participants performed a modified flanker task in 3 conditions: Duchenne smile, standard smile, and no smile. Behavioral data results showed that, in both the Duchenne smile and standard smile conditions, error rates were significantly lower than in the no-smile condition. The ERP analysis results indicated that, compared to the no-smile condition, both Duchenne and standard smiling facial expressions decreased ERN amplitude, and ERN amplitudes were smallest for those in the Duchenne smile condition. Our findings suggested that even brief smile manipulation may improve long-term negative mood states of people with MinD.
APA, Harvard, Vancouver, ISO, and other styles
8

Guo, Zixin, and Ruizhi Yang. "A channel attention and feature manipulation network for facial expression recognition." Applied and Computational Engineering 6, no. 1 (June 14, 2023): 1344–54. http://dx.doi.org/10.54254/2755-2721/6/20230751.

Full text of the source
Abstract:
Facial expression conveys a variety of emotional and intentional messages from human beings, and automated facial expression recognition (FER) has become an ongoing and promising research topic in the field of computer vision. However, the primary challenge of FER is learning to discriminate similar features among different emotion categories. In this paper, a hybrid architecture using an Efficient Channel Attention (ECA) residual network, ResNet-18, and a feature manipulation network is proposed to tackle this challenge. First, the ECA residual network effectively extracts input features with local cross-channel interaction. Then, the feature decomposition network (FDN) and feature reconstruction network (FRN) modules are added to decompose and aggregate latent features, enhancing the compactness of intra-category features and the discrimination of inter-category features. Finally, an expression prediction network is connected to the FRN to produce the final expression classification result. To examine the efficacy of the suggested approach, the model is trained independently on an in-the-lab (CK+) and an in-the-wild (RAF-DB) dataset. Several important evaluation metrics, such as the confusion matrix and Grad-CAM, are reported, and an ablation study is conducted to demonstrate the efficacy and interpretability of the proposed network. It achieves state-of-the-art accuracy compared to existing facial expression recognition work: 99.70% on CK+ and 89.17% on RAF-DB.
APA, Harvard, Vancouver, ISO, and other styles
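The channel-attention step described in the abstract above can be sketched in a few lines. This is an illustrative toy version under stated assumptions: real ECA learns the weights of the 1-D convolution during training, whereas the kernel below is a fixed averaging filter, and `eca_attention` is a hypothetical helper name, not the authors' code:

```python
import numpy as np

def eca_attention(feature_map, kernel_size=3):
    """Toy Efficient Channel Attention: reweight channels using a
    lightweight 1-D convolution over the pooled channel descriptor.
    feature_map: array of shape (C, H, W)."""
    # 1. Squeeze: global average pooling per channel -> vector of length C
    descriptor = feature_map.mean(axis=(1, 2))
    # 2. Local cross-channel interaction: 1-D conv with a shared kernel
    #    (fixed averaging weights here; ECA learns these in practice)
    kernel = np.full(kernel_size, 1.0 / kernel_size)
    pad = kernel_size // 2
    padded = np.pad(descriptor, pad, mode="edge")
    interacted = np.convolve(padded, kernel, mode="valid")  # length C again
    # 3. Excite: sigmoid gate in (0, 1), then rescale each channel map
    gate = 1.0 / (1.0 + np.exp(-interacted))
    return feature_map * gate[:, None, None]

x = np.random.rand(8, 4, 4)   # 8 channels of a 4x4 feature map
y = eca_attention(x)
```

Because the gate is computed from only a small neighbourhood of channels, the module adds almost no parameters compared with a full squeeze-and-excitation block, which is the point of ECA's "local cross-channel interaction".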
9

Yi, Jonathan, Philip Pärnamets, and Andreas Olsson. "The face value of feedback: facial behaviour is shaped by goals and punishments during interaction with dynamic faces." Royal Society Open Science 8, no. 7 (July 2021): 202159. http://dx.doi.org/10.1098/rsos.202159.

Full text of the source
Abstract:
Responding appropriately to others' facial expressions is key to successful social functioning. Despite the large body of work on face perception and spontaneous responses to static faces, little is known about responses to faces in dynamic, naturalistic situations, and no study has investigated how goal-directed responses to faces are influenced by learning during dyadic interactions. To experimentally model such situations, we developed a novel method based on online integration of electromyography signals from the participants' face (corrugator supercilii and zygomaticus major) during facial expression exchange with dynamic faces displaying happy and angry facial expressions. Fifty-eight participants learned by trial and error to avoid receiving aversive stimulation by either reciprocating (congruently) or responding opposite (incongruently) to the expression of the target face. Our results validated our method, showing that participants learned to optimize their facial behaviour, and replicated earlier findings of faster and more accurate responses in congruent versus incongruent conditions. Moreover, participants performed better on trials when confronted with smiling, compared with frowning, faces, suggesting it might be easier to adapt facial responses to positively associated expressions. Finally, we applied drift-diffusion and reinforcement learning models to provide a mechanistic explanation for our findings, which helped clarify the decision-making processes underlying our experimental manipulation. Our results introduce a new method to study learning and decision-making in facial expression exchange, in which there is a need to gradually adapt facial expression selection to both social and non-social reinforcements.
APA, Harvard, Vancouver, ISO, and other styles
10

Boker, Steven M., Jeffrey F. Cohn, Barry-John Theobald, Iain Matthews, Timothy R. Brick, and Jeffrey R. Spies. "Effects of damping head movement and facial expression in dyadic conversation using real–time facial expression tracking and synthesized avatars." Philosophical Transactions of the Royal Society B: Biological Sciences 364, no. 1535 (December 12, 2009): 3485–95. http://dx.doi.org/10.1098/rstb.2009.0152.

Full text of the source
Abstract:
When people speak with one another, they tend to adapt their head movements and facial expressions in response to each other's head movements and facial expressions. We present an experiment in which confederates' head movements and facial expressions were motion tracked during videoconference conversations, an avatar face was reconstructed in real time, and naive participants spoke with the avatar face. No naive participant guessed that the computer generated face was not video. Confederates' facial expressions, vocal inflections and head movements were attenuated at 1 min intervals in a fully crossed experimental design. Attenuated head movements led to increased head nods and lateral head turns, and attenuated facial expressions led to increased head nodding in both naive participants and confederates. Together, these results are consistent with a hypothesis that the dynamics of head movements in dyadic conversation include a shared equilibrium. Although both conversational partners were blind to the manipulation, when apparent head movement of one conversant was attenuated, both partners responded by increasing the velocity of their head movements.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations on the topic "Facial expression manipulation"

1

Yan, Sen. "Personalizing facial expressions by exploring emotional mental prototypes." Electronic Thesis or Diss., CentraleSupélec, 2023. http://www.theses.fr/2023CSUP0002.

Full text of the source
Abstract:
Facial expressions are an essential form of nonverbal communication. Today, facial expression manipulation (FEM) techniques have flooded our daily lives. However, in the application context, several requirements need to be addressed. Diversity: facial expression prototypes should be multiple and differ between users. Flexibility: facial expressions should be personalized, i.e., the system should find the facial expression prototype that meets the needs of the users. Exhaustiveness: most FEM technologies can only deal with the six basic emotions, whereas there are more than 4000 emotion labels. Expertise-free: the FEM system should be controllable by anyone without the need for expert knowledge (e.g., psychologists). Efficiency: an interactive system should take user fatigue into account. In this thesis, to fulfill all these requirements, we proposed an interdisciplinary approach combining generative adversarial networks with the psychophysical reverse-correlation process. Moreover, we created an interactive microbial genetic algorithm to optimize the entire system.
APA, Harvard, Vancouver, ISO, and other styles
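The microbial genetic algorithm mentioned in the abstract above is a minimalist steady-state GA in which the loser of a random pairwise tournament is partly overwritten by the winner ("infection") and then mutated. The sketch below is a non-interactive toy on a ones-max fitness, not the thesis's system; the function name `microbial_ga` and all parameter values are illustrative assumptions:

```python
import random

def microbial_ga(fitness, genome_len=16, pop_size=20,
                 tournaments=2000, rec_rate=0.5, mut_rate=0.05, seed=1):
    """Minimal microbial GA: pick two random individuals, the loser is
    partly overwritten by the winner's genes and then mutated in place."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(tournaments):
        a, b = rng.sample(range(pop_size), 2)
        win, lose = (a, b) if fitness(pop[a]) >= fitness(pop[b]) else (b, a)
        for i in range(genome_len):
            if rng.random() < rec_rate:   # infection: copy gene from winner
                pop[lose][i] = pop[win][i]
            if rng.random() < mut_rate:   # mutation: flip the bit
                pop[lose][i] ^= 1
    return max(pop, key=fitness)

# Toy usage: maximize the number of 1-bits in the genome (ones-max)
best = microbial_ga(sum)
```

Because only the loser is modified, good solutions persist without explicit elitism, which is what makes this scheme attractive for interactive settings where each fitness evaluation (here, a user judgement) is expensive.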
2

Lang, Charlene Jasmin. "The relationship between depressive symptoms, rumination and sensitivity to emotion specified in facial expressions." Thesis, University of Canterbury. Psychology, 2011. http://hdl.handle.net/10092/5347.

Full text of the source
Abstract:
In social interactions it is important for perceivers to be able to differentiate between facial expressions of emotion associated with a congruent emotional experience (genuine expressions) and those that are not (posed expressions). This research investigated the sensitivity of participants with a range of depressive symptom severity and varying levels of rumination to the differences between genuine and posed facial expressions. The suggested mechanisms underlying impairments in emotion recognition were also investigated: the effect of cognitive load (as a distraction from deliberate processing of stimuli) and attention, and the relationships between these mechanisms and sensitivity across a range of depressive symptoms and levels of rumination. Participants completed an emotion categorisation task in which they were asked if targets were showing either happiness or sadness, and then if targets were feeling those emotions. Participants also completed the same task under cognitive load. In addition, a recognition task was used to measure attention. Results showed that when making judgements about whether targets were feeling sad, lower sensitivity was related to higher levels of depressive symptoms, but contrary to predictions, only when under cognitive load. Depressive symptoms and rumination were not related to higher levels of bias towards sad expressions. Recognition did not show a relationship with sensitivity, rumination or depression scores. Cognitive load did not show the expected effect of improving sensitivity but instead showed lower sensitivity scores in some conditions compared to conditions without load. Implications of the results are discussed, as well as directions for future research.
APA, Harvard, Vancouver, ISO, and other styles
3

Chan, Kai-Fu, and 詹凱富. "Manipulation of human facial expressions via finite elements." Thesis, 2000. http://ndltd.ncl.edu.tw/handle/44231864906180191850.

Full text of the source
Abstract:
Master's thesis
National Chiao Tung University
Department of Mechanical Engineering
Academic year 88 (ROC calendar)
In this thesis, we analyze human facial deformation due to applied muscle force by the finite element method and use the resulting deformations to control facial expressions. Each node of the finite element, composed of DKT and CST elements, has six degrees of freedom. In addition, we discuss the relevant computer graphics background and the muscle distribution of the human face and its functions. We present the results of our experiment as color pictures contrasting the expressions before and after the force application. The main purpose of this work is to offer future researchers directions for building better simulations based on our experimental results, and to provide useful reference information for adding facial expressions to robots.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Facial expression manipulation"

1

Wittmer, Virgil Theodore. Defensive style: Manipulation of facial expressiveness and its impact on psychophysiological reactivity and self-reported anxiety. 1985.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other styles
2

Averbach, Patrina. Understand People : Laws about the Manipulation of the Human Mind: Read Facial Expressions. Independently Published, 2021.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Institute, Psychology, and Jason Moonie. How to Analyze People with Dark Psychology: Learn Secrets and Techniques to Speed Reading People with NLP, Emotional Intelligence, Body Language, Facial Expressions and Mind Manipulation Easily. Independently Published, 2021.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

Griffith, Joseph. Analyze People with Body Language Reading: The Ultimate Guide to Speed-Reading of Human Personality Types by Analyzing Body Language, Facial Expressions, Manipulation, and by Learning to Decode People. Independently Published, 2021.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Facial expression manipulation"

1

Ling, Jun, Han Xue, Li Song, Shuhui Yang, Rong Xie, and Xiao Gu. "Toward Fine-Grained Facial Expression Manipulation." In Computer Vision – ECCV 2020, 37–53. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58604-1_3.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
2

Tolosana, Ruben, Ruben Vera-Rodriguez, Julian Fierrez, Aythami Morales, and Javier Ortega-Garcia. "An Introduction to Digital Face Manipulation." In Handbook of Digital Face Manipulation and Detection, 3–26. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-87664-7_1.

Full text of the source
Abstract:
Digital manipulation has become a thriving topic in the last few years, especially after the popularity of the term DeepFakes. This chapter introduces the prominent digital manipulations, with special emphasis on facial content due to its large number of possible applications. Specifically, we cover the principles of six types of digital face manipulation: (i) entire face synthesis, (ii) identity swap, (iii) face morphing, (iv) attribute manipulation, (v) expression swap (a.k.a. face reenactment or talking faces), and (vi) audio- and text-to-video. These six main types of face manipulation are well established by the research community, having received the most attention in the last few years. In addition, we highlight in this chapter publicly available databases and code for the generation of digital fake content.
APA, Harvard, Vancouver, ISO, and other styles
3

Matsumoto, Asami, Yuta Tange, Atsushi Nakazawa, and Toyoaki Nishida. "Estimation of Task Difficulty and Habituation Effect While Visual Manipulation Using Pupillary Response." In Video Analytics. Face and Facial Expression Recognition and Audience Measurement, 24–35. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-56687-0_3.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

Amberg, Brian, Pascal Paysan, and Thomas Vetter. "Weight, Sex, and Facial Expressions: On the Manipulation of Attributes in Generative 3D Face Models." In Advances in Visual Computing, 875–85. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-10331-5_81.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
5

Dodds, Sherril. "CRY." In Facial Choreographies, 111–36. Oxford University PressNew York, 2024. http://dx.doi.org/10.1093/oso/9780197620366.003.0005.

Full text of the source
Abstract:
The CRY chapter analyzes Maddie Ziegler's emotive performance as a surrogate face for Sia in her music video "Big Girls Cry." The video shows Ziegler performing an arty facial choreography that both moves and disturbs: young female spectators and music critics speak of the affective capacity of her performance, and feminist scholarship illuminates how it critiques the expectations placed on girls and women, which reduce their emotional expression to that of cry-babies yet demand that it be kept in check in adulthood. Yet literature on celebrity and whiteness reveals how Sia replaces her own face with that of a beautiful child star to ensure that her economic and racial privilege is maintained. The chapter develops the idea of a manipulative cry to show how the facial choreography produces affective connections between Sia, Ziegler, and their fans, while staging an ethically questionable performance of a young girl.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Facial expression manipulation"

1

Wang, Feng, Suncheng Xiang, Ting Liu, and Yuzhuo Fu. "Attention Based Facial Expression Manipulation." In 2021 IEEE International Conference on Multimedia & Expo Workshops (ICMEW). IEEE, 2021. http://dx.doi.org/10.1109/icmew53276.2021.9456007.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
2

Rajyaguru, Keval, Srimanta Mandal, and Suman K. Mitra. "Temporally Consistent Video Manipulation for Facial Expression Transfer." In 2022 IEEE IAS Global Conference on Emerging Technologies (GlobConET). IEEE, 2022. http://dx.doi.org/10.1109/globconet53749.2022.9872385.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Kim, Seongho, and Byung Cheol Song. "Emotional Intensity-Aware Learning for Facial Expression Manipulation." In 2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC). IEEE, 2023. http://dx.doi.org/10.1109/itc-cscc58803.2023.10212641.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

Zhu, Jiashu, and Junfei Huang. "LGA-GAN: landmarks guided attentive generative adversarial network for facial expression manipulation." In 2021 International Conference on Computer Vision and Pattern Analysis, edited by Ruimin Hu, Yang Yue, and Siting Chen. SPIE, 2022. http://dx.doi.org/10.1117/12.2626924.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
5

Mazaheri, Ghazal, and Amit K. Roy-Chowdhury. "Detection and Localization of Facial Expression Manipulations." In 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV). IEEE, 2022. http://dx.doi.org/10.1109/wacv51458.2022.00283.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
6

Arya, G. J., K. S. Ajish Kumar, and R. Rajasree. "Synthesize of emotional facial expressions through manipulating facial parameters." In 2014 International Conference on Control, Instrumentation, Communication and Computational Technologies (ICCICCT). IEEE, 2014. http://dx.doi.org/10.1109/iccicct.2014.6993088.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
7

Banerjee, Sandipan, Ajjen Joshi, Prashant Mahajan, Sneha Bhattacharya, Survi Kyal, and Taniya Mishra. "LEGAN: Disentangled Manipulation of Directional Lighting and Facial Expressions whilst Leveraging Human Perceptual Judgements." In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE, 2021. http://dx.doi.org/10.1109/cvprw53098.2021.00169.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
8

Suzuki, K., Y. Takeuchi, and J. Heo. "THE EFFECT OF LIGHTING ENVIRONMENT ON FACIAL EXPRESSION PERCEPTION IN VIDEO TELECONFERENCING." In CIE 2021 Conference. International Commission on Illumination, CIE, 2021. http://dx.doi.org/10.25039/x48.2021.op51.

Full text of the source
Abstract:
In this study, we investigated whether manipulating the lighting environment in videoconferencing changes the readability of facial expressions. In the experiment, participants were asked to evaluate their impressions of a video that simulated the situation in a videoconference. A total of 12 lighting conditions were used, combining three colour-temperature conditions and four lighting-direction conditions. Factor analysis identified four factors: "Clarity," "Dynamism," "Naturalness," and "Healthiness." The ANOVA results showed that placing the lighting in front of the subject was effective for all four factors. In addition, while a lower colour temperature decreased clarity, it improved naturalness and healthiness, and was particularly effective when the lighting was placed in front of the subject.
APA, Harvard, Vancouver, ISO, and other styles
9

Zak, Monika, Bruno Laeng, and Joschua T. Simon-Liedtke. "Can the recognition of emotional expressions be enhanced by manipulating facial skin color?" In 2015 Colour and Visual Computing Symposium (CVCS). IEEE, 2015. http://dx.doi.org/10.1109/cvcs.2015.7274897.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
10

Aravot, Iris. "An Attempt at Making Urban Design Principles Explicit." In 1995 ACSA International Conference. ACSA Press, 1995. http://dx.doi.org/10.35483/acsa.intl.1995.42.

Full text of the source
Abstract:
Since its rise as an autonomous field in the seventies, Urban Design has been a conglomerate of diverse concepts and value outlooks. The present approach, an a posteriori propositional expression of applications in actual practice and education, presents both theory and method by means of ten points. The approach is basically generated by formal considerations, thus originating in and focussing on aspects which cannot be expressed through the theory and methods of other disciplines. It starts with systematic, conventional and objective studies, which are then connected to a system of manipulations (the rules of the game) that emphasizes interpretation and is clarified by narrative and formal metaphors. The rules of the game set a framework with no a priori preferred contents, which is then applied according to local characteristics, needs and potentials. This conceptual-interpretative framework imposes a structural, consistent and hierarchical system on the factual data, so as to assure the realization of two apparently opposed values: (1) unity and phenomenological qualities, and (2) free development and unfolding of the design. The propositional expression of the approach aims at its exposure to explicit evaluation and criticism.
APA, Harvard, Vancouver, ISO, and other styles
