Scientific literature on the topic "Facial cue"
Create an accurate reference in APA, MLA, Chicago, Harvard, and several other styles
Consult thematic lists of journal articles, books, theses, conference proceedings, and other academic sources on the topic "Facial cue".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online when this information is included in the metadata.
Journal articles on the topic "Facial cue"
Coetzee, Vinet, David I. Perrett, and Ian D. Stephen. "Facial Adiposity: A Cue to Health?" Perception 38, no. 11 (January 2009): 1700–1711. http://dx.doi.org/10.1068/p6423.
Russell, Richard, Aurélie Porcheron, Jennifer Sweda, Emmanuelle Mauger, and Frederique Morizot. "Facial contrast is a cue for health perception." Journal of Vision 15, no. 12 (September 1, 2015): 1213. http://dx.doi.org/10.1167/15.12.1213.
Watanabe, Noriya, Masahiko Haruno, and Masamichi Sakagami. "Emotional facial expression accelerates cue-reward association learning." Neuroscience Research 68 (January 2010): e291. http://dx.doi.org/10.1016/j.neures.2010.07.1292.
Quist, Michelle C., Christopher D. Watkins, Finlay G. Smith, Lisa M. DeBruine, and Benedict C. Jones. "Facial masculinity is a cue to women's dominance." Personality and Individual Differences 50, no. 7 (May 2011): 1089–93. http://dx.doi.org/10.1016/j.paid.2011.01.032.
Law Smith, M. J., D. I. Perrett, B. C. Jones, R. E. Cornwell, F. R. Moore, D. R. Feinberg, L. G. Boothroyd, et al. "Facial appearance is a cue to oestrogen levels in women." Proceedings of the Royal Society B: Biological Sciences 273, no. 1583 (November 2005): 135–40. http://dx.doi.org/10.1098/rspb.2005.3296.
Boothroyd, Lynda G., Isabel Scott, Alan W. Gray, Claire I. Coombes, and Nicholas Pound. "Male Facial Masculinity as a Cue to Health Outcomes." Evolutionary Psychology 11, no. 5 (December 2013): 147470491301100. http://dx.doi.org/10.1177/147470491301100508.
Carré, Justin M., Cheryl M. McCormick, and Catherine J. Mondloch. "Facial Structure Is a Reliable Cue of Aggressive Behavior." Psychological Science 20, no. 10 (October 2009): 1194–98. http://dx.doi.org/10.1111/j.1467-9280.2009.02423.x.
Roberts, S. Craig, Tamsin K. Saxton, Alice K. Murray, Robert P. Burriss, Hannah M. Rowland, and Anthony C. Little. "Static and Dynamic Facial Images Cue Similar Attractiveness Judgements." Ethology 115, no. 6 (June 2009): 588–95. http://dx.doi.org/10.1111/j.1439-0310.2009.01640.x.
Porcheron, Aurélie, Emmanuelle Mauger, Frédérique Soppelsa, Richard Russell, and Frédérique Morizot. "Facial contrast is a universal cue for perceiving age." Journal of Vision 15, no. 12 (September 1, 2015): 1222. http://dx.doi.org/10.1167/15.12.1222.
DeBruine, Lisa M. "Trustworthy but not lust-worthy: context-specific effects of facial resemblance." Proceedings of the Royal Society B: Biological Sciences 272, no. 1566 (May 7, 2005): 919–22. http://dx.doi.org/10.1098/rspb.2004.3003.
Theses on the topic "Facial cue"
Horowitz, Erin J. "Facial Information as a Minimal Cue of Animacy." Thesis, University of California, Santa Barbara, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10784374.
The tendency for humans to give preferential attention to animate agents in their immediate surroundings has been well documented and likely reflects an evolved specialization to a persistent adaptive problem. In uncertain or ambiguous cases, this tendency can result in over-detection of animacy, as the potential costs of failing to detect an animate agent far outweigh those of mistaken identification. In line with this, it seems likely that humans have evolved a sensitivity to specific cues indicative of animacy, such that the mere presence of these cues leads to detection regardless of the objective category membership of the entity in question. There exists a wealth of research speaking to this effect with regard to motion cues, specifically the capacity for self-propulsion and goal-directed action. Morphological cues have also been implicated, most especially the presence of facial features, as they indicate a capacity for perceptual feedback from the environment, which is essential for goal-directed motion. However, it remains an open question whether animacy detection is similarly sensitive to facial information in the absence of motion cues.
The experiments reported here addressed this question with a novel task in which participants were asked to judge the animacy or inanimacy (or membership in animal or object categories) of different images: animals with and without visible facial features, and objects with and without visible facial features. Beyond replicating a general advantage for detecting animate agents over inanimate objects, the primary prediction was that facial features would have a differential effect on performance: they would improve performance when visible in animals and hinder performance when visible in objects. Experiments 1a and 1b provided a preliminary confirmation of this pattern of responses using images of familiar and unfamiliar animals (e.g., dogs versus jellyfish) and unaltered images of objects with and without faces. Experiment 2 improved on this design by more closely matching the sets of images (the same animals facing toward or away from the camera, and objects with faces that had been digitally altered to disrupt the facial features) and by changing the task prompt from yes/no judgments of animacy to categorization into animal or object groups. Experiment 3 examined the face inversion effect, the failure to recognize familiar faces when their orientation is inverted, on animal-object categorization. Lastly, Experiments 4 and 5 extended the findings from Experiment 2 to preschool-aged children, using a card-sorting task (Experiment 4) and a computerized animal detection task (Experiment 5). The results of this series of experiments highlight the prominent role of facial features in detecting animate agents in one's surroundings.
Han, Chengyang. "Facial appearance as a cue of physical condition." Thesis, University of Glasgow, 2018. http://theses.gla.ac.uk/8788/.
Eldblom, Hans. "Facial width-to-height ratio as a cue of threat: An ERP study." Thesis, Högskolan i Skövde, Institutionen för biovetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15570.
BACCOLO, ELISA. "It's written all over your face. The ontogeny of sensitivity to facial cues to trustworthiness." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2020. http://hdl.handle.net/10281/277385.
Human beings are hypersensitive to the facial properties that convey social signals. The ability to make trustworthiness judgements based on facial cues to trustworthiness, i.e. the cues we use to decide whether a person can be safely approached or is better avoided, is known to be fast, automatic, and based on very little information. This doctoral dissertation investigates: (1) whether sensitivity to facial cues to trustworthiness is modulated by individual variation in social personality characteristics; (2) the developmental trajectory of this sensitivity; (3) whether sensitivity to subtle variations in facial cues to trustworthiness is a universal phenomenon or is modulated by culture and/or face ethnicity. Chapter 1 investigated whether individual differences in fine-grained perceptual sensitivity and in the mental representation of facial features related to trustworthiness judgements are associated with individual differences in social motivation. Results showed that individual differences in social motivation can affect the amount of social experience and thus the level of sensitivity to facial cues to trustworthiness that develops. Chapter 2 focused on the developmental trajectory of this sensitivity. Study 2 investigated how perceptual sensitivity to, and mental representation of, fine-grained differences in the facial information underlying social perception of trustworthiness develop over time, taking into account individual differences in emotional development. Results showed that sensitivity to facial cues to trustworthiness, and the ability to use these cues to generate trustworthiness judgements, is present in the preschool years but matures to adult-like levels at the age of 7, developing together with emotion-understanding abilities. Studies 3 and 4 used two different EEG paradigms with 6-month-old infants to ask whether this sensitivity is already present in the first year of life.
Combined data from Studies 3 and 4 show that 6-month-old infants are sensitive to the facial cues that are later used to generate trustworthiness judgements. Finally, Chapter 3 presents a validation of stimuli that will be used to explore developmental cross-cultural differences in the perception of face trustworthiness. Overall, the studies presented suggest that sensitivity to facial cues to trustworthiness emerges in the very first years of life and is then refined by experience over the course of development. They also suggest that trustworthiness perception may be cross-cultural, as it is not influenced by the experience an individual gains with a particular face category.
Mazefsky, Carla Ann. "Emotion Perception in Asperger's Syndrome and High-functioning Autism: The Importance of Diagnostic Criteria and Cue Intensity." VCU Scholars Compass, 2004. http://scholarscompass.vcu.edu/etd/1449.
Boraston, Zillah Louise. "Emotion recognition from facial and non-facial cues." Thesis, University College London (University of London), 2008. http://discovery.ucl.ac.uk/1445207/.
VIEIRA, Tiago Figueiredo. "Identifying Kinship Cues from Facial Images." Universidade Federal de Pernambuco, 2013. https://repositorio.ufpe.br/handle/123456789/13315.
The investigation of human face images is ubiquitous in pattern analysis and image processing research. Traditional approaches are related to face identification and verification, but several other areas are emerging, such as age and expression estimation, analysis of facial similarity and attractiveness, and automatic kinship recognition. Although the latter could have applications in fields such as image retrieval and annotation, little work in this area has been presented so far. This thesis presents an algorithm able to discriminate between siblings and unrelated individuals based on their face images. In this context, a great challenge was the lack of a benchmark in kinship analysis; for this reason, a high-quality dataset of images of sibling pairs was collected. This is a relevant contribution to the research community and is particularly useful for avoiding the potential problems caused by the low-quality pictures and uncontrolled imaging conditions of the heterogeneous datasets used in previous research. The database includes frontal, profile, expressionless, and smiling faces of sibling pairs. Based on these images, various classifiers were constructed using feature-based and holistic techniques to investigate which data are most effective for discriminating siblings from non-siblings. The features were first tested individually, and then the most significant face data were supplied to a single algorithm. The siblings classifier was found to outperform human raters on all datasets. The discrimination capability of the algorithm was also tested by applying the classifiers to a low-quality database of images collected from the Internet in a cross-database experiment.
The knowledge acquired from the analysis of siblings fostered a similar algorithm able to discriminate parent-child pairs from unrelated individuals. The results obtained in this thesis have an impact on image retrieval and annotation, forensics, genealogical research, and finding missing family members.
Scott, Naomi. "Facial cues to mental health symptoms." Thesis, Bangor University, 2015. https://research.bangor.ac.uk/portal/en/theses/facial-cues-to-mental-health-symptoms(1f1fa702-18f7-435c-ad59-05c59dccaec2).html.
Stoyanova, Raliza. "Contextual influences on perception of facial cues." Thesis, University of Cambridge, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.608041.
Fisher, Claire. "Social perception of facial cues of adiposity." Thesis, University of Glasgow, 2017. http://theses.gla.ac.uk/8334/.
Texte intégralLivres sur le sujet "Facial cue"
Che faccia fare. Milano : Feltrinelli, 1998.
Trouver le texte intégralPrima che faccia buio. Venezia : Marsilio, 2005.
Trouver le texte intégralCapitani, Paola, dir. Il controllo terminologico delle risorse elettroniche in rete. Florence : Firenze University Press, 2001. http://dx.doi.org/10.36253/88-8453-008-3.
Texte intégralVígh, Eva. "Il costume che appare nella faccia" : Fisiognomica e letteratura italiana. Roma : Aracne, 2014.
Trouver le texte intégralBirattari, Massimo. È più facile scrivere bene che scrivere male : Corso di sopravvivenza. Milano : Ponte alle Grazie, 2011.
Trouver le texte intégralManzini, Gianna. Lettere a Giuseppe Dessí e a Luisa. Sous la direction de Alberto Baldi. Florence : Firenze University Press, 2020. http://dx.doi.org/10.36253/978-88-6453-923-2.
Texte intégral"Ne la faccia che a Cristo/più si somiglia" : La poesia mariana di Dante. Cosenza - Italy : Luigi Pellegrini editore, 2017.
Trouver le texte intégralTabacco, Giovanni. La relazione fra i concetti di potere temporale e di potere spirituale nella tradizione cristiana fino al secolo XIV. Sous la direction de Laura Gaffuri. Florence : Firenze University Press, 2011. http://dx.doi.org/10.36253/978-88-8453-995-3.
Texte intégralGensini, Gian Franco, et Augusto Zaninelli, dir. Progetto RIARTE. Florence : Firenze University Press, 2015. http://dx.doi.org/10.36253/978-88-6655-906-1.
Texte intégralCaproni, Giorgio. Il mondo ha bisogno dei poeti. Sous la direction de Melissa Rota. Florence : Firenze University Press, 2014. http://dx.doi.org/10.36253/978-88-6655-677-0.
Texte intégralChapitres de livres sur le sujet "Facial cue"
Chen, Jingying, et Bernard Tiddeman. « Multi-cue Facial Feature Detection and Tracking ». Dans Lecture Notes in Computer Science, 356–67. Berlin, Heidelberg : Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-69905-7_41.
Texte intégralSu, Congyong, Hong Zhou et Li Huang. « Multiple Facial Feature Tracking Using Multi-cue Based Prediction Model ». Dans Articulated Motion and Deformable Objects, 214–26. Berlin, Heidelberg : Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-30074-8_21.
Texte intégralKlien, Michael, et Maria Salvetti. « CEE ». Dans Facing the Challenges of Water Governance, 259–89. Cham : Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-98515-2_10.
Texte intégralAnchlia, Sonal. « Temporomandibular Joint Ankylosis ». Dans Oral and Maxillofacial Surgery for the Clinician, 1401–34. Singapore : Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-1346-6_65.
Texte intégralMani, Varghese. « Orthognathic Surgery for Mandible ». Dans Oral and Maxillofacial Surgery for the Clinician, 1477–512. Singapore : Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-15-1346-6_68.
Texte intégralFang, Xianming. « Chapter 4. Multimodality in refusals in English as a lingua franca ». Dans Multimodal Im/politeness, 101–29. Amsterdam : John Benjamins Publishing Company, 2023. http://dx.doi.org/10.1075/pbns.333.04fan.
Texte intégralUlz, Thomas, Jakob Ludwiger et Gerald Steinbauer. « A Robust and Flexible System Architecture for Facing the RoboCup Logistics League Challenge ». Dans RoboCup 2018 : Robot World Cup XXII, 488–99. Cham : Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-27544-0_40.
Texte intégralAndrée, Alexander. « Caue ne facias uim in tempore ! Peter Comestor and the Truth of History ». Dans Instrumenta Patristica et Mediaevalia, 515–50. Turnhout : Brepols Publishers, 2017. http://dx.doi.org/10.1484/m.ipm-eb.5.112016.
Texte intégralGualco, Carlo, Marco Grattarola, Alberto Federici, Francesco Mataloni, Karol Iždinský, F. Simančik, Bernhard Schwarz, C. García-Rosales et I. López-Galilea. « Brazing Technology for Plasma Facing Components in Nuclear Fusion Applications Using Low and Graded CTE Interlayers ». Dans Advanced Materials Research, 192–97. Stafa : Trans Tech Publications Ltd., 2008. http://dx.doi.org/10.4028/3-908454-01-8.192.
Texte intégralTóth, Judit, Éva Szirmai, Norbert Merkovity et Tamás Pongó. « Promising or Compelling Future in Hungary ? » Dans Young Adults and Active Citizenship, 121–38. Cham : Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-65002-5_7.
Texte intégralActes de conférences sur le sujet "Facial cue"
Guan, Yepeng. « Robust Eye Detection from Facial Image based on Multi-cue Facial Information ». Dans 2007 IEEE International Conference on Control and Automation. IEEE, 2007. http://dx.doi.org/10.1109/icca.2007.4376666.
Texte intégralAli, Tauseef, et Intaek Kim. « An improved eye localization algorithm using multi-cue facial information ». Dans 2009 2nd International Conference on Computer, Control and Communication (IC$). IEEE, 2009. http://dx.doi.org/10.1109/ic4.2009.4909266.
Texte intégralLi, Songjiang, Wen Cui, Jinshi Cui, Li Wang, Ming Li et Hongbin Zha. « Improving Children's Gaze Prediction via Separate Facial Areas and Attention Shift Cue ». Dans 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017). IEEE, 2017. http://dx.doi.org/10.1109/fg.2017.92.
Texte intégralRupasinghe, A. T., N. L. Gunawardena, S. Shujan et D. A. S. Atukorale. « Scaling personality traits of interviewees in an online job interview by vocal spectrum and facial cue analysis ». Dans 2016 Sixteenth International Conference on Advances in ICT for Emerging Regions (ICTer). IEEE, 2016. http://dx.doi.org/10.1109/icter.2016.7829933.
Texte intégralLang, Christian, Sven Wachsmuth, Heiko Wersing et Marc Hanheide. « Facial expressions as feedback cue in human-robot interaction—a comparison between human and automatic recognition performances ». Dans 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR Workshops). IEEE, 2010. http://dx.doi.org/10.1109/cvprw.2010.5543264.
Texte intégralYap, Moi Hoon, Bashar Rajoub, Hassan Ugail et Reyer Zwiggelaar. « Visual cues of facial behaviour in deception detection ». Dans 2011 IEEE International Conference on Computer Applications and Industrial Electronics (ICCAIE). IEEE, 2011. http://dx.doi.org/10.1109/iccaie.2011.6162148.
Texte intégralSuzuki, K., Y. Takeuchi et J. Heo. « THE EFFECT OF LIGHTING ENVIRONMENT ON FACIAL EXPRESSION PERCEPTION IN VIDEO TELECONFERENCING ». Dans CIE 2021 Conference. International Commission on Illumination, CIE, 2021. http://dx.doi.org/10.25039/x48.2021.op51.
Texte intégralLi, Tianyu, et Biao Yang. « NEW EMPIRICAL DATA FOR PEDESTRIAN LIGHTING EFFECT ON RECOGNITION ABILITY ON REAL 3D FACIAL EXPRESSION ». Dans CIE 2018. International Commission on Illumination, CIE, 2018. http://dx.doi.org/10.25039/x45.2018.op18.
Texte intégralThomas, Chinchu, et Dinesh Babu Jayagopi. « Predicting student engagement in classrooms using facial behavioral cues ». Dans ICMI '17 : INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION. New York, NY, USA : ACM, 2017. http://dx.doi.org/10.1145/3139513.3139514.
Texte intégralWilliamson, James R., Elizabeth Godoy, Miriam Cha, Adrianne Schwarzentruber, Pooya Khorrami, Youngjune Gwon, Hsiang-Tsung Kung, Charlie Dagli et Thomas F. Quatieri. « Detecting Depression using Vocal, Facial and Semantic Communication Cues ». Dans MM '16 : ACM Multimedia Conference. New York, NY, USA : ACM, 2016. http://dx.doi.org/10.1145/2988257.2988263.
Texte intégralRapports d'organisations sur le sujet "Facial cue"
Qiao, Baoyun, Xiaoqin Fan, Hanif Rahemtulla, Hans van Rijn et Lina Li. Critical Issues for Fiscal Reform in the People’s Republic of China Part 1 : Revenue and Expenditure Management. Asian Development Bank, décembre 2022. http://dx.doi.org/10.22617/wps220575-2.
Texte intégralHenkin, Samuel. Dynamic Dimensions of Radicalization and Violent Extremism in Sabah, Malaysia. RESOLVE Network, décembre 2021. http://dx.doi.org/10.37805/pn2021.25.sea.
Texte intégralZuccarelli, N., C. M. Lesher, M. G. Houlé et S. J. Barnes. Variations in the textural facies of sulphide minerals in the Eagle's Nest Ni-Cu-(PGE) deposit, McFaulds Lake greenstone belt, Superior Province, Ontario : insights from microbeam scanning energy-dispersive X-ray fluorescence spectrometry. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 2020. http://dx.doi.org/10.4095/326895.
Texte intégralKuster, K., C. M. Lesher et M. G. Houlé. Geology and geochemistry of mafic and ultramafic bodies in the Shebandowan mine area, Wawa-Abitibi terrane : implications for Ni-Cu-(PGE) and Cr-(PGE) mineralization, Ontario and Quebec. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/329394.
Texte intégralSavosko, V., I. Komarova, Yu Lykholat, E. Yevtushenko et T. Lykholat. Predictive model of heavy metals inputs to soil at Kryvyi Rih District and its use in the training for specialists in the field of Biology. IOP Publishing, 2021. http://dx.doi.org/10.31812/123456789/4511.
Texte intégralСавосько, Василь Миколайович, Ірина Олександрівна Комарова, Юрій Васильович Лихолат, Едуард Олексійович Євтушенко, et Тетяна Юріївна Лихолат. Predictive Model of Heavy Metals Inputs to Soil at Kryvyi Rih District and its Use in the Training for Specialists in the Field of Biology. IOP Publishing, 2021. http://dx.doi.org/10.31812/123456789/4266.
Texte intégralCorriveau, L., J. F. Montreuil, O. Blein, E. Potter, M. Ansari, J. Craven, R. Enkin et al. Metasomatic iron and alkali calcic (MIAC) system frameworks : a TGI-6 task force to help de-risk exploration for IOCG, IOA and affiliated primary critical metal deposits. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329093.
Texte intégralRousseau, Henri-Paul. Gutenberg, L’université et le défi numérique. CIRANO, décembre 2022. http://dx.doi.org/10.54932/wodt6646.
Texte intégralNorelli, John L., Moshe Flaishman, Herb Aldwinckle et David Gidoni. Regulated expression of site-specific DNA recombination for precision genetic engineering of apple. United States Department of Agriculture, mars 2005. http://dx.doi.org/10.32747/2005.7587214.bard.
Texte intégralFacial Cues : Can We Judge Who Looks Like a Leader ? IEDP Ideas for Leaders, janvier 2015. http://dx.doi.org/10.13007/479.
Texte intégral