Dissertations / Theses on the topic 'Gaze'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Gaze.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.
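The 'Add to bibliography' behaviour described above amounts to filling a per-style template from a record's metadata. A minimal sketch of the idea (the function name, field names, and exact punctuation here are illustrative, not the site's actual implementation):

```python
def format_reference(author, year, title, institution, url, style="APA"):
    """Render one thesis record in a requested citation style (simplified templates)."""
    if style == "APA":
        return f"{author} ({year}). {title} [Thesis, {institution}]. {url}"
    if style == "MLA":
        return f'{author}. "{title}." Thesis, {institution}, {year}. {url}.'
    raise ValueError(f"unsupported style: {style}")
```

For example, `format_reference("Wood, Erroll William", 2017, "Gaze estimation with graphics", "University of Cambridge", "https://www.repository.cam.ac.uk/handle/1810/267905")` yields an APA-like string; real style guides add many rules (editions, retrieval dates, italics) beyond this toy template.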
Ilic, Sasa <1987>. "The Duel of the Gazes: Male Gaze on Women vs Female Self-Gaze in Carver and Altman." Master's Degree Thesis, Università Ca' Foscari Venezia, 2013. http://hdl.handle.net/10579/3744.
Edwards, Stephen Gareth. "Social orienting in gaze-based interactions : consequences of joint gaze." Thesis, University of East Anglia, 2015. https://ueaeprints.uea.ac.uk/59591/.
Bergeron, André 1967. "Multiple-step gaze shifts reveal gaze position error in brainstem." Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=82831.
Li, Anying. "Learning driver gaze." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/119533.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 65-69).
Driving is a singularly complex task that humans manage to perform successfully day in and day out, guided only by what their eyes can see. Given how prevalent, complex, and dangerous driving is, it is surprising how little we understand about how drivers actually use vision to drive. The release of DrEyeVe [1], a large-scale driving dataset with eye-tracking data, makes analyzing the role of vision feasible. In this thesis, we 1) study the impact of various external features on driver attention, and 2) present a two-path deep-learning model that exploits both static and dynamic information to model driver gaze. Our model shows promising results against state-of-the-art saliency models, especially on sequences where the driver is not simply looking straight ahead at the road. This model enables us to estimate important regions that the driver should be aware of, and could allow an automatic driving assistant to alert drivers to hazards on the road they have not yet seen.
by Anying Li (M. Eng.).
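As a rough illustration of the two-path idea in the abstract above (a toy sketch, not the thesis's learned model: the branch functions, the center-bias prior, and the fusion weight `alpha` are all invented here), a static prior and a motion cue can be fused into a normalized gaze probability map:

```python
import numpy as np

def static_branch(frame):
    # Hand-crafted center-bias prior standing in for a learned static path:
    # drivers' gaze concentrates near the image center (an assumption here).
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sigma = min(h, w) / 4.0
    return np.exp(-((ys - h / 2.0) ** 2 + (xs - w / 2.0) ** 2) / (2.0 * sigma ** 2))

def dynamic_branch(prev_frame, frame):
    # Frame differencing standing in for a learned temporal/motion path.
    return np.abs(frame - prev_frame)

def predict_gaze_map(prev_frame, frame, alpha=0.5):
    # Fuse the two paths and normalize to a probability map over pixels.
    fused = alpha * static_branch(frame) + (1.0 - alpha) * dynamic_branch(prev_frame, frame)
    return fused / fused.sum()
```

In a real model both branches would be learned networks; the point of the sketch is only the structure (two information paths, late fusion, normalized saliency output).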
Wood, Erroll William. "Gaze estimation with graphics." Thesis, University of Cambridge, 2017. https://www.repository.cam.ac.uk/handle/1810/267905.
Won, Cassandra L. "(Un)Focusing the Gaze." Scholarship @ Claremont, 2014. http://scholarship.claremont.edu/scripps_theses/343.
Gafny, Tal. "Pools / Dreams / Parental Gaze." VCU Scholars Compass, 2014. http://scholarscompass.vcu.edu/etd/3482.
Dubrovsky, Alexander Sasha. "Gaze, eye, and head movement dynamics during closed- and open-loop gaze pursuit." Thesis, McGill University, 2000. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=31222.
Ide, Ichiro, Kenji Yamashiro, Daisuke Deguchi, Tomokazu Takahashi, Hiroshi Murase, Kazunori Higuchi, and Takashi Naito. "Automatic calibration of an in-vehicle gaze tracking system using driver's typical gaze behavior." IEEE, 2009. http://hdl.handle.net/2237/13967.
Beckmann, Jeffery Linn. "Single camera 3D gaze determination." College Station, Tex.: Texas A&M University, 2007. http://hdl.handle.net/1969.1/ETD-TAMU-1247.
Holm, Linus. "Gaze control in episodic memory." Licentiate thesis, Umeå University, Department of Psychology, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-14733.
The role of gaze control in episodic recognition was investigated in two studies. In Study 1, participants encoded human faces inverted or upright, with or without eye movements (Experiment 1), and under sorting or rating tasks (Experiment 2), respectively. At test, participants indicated their recollective experience with R(emember) responses (explicit recollection) or K(now) responses (familiarity-based recognition). Experiment 1 showed that face inversion and occlusion of eye movements reduced levels of explicit recollection as measured by R responses. In Experiment 2, the relation between recollective experience and perceptual reinstatement was examined. Whereas the study instructions produced no differences in terms of eye movements, R responses were associated with a higher proportion of refixations than K responses.

In Study 2, perceptual consistency was investigated in two experiments. In Experiment 1, participants studied scenes under different concurrent tasks. Subsequently, their recognition memory was examined in an R/K test. Executive load produced parallel effects on eye movements and R responses. Furthermore, R responses were associated with a higher proportion of refixations than K responses. However, the number of fixations was correlated with refixations. Experiment 2 corroborated these results and controlled for number of fixations.

Together, these studies suggest that visual episodic representations are supported by perceptual detail, and that explicit recollection is a function of encoding and retrieving those details. To this end, active gaze control is an important factor in visual recognition.
Linn, Andreas. "Gaze Teleportation in Virtual Reality." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-216585.
This document reports preliminary results on gaze teleportation, a locomotion interaction for virtual reality in which the user can press a button and teleport to the point they are looking at. The results can help future application developers design intuitive locomotion interfaces so that users can more easily move through virtual worlds larger than their play area. In a study with 12 participants, gaze teleportation was compared with the conventional hand-controller method. Participants played part of Valve's The Lab using an HTC Vive and a Tobii Eyetracker; half of the participants completed the set tasks with gaze teleportation, and the other half used the hand method. Using Likert questions, they then rated their experience in terms of enjoyment, frustration, effort, distance, occlusion, and motion sickness. After answering the questions, the participants tried both methods and were interviewed about their preferences and opinions. Our results suggest that gaze teleportation is a pleasant, fast, intuitive, and natural locomotion interaction that performs on par with the hand method but is preferred by users when given the choice. We conclude that gaze teleportation is well suited to applications in which users are expected to move in the same direction as their focus.
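The core of the gaze-teleportation interaction described above, moving the user to the point they are looking at, reduces to a ray-plane intersection. A minimal sketch, assuming a flat ground plane and a gaze ray supplied by the eye tracker (the function name and plane model are invented here, not taken from the thesis):

```python
import numpy as np

def gaze_teleport_target(eye_pos, gaze_dir, ground_y=0.0):
    """Intersect the gaze ray with the horizontal ground plane y = ground_y.

    Returns the teleport destination as a 3-vector, or None when the user is
    looking level or upward (no ground intersection in front of them).
    """
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    if gaze_dir[1] >= 0.0:
        return None
    t = (ground_y - eye_pos[1]) / gaze_dir[1]   # ray parameter at the plane
    return np.asarray(eye_pos, dtype=float) + t * gaze_dir
```

On a button press, the application would raycast with the current gaze direction and, if a target exists, move the play area there; a production system would also clamp the maximum distance and raycast against actual level geometry rather than a single plane.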
Anderson, Nicola Christine Cole. "Motion cues enhance gaze processing." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/42922.
Lenz, Alexander. "Cerebellum inspired robotic gaze control." Thesis, University of the West of England, Bristol, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557412.
Fjellström, Jonatan. "Gaze Interaction in Modern Trucks." Thesis, Linköpings universitet, Interaktiva och kognitiva system, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-114284.
Mollenbach, Emilie. "Selection strategies in gaze interaction." Thesis, Loughborough University, 2010. https://dspace.lboro.ac.uk/2134/8101.
Ricciardelli, Paola. "Gaze perception and social attention." Thesis, University College London (University of London), 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.342292.
Cai, Haibin. "Gaze estimation in unconstrained environments." Thesis, University of Portsmouth, 2018. https://researchportal.port.ac.uk/portal/en/theses/gaze-estimation-in-unconstrained-environments(5c391e0b-4026-4415-a1e1-8995b622d246).html.
Hipiny, Irwandi. "Egocentric activity recognition using gaze." Thesis, University of Bristol, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.682564.
Farid, Mohsen Mohamed. "Eye-gaze : modelling and applications." Thesis, Queen's University Belfast, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.673850.
Kumar, Manu. "Gaze-enhanced user interface design." May be available electronically, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.
Hall, Courtney D. "Efficacy of Gaze Stability Exercises." Digital Commons @ East Tennessee State University, 2014. https://dc.etsu.edu/etsu-works/582.
Huterer, Marko. "Characterization of vestibulo-ocular reflex dynamics : responses to head perturbations during gaze stabilization versus gaze redirection." Thesis, McGill University, 2001. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=33004.
Lööf, Jenny. "An Inquisitive Gaze: Exploring the Male Gaze and the Portrayal of Gender in Dragon Age: Inquisition." Thesis, Stockholms universitet, Engelska institutionen, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-117976.
Tong, Irene Go. "Eye gaze tracking in surgical robotics." Thesis, University of British Columbia, 2017. http://hdl.handle.net/2429/62845.
Faculty of Applied Science; Department of Electrical and Computer Engineering; Graduate.
Ferguson, Sarah Alexandra. "Fracturing the gaze in Approaching Zanzibar." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp01/MQ38560.pdf.
Okamoto-Barth, Sanae. "Gaze processing in chimpanzees and humans." Maastricht: Universitaire Pers Maastricht; University Library, Maastricht University, 2005. http://arno.unimaas.nl/show.cgi?fid=6377.
Reeves, Allison Hillary. "Disrupting the gaze : a film cooperative." Thesis, Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/23098.
Stellmach, Sophie. "Gaze-supported Multimodal Interaction." München: Verlag Dr. Hut, 2013. http://d-nb.info/1045125547/34.
Desanghere, Loni. "Gaze strategies in perception and action." Experimental Brain Research, 2011. http://hdl.handle.net/1993/17898.
Cohanim, Samira. "A Glance at the Male Gaze." Scholarship @ Claremont, 2015. http://scholarship.claremont.edu/scripps_theses/513.
Wilmut, Kate. "Gaze, attention and coordination in children." Thesis, University of Reading, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.427811.
Kaymak, Sertan. "Real-time appearance-based gaze tracking." Thesis, Queen Mary, University of London, 2015. http://qmro.qmul.ac.uk/xmlui/handle/123456789/8949.
Nunez-Varela, Jose Ignacio. "Gaze control for visually guided manipulation." Thesis, University of Birmingham, 2013. http://etheses.bham.ac.uk//id/eprint/4444/.
Al-Sader, Mohamed. "Gaze-driven interaction in video games." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-156718.
Williams, Sara. "The maternal gaze in the Gothic." Thesis, University of Hull, 2011. http://hydra.hull.ac.uk/resources/hull:6756.
Kurauchi, Andrew Toshiaki Nakayama. "EyeSwipe: text entry using gaze paths." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-03072018-151733/.
People with severe motor disabilities can communicate using eye movements, aided by a virtual keyboard and an eye tracker. Gaze-based text entry also benefits users immersed in virtual or augmented reality, when they have no access to a physical keyboard or touchscreen. Thus both users with and without disabilities can benefit from the ability to enter text by gaze. However, gaze-based text entry methods are typically slow and uncomfortable. In this thesis we propose EyeSwipe as a further step toward fast and comfortable gaze-based text entry. EyeSwipe maps gaze gestures to words, similarly to how finger movements on a touchscreen are used in swipe-based methods. A gaze gesture differs from a finger gesture in that it has no clearly defined start and end positions. To segment the gaze gesture from the continuous stream of gaze data, EyeSwipe requires the user to explicitly indicate its beginning and end. The user can quickly glance at the vicinity of the other characters that make up the word. Candidate words are ranked based on the gaze gesture and presented to the user. We discuss two versions of EyeSwipe. EyeSwipe 1 uses a deterministic gaze gesture called Reverse Crossing to select both the first and the last letter of the word. Building on the lessons learned while developing and testing EyeSwipe 1, we proposed EyeSwipe 2. The user issues commands to the interface by switching focus between regions of the keyboard. In a text entry experiment comparing EyeSwipe 2 with EyeSwipe 1, 11 participants reached an average entry rate of 12.58 words per minute (wpm) with EyeSwipe 1 and 14.59 wpm with EyeSwipe 2 after using each method for 75 minutes.
The maximum text entry rates achieved with EyeSwipe 1 and EyeSwipe 2 were 21.27 wpm and 32.96 wpm, respectively. Participants considered EyeSwipe 2 more comfortable and faster, but less accurate, than EyeSwipe 1. Moreover, with EyeSwipe 2 we proposed using gaze-gesture data to adjust the gaze estimation dynamically. Using data obtained in the experiment, we show that gaze gestures can be used to improve the estimation dynamically during interaction.
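The candidate-ranking step of a gesture keyboard like the one described above can be sketched by comparing the recorded gaze path against each word's ideal path through its key centers. This is an illustrative toy (the one-row `KEY_POS` layout, resampling scheme, and distance score are invented here, not EyeSwipe's actual algorithm):

```python
import numpy as np

# Toy one-row "keyboard": each letter at x = its alphabet index (illustrative only).
KEY_POS = {c: (float(i), 0.0) for i, c in enumerate("abcdefghijklmnopqrstuvwxyz")}

def resample(path, n=20):
    """Resample a 2-D polyline to n points evenly spaced by arc length."""
    path = np.asarray(path, dtype=float)
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    d = np.concatenate([[0.0], np.cumsum(seg)])
    if d[-1] == 0.0:                      # degenerate path (single key)
        return np.repeat(path[:1], n, axis=0)
    t = np.linspace(0.0, d[-1], n)
    return np.column_stack([np.interp(t, d, path[:, 0]), np.interp(t, d, path[:, 1])])

def rank_candidates(gaze_path, lexicon, n=20):
    """Order lexicon words by mean distance between the gaze path and each word's key path."""
    g = resample(gaze_path, n)
    scored = []
    for word in lexicon:
        ideal = resample([KEY_POS[c] for c in word], n)
        scored.append((float(np.mean(np.linalg.norm(g - ideal, axis=1))), word))
    return [word for _, word in sorted(scored)]
```

A real system would use a probabilistic layout model and a large frequency-weighted lexicon, but the shape of the computation (segment the gesture, normalize it, score it against per-word templates) is the same.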
MacDonald, R. G. "Gaze cues and language in communication." Thesis, University of Dundee, 2014. https://discovery.dundee.ac.uk/en/studentTheses/476122c4-9264-44aa-8f08-c70f6dbb14d8.
Pfeuffer, Ken. "Extending touch with eye gaze input." Thesis, Lancaster University, 2017. http://eprints.lancs.ac.uk/89076/.
Alanenpää, Madelene. "Gaze detection in human-robot interaction." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-428387.
Safavi, Safoura. "The Inner Gaze In Artistic Practice." Thesis, Stockholms konstnärliga högskola, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uniarts:diva-934.
Maclean, Coinneach. "The 'Tourist Gaze' on Gaelic Scotland." Thesis, University of Glasgow, 2014. http://theses.gla.ac.uk/5178/.
Wakefield, Steve. "Carpentier's baroque fiction : returning Meduza's gaze." Woodbridge: Tamesis, 2004. http://catalogue.bnf.fr/ark:/12148/cb39927077v.
Fitzgerald, Aimee. "Photography as Gaze, Painting as Caress." Thesis, The University of Sydney, 2016. http://hdl.handle.net/2123/15952.
Ali, Asad. "Biometric liveness detection using gaze information." Thesis, University of Kent, 2015. https://kar.kent.ac.uk/50524/.
Wang, Haochen. "Gaze-Based Biometrics: Some Case Studies." Doctoral thesis, Università degli studi di Pavia, 2018. http://hdl.handle.net/11571/1280026.
Wiklund, Alexis. "The Male Gaze som retoriskt verktyg: En utredande litteraturstudie över hur the Male Gaze kan användas inom retorikvetenskapen." Thesis, Södertörns högskola, Institutionen för kultur och lärande, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-22981.
Weaver, Joseph S. "High working memory capacity predicts negative gaze but high self-esteem predicts positive gaze following ego threat." Case Western Reserve University School of Graduate Studies / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=case1307144564.