Academic literature on the topic 'Eye tracking glasses'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Eye tracking glasses.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Eye tracking glasses"

1

Ogasawara, Tomohito, Ryogo Horiuchi, Yasuto Tanaka, and Norihisa Miki. "Eye-tracking system for reverse glasses." Proceedings of JSME Annual Conference on Robotics and Mechatronics (Robomec) 2017 (2017): 1A1–K12. http://dx.doi.org/10.1299/jsmermd.2017.1a1-k12.

2

Ehinger, Benedikt V., Katharina Groß, Inga Ibs, and Peter König. "A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000." PeerJ 7 (July 9, 2019): e7086. http://dx.doi.org/10.7717/peerj.7086.

Abstract:
Eye-tracking experiments rely heavily on good data quality of eye-trackers. Unfortunately, it is often the case that only the spatial accuracy and precision values are available from the manufacturers. These two values alone are not sufficient to serve as a benchmark for an eye-tracker: eye-tracking quality deteriorates during an experimental session due to head movements, changing illumination or calibration decay. Additionally, different experimental paradigms require the analysis of different types of eye movements; for instance, smooth pursuit movements, blinks or microsaccades, which themselves cannot readily be evaluated by using spatial accuracy or precision alone. To obtain a more comprehensive description of properties, we developed an extensive eye-tracking test battery. In 10 different tasks, we evaluated eye-tracking related measures such as: the decay of accuracy, fixation durations, pupil dilation, smooth pursuit movement, microsaccade classification, blink classification, or the influence of head motion. For some measures, true theoretical values exist. For others, a relative comparison to a reference eye-tracker is needed. Therefore, we collected our gaze data simultaneously from a remote EyeLink 1000 eye-tracker as the reference and compared it with the mobile Pupil Labs glasses. As expected, the average spatial accuracy of 0.57° for the EyeLink 1000 eye-tracker was better than the 0.82° for the Pupil Labs glasses (N = 15). Furthermore, we classified fewer fixations and shorter saccade durations for the Pupil Labs glasses. Similarly, we found fewer microsaccades using the Pupil Labs glasses. The accuracy over time decayed only slightly for the EyeLink 1000, but strongly for the Pupil Labs glasses. Finally, we observed that the measured pupil diameters differed between eye-trackers on the individual subject level but not on the group level. To conclude, our eye-tracking test battery offers 10 tasks that allow us to benchmark the many parameters of interest in stereotypical eye-tracking situations and addresses a common source of confounds in measurement errors (e.g., yaw and roll head movements). All recorded eye-tracking data (including Pupil Labs’ eye videos), the stimulus code for the test battery, and the modular analysis pipeline are freely available (https://github.com/behinger/etcomp).
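The two headline metrics discussed here, spatial accuracy and precision, are simple to compute from raw gaze samples. Below is a minimal sketch, not the authors' published pipeline: accuracy as the mean angular offset from a known target, precision as RMS sample-to-sample deviation. The synthetic fixation data and function names are illustrative.

```python
import numpy as np

def spatial_accuracy_deg(gaze_deg, target_deg):
    """Mean angular offset between gaze samples and the true target position."""
    return np.mean(np.linalg.norm(gaze_deg - target_deg, axis=1))

def spatial_precision_rms_deg(gaze_deg):
    """RMS of sample-to-sample displacement within a fixation."""
    step = np.diff(gaze_deg, axis=0)
    return np.sqrt(np.mean(np.sum(step ** 2, axis=1)))

# Synthetic fixation: 500 samples around a target at (10, 5) degrees,
# with a small systematic offset on both axes plus random scatter.
rng = np.random.default_rng(0)
target = np.array([10.0, 5.0])
gaze = target + 0.5 + rng.normal(scale=0.3, size=(500, 2))

print(f"accuracy:  {spatial_accuracy_deg(gaze, target):.2f} deg")
print(f"precision: {spatial_precision_rms_deg(gaze):.2f} deg")
```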
3

Behe, Bridget K., R. Thomas Fernandez, Patricia T. Huddleston, Stella Minahan, Kristin L. Getter, Lynnell Sage, and Allison M. Jones. "Practical Field Use of Eye-tracking Devices for Consumer Research in the Retail Environment." HortTechnology 23, no. 4 (August 2013): 517–24. http://dx.doi.org/10.21273/horttech.23.4.517.

Abstract:
Eye-tracking equipment is now affordable and portable, making it a practical instrument for consumer research. Engineered to best analyze gaze on a plane (e.g., a retail shelf), both portable eye-tracking glasses and computer monitor–mounted hardware can play key roles in analyzing merchandise displays to better understand what consumers view. Researchers and practitioners can use that information to improve the sales efficacy of displays. Eye-tracking hardware was once used nearly exclusively to investigate the reading process but can now be used for a broader range of studies, notably in retail settings. This article presents an approach to using glasses eye tracker (GET) and light eye tracker (LET) hardware for applied consumer research in the field. We outline equipment use, study construction, and data extraction, as well as benefits and limitations of the technology, drawing on several pilot studies.
4

Al-Haddad, Sara, Matthew Sears, Omar Alruwaythi, and Paul M. Goodrum. "Complexity, Performance, and Search Efficiency: An Eye-Tracking Study on Assembly-Based Tasks among Construction Workers (Pipefitters)." Buildings 12, no. 12 (December 8, 2022): 2174. http://dx.doi.org/10.3390/buildings12122174.

Abstract:
Past studies have used eye-tracking glasses to analyze people’s perception of visual stimuli, usually regarding wayfinding, safety, or visual appeal. Some industries, such as the automotive industry, studied the effects of visual stimuli on task completion. However, the architecture and construction industries have mainly conducted eye-tracking experiments with surveys or search tasks instead of performing a task. This paper uses eye-tracking glasses to analyze people’s perception of visual stimuli while completing tangible tasks that simulate real-world applications. This research studies how people look at visual stimuli that influence their ability to interpret drawings with varying degrees of complexity, assess task completion performance, and inspect how people search for information. Twenty pipefitters wore eye-tracking glasses to record their eye movement patterns while completing a model pipe spool assembly. The eye-tracking glasses and Visual Eyes software measured visit metrics, fixations, fixation durations, convex hull coverage, assembly time, rework, and errors. Unlike previous studies, convex hull areas are calculated and used to measure search efficiency. This research found that people interacted more frequently with more complex visual stimuli but did not necessarily require more time to complete a task. People with lower search efficiency visited the drawings more frequently than people with higher search efficiency. People with higher search efficiency made fewer mistakes, redid less work, and completed tasks quicker than those with lower search efficiency. Search efficiency was found to be a good predictor of task performance.
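The convex hull measure used here for search efficiency can be made concrete: take the area spanned by a participant's fixation points and relate it to the stimulus. A rough sketch follows; the paper computes its metrics with the Visual Eyes software, so the normalisation and names below are assumptions, not the authors' exact definition.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_coverage(fixations_px, stimulus_area_px2):
    """Fraction of the stimulus covered by the convex hull of fixation points."""
    if len(fixations_px) < 3:
        return 0.0  # a 2-D hull needs at least three points
    hull = ConvexHull(fixations_px)
    return hull.volume / stimulus_area_px2  # in 2-D, .volume is the hull's area

# Fixations recorded over a 1920x1080 px drawing (synthetic example).
rng = np.random.default_rng(1)
fixations = rng.uniform(low=[200, 100], high=[1700, 950], size=(40, 2))
print(f"convex hull coverage: {hull_coverage(fixations, 1920 * 1080):.1%}")
```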
5

Thibeault, Mark, Monica Jesteen, and Andrew Beitman. "Improved Accuracy Test Method for Mobile Eye Tracking in Usability Scenarios." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 63, no. 1 (November 2019): 2226–30. http://dx.doi.org/10.1177/1071181319631083.

Abstract:
Eye tracking has been used in usability testing for many years to gain objective measurements that inform label, instruction, and product design. With many different testing environments, possible participants, hardware, and software, key metrics can vary greatly. These metrics can also vary between studies, so a standardized test method, set of metrics, and calculation method are proposed in this study. The Tobii Pro Glasses 2 is a mobile eye tracker that does not significantly affect user mobility compared to other eye-trackers. This study aims to build a testing method that can be modified to better fit the varying conditions found in usability testing. The study was performed with the Tobii Pro Glasses 2; however, the test method can be used with any mobile eye-tracking unit. Even under poor testing conditions, the method yields reliable metrics that can be used to inform expectations and decisions around eye tracking. These methods are recommended to be performed prior to eye-tracking testing to determine testing-specific performance.
6

Katz, Trixie A., Danielle D. Weinberg, Claire E. Fishman, Vinay Nadkarni, Patrice Tremoulet, Arjan B. te Pas, Aleksandra Sarcevic, and Elizabeth E. Foglia. "Visual attention on a respiratory function monitor during simulated neonatal resuscitation: an eye-tracking study." Archives of Disease in Childhood - Fetal and Neonatal Edition 104, no. 3 (June 14, 2018): F259–F264. http://dx.doi.org/10.1136/archdischild-2017-314449.

Abstract:
Objective: A respiratory function monitor (RFM) may improve positive pressure ventilation (PPV) technique, but many providers do not use RFM data appropriately during delivery room resuscitation. We sought to use eye-tracking technology to identify RFM parameters that neonatal providers view most commonly during simulated PPV. Design: Mixed methods study. Neonatal providers performed RFM-guided PPV on a neonatal manikin while wearing eye-tracking glasses to quantify visual attention on displayed RFM parameters (ie, exhaled tidal volume, flow, leak). Participants subsequently provided qualitative feedback on the eye-tracking glasses. Setting: Level 3 academic neonatal intensive care unit. Participants: Twenty neonatal resuscitation providers. Main outcome measures: Visual attention: overall gaze sample percentage; total gaze duration, visit count and average visit duration for each displayed RFM parameter. Qualitative feedback: willingness to wear eye-tracking glasses during clinical resuscitation. Results: Twenty providers participated in this study. The mean gaze sample captured was 93% (SD 4%). The exhaled tidal volume waveform was the RFM parameter with the highest total gaze duration (median 23%, IQR 13–51%), highest visit count (median 5.17 per 10 s, IQR 2.82–6.16) and longest visit duration (median 0.48 s, IQR 0.38–0.81 s). All participants were willing to wear the glasses during clinical resuscitation. Conclusion: Wearable eye-tracking technology is feasible for identifying gaze fixation on the RFM display and is well accepted by providers. Neonatal providers look at exhaled tidal volume more than any other RFM parameter. Future applications of eye-tracking technology include use during clinical resuscitation.
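The visit metrics reported per RFM parameter (dwell share, visit count per 10 s, average visit duration) all derive from a stream of AOI-labelled gaze samples. A minimal sketch, with an assumed 50 Hz sample rate and made-up AOI labels, not the study's actual analysis code:

```python
import numpy as np

def visit_metrics(aoi_labels, sample_rate_hz, aoi):
    """Dwell share, visits per 10 s, and mean visit duration for one AOI,
    given one AOI label (or None) per gaze sample."""
    hits = np.array([label == aoi for label in aoi_labels])
    total_s = hits.size / sample_rate_hz
    # A visit starts at every hit sample whose predecessor was not a hit.
    starts = np.flatnonzero(hits & ~np.concatenate(([False], hits[:-1])))
    n_visits = len(starts)
    dwell_share = hits.mean()
    visits_per_10s = 10.0 * n_visits / total_s
    mean_visit_s = hits.sum() / sample_rate_hz / max(n_visits, 1)
    return dwell_share, visits_per_10s, mean_visit_s

# 2 s of 50 Hz samples: two visits to the tidal-volume waveform ("Vt").
labels = ["Vt"] * 30 + [None] * 20 + ["Vt"] * 10 + ["flow"] * 40
print(visit_metrics(labels, sample_rate_hz=50, aoi="Vt"))
```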
7

Niehorster, Diederick C., Thiago Santini, Roy S. Hessels, Ignace T. C. Hooge, Enkelejda Kasneci, and Marcus Nyström. "The impact of slippage on the data quality of head-worn eye trackers." Behavior Research Methods 52, no. 3 (January 2, 2020): 1140–60. http://dx.doi.org/10.3758/s13428-019-01307-0.

Abstract:
Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant’s head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs’ Pupil in 3D mode, and (iv) Pupil-Labs’ Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.
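The "increase in gaze deviation over baseline" used to quantify slippage has a direct geometric reading: the mean angle between measured gaze directions and the known fixation direction, compared across conditions. A sketch on synthetic 3D gaze vectors (all numbers illustrative, not the study's data):

```python
import numpy as np

def mean_deviation_deg(gaze_vecs, target_vec):
    """Mean angle (degrees) between unit gaze vectors and the fixated target."""
    g = gaze_vecs / np.linalg.norm(gaze_vecs, axis=1, keepdims=True)
    t = target_vec / np.linalg.norm(target_vec)
    return np.degrees(np.arccos(np.clip(g @ t, -1.0, 1.0))).mean()

rng = np.random.default_rng(2)
target = np.array([0.0, 0.0, 1.0])                    # fixated point straight ahead
noise = rng.normal(scale=0.003, size=(200, 3))
baseline = target + noise                             # quiet fixation
slipped = target + noise + np.array([0.02, 0.01, 0])  # same fixation after slippage

increase = mean_deviation_deg(slipped, target) - mean_deviation_deg(baseline, target)
print(f"gaze deviation increase over baseline: {increase:.2f} deg")
```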
8

Golard, Andre, and Sachin S. Talathi. "Ultrasound for Gaze Estimation—A Modeling and Empirical Study." Sensors 21, no. 13 (June 30, 2021): 4502. http://dx.doi.org/10.3390/s21134502.

Abstract:
Most eye tracking methods are light-based. As such, they can suffer from ambient light changes when used outdoors, especially for use cases where eye trackers are embedded in Augmented Reality glasses. It has been recently suggested that ultrasound could provide a low power, fast, light-insensitive alternative to camera-based sensors for eye tracking. Here, we report on our work on modeling ultrasound sensor integration into a glasses form factor AR device to evaluate the feasibility of estimating eye-gaze in various configurations. Next, we designed a benchtop experimental setup to collect empirical data on time of flight and amplitude signals for reflected ultrasound waves for a range of gaze angles of a model eye. We used this data as input for a low-complexity gradient-boosted tree machine learning regression model and demonstrate that we can effectively estimate gaze (gaze RMSE error of 0.965 ± 0.178 degrees with an adjusted R2 score of 90.2 ± 4.6).
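As a rough illustration of the regression step, the sketch below trains a gradient-boosted tree on synthetic time-of-flight and amplitude features. The feature construction is an assumption standing in for the real benchtop measurements, not the authors' data or model configuration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in: per-transducer time-of-flight and amplitude features
# that vary smoothly with the horizontal gaze angle of a model eye.
rng = np.random.default_rng(3)
n_samples, n_sensors = 2000, 6
gaze_deg = rng.uniform(-20, 20, n_samples)
tof = 50.0 + 0.1 * np.outer(gaze_deg, rng.uniform(0.5, 1.5, n_sensors))
amp = 1.0 - 0.01 * np.abs(np.outer(gaze_deg, rng.uniform(0.5, 1.5, n_sensors)))
X = np.hstack([tof, amp]) + rng.normal(scale=0.05, size=(n_samples, 2 * n_sensors))

X_tr, X_te, y_tr, y_te = train_test_split(X, gaze_deg, random_state=0)
model = GradientBoostingRegressor(n_estimators=200, max_depth=3).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"gaze RMSE: {np.sqrt(mean_squared_error(y_te, pred)):.3f} deg, "
      f"R^2: {r2_score(y_te, pred):.3f}")
```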
9

Kortman, Brenton. "Assessing for Hemi-Spatial Neglect Using Eye Tracking Glasses." Archives of Physical Medicine and Rehabilitation 96, no. 10 (October 2015): e24. http://dx.doi.org/10.1016/j.apmr.2015.08.074.

10

Law, Brenda Hiu Yan, Po-Yin Cheung, Michael Wagner, Sylvia van Os, Bin Zheng, and Georg Schmölzer. "Analysis of neonatal resuscitation using eye tracking: a pilot study." Archives of Disease in Childhood - Fetal and Neonatal Edition 103, no. 1 (August 19, 2017): F82–F84. http://dx.doi.org/10.1136/archdischild-2017-313114.

Abstract:
Background: Visual attention (VA) is important for situation awareness and decision-making. Eye tracking can be used to analyse the VA of healthcare providers. No study has examined eye tracking during neonatal resuscitation. Objective: To test the use of eye tracking to examine VA during neonatal resuscitation. Methods: Six video recordings were obtained using eye tracking glasses worn by resuscitators during the first 5 min of neonatal resuscitation. Videos were analysed to obtain (i) areas of interest (AOIs), (ii) time spent on each AOI and (iii) frequency of saccades between AOIs. Results: Five videos were of acceptable quality and analysed. Only 35% of VA was directed at the infant, with 33% at patient monitors and gauges. There were frequent saccades (0.45/s) and most involved patient monitors. Conclusion: During neonatal resuscitation, VA is often directed away from the infant towards patient monitors. Eye tracking can be used to analyse human performance during neonatal resuscitation.
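Metric (iii), the frequency of saccades between AOIs, reduces to counting transitions between consecutive distinct AOI labels in the fixation sequence. A small sketch (the AOI names are illustrative, not the study's coding scheme):

```python
from collections import Counter

def aoi_transitions(fixation_aois):
    """Count gaze shifts between consecutive, distinct AOIs."""
    pairs = zip(fixation_aois, fixation_aois[1:])
    return Counter((a, b) for a, b in pairs if a != b)

sequence = ["infant", "monitor", "infant", "gauge", "monitor", "infant"]
for (src, dst), n in aoi_transitions(sequence).items():
    print(f"{src} -> {dst}: {n}")
```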

Dissertations / Theses on the topic "Eye tracking glasses"

1

Dahlberg, Joakim. "Eye Tracking with Eye Glasses." Thesis, Umeå University, Department of Physics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-32868.

Abstract:

This study is concerned with the negative effects of wearing corrective lenses while using eye trackers, and with the correction of those negative effects. The eye-tracker technology studied is the video-based real-time Pupil Center and Corneal Reflection method. In a user study, wearing eyeglasses is shown to cause 20% greater errors in the accuracy of an eye tracker than not wearing glasses. The error is shown to depend on where on the eye tracker's viewing area the user is looking.

A model for ray refraction when wearing glasses was developed. Measurements of the distortions of the eye image caused by eyeglass lenses were carried out. The distortions were analyzed with eye-tracking software to determine their impact on the image-to-world coordinates mapping. A typical dependence of 1 mm of relative distance change on the cornea per 9 degrees of visual field was found.

The developed mathematical/physiological model for eyeglasses focuses on artifacts that existing calibration methods cannot accommodate, primarily varying combinations of viewing angles and head rotations. The main unknown in the presented model is the effective strength of the glasses; automatic identification is discussed. The model presented here is general in nature and needs to be developed further in order to become part of a specific application.
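The core of such a ray-refraction model is Snell's law applied in vector form at each lens surface. A minimal sketch, assuming a flat front surface and generic refractive indices rather than the thesis's calibrated lens geometry:

```python
import numpy as np

def refract(d, n_hat, n1, n2):
    """Refract unit ray direction d at a surface with unit normal n_hat
    (pointing toward the incoming ray), using the vector form of Snell's
    law. Returns None on total internal reflection."""
    r = n1 / n2
    cos_i = -float(np.dot(n_hat, d))
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None
    return r * d + (r * cos_i - np.sqrt(1.0 - sin2_t)) * n_hat

# A camera ray hitting the front surface of a spectacle lens at 20 degrees
# (air n = 1.0 into lens material n ~ 1.5); flat surface for simplicity.
theta = np.radians(20)
ray = np.array([0.0, np.sin(theta), -np.cos(theta)])
normal = np.array([0.0, 0.0, 1.0])
print(refract(ray, normal, n1=1.0, n2=1.5))
```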

 

2

Fredriksson, Alfred, and Joakim Wallin. "Mapping an Auditory Scene Using Eye Tracking Glasses." Thesis, Linköpings universitet, Reglerteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-170849.

Abstract:
The cocktail party problem, introduced in 1953, describes the ability to focus auditory attention in a noisy environment, epitomised by a cocktail party. An individual with normal hearing uses several cues to unmask talkers of interest; such cues are often lacking for people with hearing loss. This thesis explores the possibility of using a pair of glasses equipped with an inertial measurement unit (IMU), a monocular camera and an eye tracker to estimate an auditory scene and the attention of the person wearing the glasses. Three main areas of interest have been investigated: estimating the head orientation of the user; tracking faces in the scene; and determining the talker of interest using gaze. Implemented in a hearing aid, this solution could be used to artificially unmask talkers in a noisy environment. The head orientation of the user has been estimated with an extended Kalman filter (EKF) algorithm, using a constant velocity model and different sets of measurements: accelerometer; gyroscope; monocular visual odometry (MVO); and gaze estimated bias (GEB). An intrinsic property of IMU sensors is a drift in yaw. A method using eye data and gyroscope measurements to estimate gyroscope bias, called GEB, has been investigated. The MVO methods investigated use either optical flow to track features in succeeding frames or a key-frame approach to match features over multiple frames. Using the estimated head orientation and face detection software, faces have been tracked, since they can be assumed to be regions of interest in a cocktail party environment. A constant position EKF with a nearest neighbour approach has been used for tracking. Further, eye data retrieved from the glasses has been analyzed to investigate the relation between gaze direction and the current talker during conversations.
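The gyroscope-bias idea behind GEB can be illustrated with a toy filter: two states (yaw and gyro bias), a gyroscope-driven prediction, and an occasional absolute yaw observation standing in for visual odometry or a gaze-derived reference. A simplified linear sketch, not the thesis's full EKF:

```python
import numpy as np

def predict(x, P, gyro_rate, dt, q=1e-6):
    """Propagate [yaw, bias]: yaw integrates the bias-corrected gyro rate."""
    F = np.array([[1.0, -dt],
                  [0.0, 1.0]])
    x = F @ x + np.array([gyro_rate * dt, 0.0])
    return x, F @ P @ F.T + q * np.eye(2)

def update(x, P, yaw_obs, r=1e-2):
    """Correct the state with an absolute yaw observation."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r          # innovation covariance (1x1)
    K = P @ H.T / S              # Kalman gain (2x1)
    x = x + (K @ (yaw_obs - H @ x)).ravel()
    return x, (np.eye(2) - K @ H) @ P

true_rate, true_bias, dt = 0.1, 0.02, 0.01
x, P = np.zeros(2), np.eye(2)
for k in range(1, 1001):
    x, P = predict(x, P, gyro_rate=true_rate + true_bias, dt=dt)
    if k % 10 == 0:              # sparse absolute yaw fixes
        x, P = update(x, P, yaw_obs=true_rate * dt * k)
print(f"estimated gyro bias: {x[1]:.4f} rad/s (true {true_bias})")
```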
3

Atan, Levent. "Multi-Person Infrared Pupil Tracking for 3D TV without Glasses." Thesis, Umeå universitet, Institutionen för informatik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-62822.

Abstract:
The success of recent 3-D stereoscopic movies such as Avatar has created a lot of attention for 3-D in the home. Almost all major consumer electronics (CE) manufacturers have launched 3-D stereoscopic displays in the market. A problem with those solutions is that viewers have to wear glasses. Glasses-free autostereoscopic 3-D displays typically use lenticular lenses or barriers to create multiple views. However, these displays suffer from a number of issues: inverted views at viewing cone transitions, cross-talk between views, and the need for multi-view content. As the Philips Electronics research group, we believe that some of these issues can be reduced by using pupil tracking. In the research process, we began with an extensive literature study on people detection and tracking techniques, which helped us understand the benefits and shortcomings of different applications. In addition to the literature study, we benefited greatly from constant experimentation with prototypes and hands-on experience with a variety of digital and optical components under different conditions. As a result, we designed a multi-person infrared pupil tracker and a multi-view renderer for a 3D display to adapt the view rendering in real time according to the viewer's position. With the integration of these two applications, the 3D TV successfully adapts the center view according to the position of the viewer and provides a smooth transition while the viewer actively changes position from a notable distance under ambient illumination. However, even though the pupil tracker is implemented for multiple people, because of time limitations and the complexity of the multi-view renderer, the integrated system functions only for one person. Exploring the employed techniques and providing an in-depth description and detailed illustration of the designed applications and the conclusions drawn from the implemented system, we believe that this paper forms a substantial source of guidance and know-how for further research in the field of 3D displays and people tracking methods.
4

Larsson, Sofia, and Jimmy Åkesson. "Subtly Influencing Gaze Direction Using a Handheld Augmented Reality Device." Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-20760.

Abstract:
Smart and helpful technologies are released every year and are quick to become part of our everyday lives. Technologies are becoming more aware of when and where we need them and help us to achieve personal goals, such as taking the bike instead of the car to work. Still, many of these smart technologies have only limited functionality without us interacting with them, resulting in us having to interact with them and losing focus on other tasks. We believe that a decade from now, some future application interfaces will reside in augmented reality smart glasses. While augmented reality can be a powerful tool when overlaying the real world, we also wish that the augmented reality interfaces do not break the immersion of everyday life. In this study, we have explored the possibilities of sending visual cues to a user in a subtle or even a subliminal way in an augmented reality setting. In conclusion, there is no obvious answer to whether a cue in the system that was implemented in this study was subliminal. However, we found that the time it took to perceive a cue that gradually intensifies with time differed between people, and that people are more inclined to focus on objects placed at eye level, which should be taken into consideration when deciding where to place visual cues.
5

Lee, Tung-Ying (李侗穎). "Gaze Tracking System for User Wearing Eye Glasses." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/nr8w5d.

Abstract:
Master's thesis (academic year 105), National Chi Nan University, Department of Computer Science and Information Engineering.
Gaze tracking has become a popular human-computer interface that is widely adopted in various applications. However, most gaze tracking methods do not consider the influence of spectacle lens refraction and therefore cannot be applied to users wearing eyeglasses. In this thesis, a new glint-feature-based (GFB) method (hereafter NGFB) is proposed that solves the spectacle lens refraction problem in gaze tracking. In the NGFB method, a ray tracing technique is employed to tackle the spectacle lens refraction, making the method compatible with either spherical or aspheric spectacle lenses. With the aid of ray tracing, the curvature center of the cornea and the refracted line of sight can be evaluated accurately. In this thesis, four experiments are conducted to test the proposed method via computer simulations. First, the robustness of the NGFB method against image noise is tested with spectacle lenses of diopters -5, -3, -1, 1, 3, and 5. Second, the influence of the position of the virtual reflection plane used in the NGFB method on gaze tracking accuracy is examined. Third, the influence of spectacle lens pose error on gaze tracking accuracy is evaluated. Experimental results show that the proposed NGFB method is robust against image noise and insensitive to spectacle lens pose estimation error.
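Since a key input to such a model is the strength of the user's glasses, it is worth recalling how that strength relates to lens geometry via the thick-lens lensmaker's equation. A small sketch; the radii, thickness and refractive index below are illustrative assumptions, not values from the thesis:

```python
def lens_power_diopters(n_lens, r1_m, r2_m, thickness_m):
    """Thick-lens lensmaker's equation: optical power in diopters (1/m)."""
    return (n_lens - 1.0) * (
        1.0 / r1_m - 1.0 / r2_m
        + (n_lens - 1.0) * thickness_m / (n_lens * r1_m * r2_m)
    )

# Illustrative concave (myopic) lens: n = 1.5, 2 mm center thickness.
power = lens_power_diopters(1.5, r1_m=0.2, r2_m=0.0667, thickness_m=0.002)
print(f"effective strength: {power:.2f} D")  # roughly -5 D
```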
6

Chen, Pin-Chih (陳品志). "Attention estimation from the integrated IMU and eye-tracking perception of smart glasses." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/50219533067558386819.

Abstract:
Master's thesis (academic year 104), National Chung Cheng University, Graduate Institute of Electrical Engineering.
Wearable mobile learning is an inevitable future trend. With maturing wearable technologies, related applications have boomed in the market, but a killer application that could lead and spread wearable technologies is still lacking. At the same time, the ability to pay attention is essential to effective learning, and estimating and recording attention helps learners improve and review their learning process. In light of this, we construct an attention-estimating system from IMU information and eye-tracking perception. In this study, we combine eye-tracking perception with IMU information to realize an attention-estimating system based on Android smart glasses. First, we divided eye tracking into eyeball detection and eye-corner detection. The minimum average gray value was used to find the general eyeball contour and, to accommodate varying lighting conditions, the Otsu algorithm was used to detect the contours of the eye corners. Eye movement can then be tracked by determining the line of sight and its projection point; the resulting average angular error is 1.82 degrees. For attention recognition, the system comprises five steps: data acquisition, feature extraction, feature selection, classification, and a voting mechanism. Eye-movement and IMU data were collected by recording attention and non-attention conditions in seven different scenarios. In feature extraction, we extracted forty-one kinds of features and selected suitable ones using four different feature-selection methods. Finally, we optimized the SVM parameters with a genetic algorithm, validated reliability with k-fold cross-validation, and adopted a voting mechanism to estimate the resulting attention level. Classification accuracy for attention versus non-attention reached 86.10% across the seven scenarios, and the three- and two-category Continuous Performance Test experiments yielded 81.12% and 83.44%, respectively. Compared with the reference literature, our results indicate feasibility and reliability.
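As a sketch of the classification stage, the snippet below tunes and cross-validates an SVM on synthetic 41-dimensional features and applies a majority vote over window predictions. A grid search stands in for the thesis's genetic-algorithm optimization, and all data and names are illustrative:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for 41-dimensional eye-movement + IMU feature vectors
# with binary attention labels.
rng = np.random.default_rng(4)
X = rng.normal(size=(400, 41))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

svm = make_pipeline(StandardScaler(), SVC())
grid = {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]}
tuned = GridSearchCV(svm, grid, cv=5)
print("k-fold accuracy:", cross_val_score(tuned, X, y, cv=5).mean())

# Voting mechanism: classify several short windows, then take the majority.
tuned.fit(X[:300], y[:300])
window_preds = tuned.predict(X[300:310])
print("voted attention label:", np.bincount(window_preds).argmax())
```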

Book chapters on the topic "Eye tracking glasses"

1

Hsieh, Yi-Yu, Chia-Chen Liu, Wei-Lin Wang, and Jen-Hui Chuang. "Investigating Size Personalization for More Accurate Eye Tracking Glasses." In Computer Vision – ACCV 2016 Workshops, 239–48. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-54526-4_18.

2

Bao, Haifeng, Weining Fang, Beiyuan Guo, and Peng Wang. "Real-Time Eye-Interaction System Developed with Eye Tracking Glasses and Motion Capture." In Advances in Human Factors in Wearable Technologies and Game Design, 72–81. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-60639-2_8.

3

King, Meggan, Feiyan Hu, Joanna McHugh, Emma Murphy, Eamonn Newman, Kate Irving, and Alan F. Smeaton. "Visibility of Wearable Sensors as Measured Using Eye Tracking Glasses." In Communications in Computer and Information Science, 23–32. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-04406-4_4.

4

Chaloupka, Christine, Ralf Risser, and Elisabeth Füssl. "Evaluation of Technologies That Help to Identify Hazards for Cyclists in Cities." In Advances in Civil and Industrial Engineering, 155–71. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-5225-9932-6.ch008.

Abstract:
How do people look at sites and places and perceive details? Studies are reviewed that deal with the looking behaviour of cyclists, under the assumption that the method used could also be applied to assess the looking behaviour of tourists and to learn more about risks for cyclists. Within the frame of a naturalistic cycling study in Austria, a method was to be developed, among other goals, to find out where bicyclists direct their visual attention along their routes. Mobile eye-tracking glasses were used. Results of the development work are shown, demonstrating that points of interest can be identified via the gaze plots of samples of cyclists. However, no technology registers what is perceived by peripheral vision, which would give a more complete picture of reality. Any technological method that today registers where cyclists, customers, or visitors direct their attention therefore has to be complemented with verbal data from interviews, questionnaires, etc., in order to establish what has really been perceived.

Conference papers on the topic "Eye tracking glasses"

1

Ye, Zhefan, Yin Li, Alireza Fathi, Yi Han, Agata Rozga, Gregory D. Abowd, and James M. Rehg. "Detecting eye contact using wearable eye-tracking glasses." In Proceedings of the 2012 ACM Conference on Ubiquitous Computing. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2370216.2370368.

2

Topal, Cihan, Atakan Dogan, and Omer Nezih Gerek. "An eye-glasses-like wearable eye gaze tracking system." In 2008 IEEE 16th Signal Processing, Communication and Applications Conference (SIU). IEEE, 2008. http://dx.doi.org/10.1109/siu.2008.4632568.

3

Oskina, Maria, Zoltan Rusak, and Peter Boom. "Eye on HMI - Assessment of Human-Machine Interface with wearable eye-tracking glasses." In Design Computation Input/Output 2022. Design Computation, 2022. http://dx.doi.org/10.47330/dcio.2022.gpqp2161.

Abstract:
More and more modern transport modalities are equipped with complex human-machine interfaces (HMI). HMIs aim to narrow the information gap between a complex automation system and its human operator to ensure fast, effective interaction and decision making. We see HMIs in traffic controllers' rooms, ADAS-equipped vehicles, public transport drivers' cabins, and many other modern transport modes. Designers create HMIs to effectively draw the operator’s attention to the most necessary and critical information and to facilitate accurate and fast decision making. Whether these systems adequately support human operators and achieve the intention of their designer is difficult to test objectively. [Hamilton and Grabowki 2013] showed that the visual, manual and cognitive distractions of ADAS-equipped vehicles tend to distract drivers, who in turn behave less safely on the roads. There is, however, no comprehensive overview of the typical cognitive challenges operators face in different domains of HMI applications, or of how these challenges can be objectively assessed. We conducted a series of interviews on the difficulties of operators' human-machine interface experience with human factors experts working in railway and ADAS systems, and investigated Endsley's theory of situation awareness in dynamic systems [Endsley 1995]. Our interviewees reported several typical issues from their HMI studies, including missed events on the HMI displays, information overload of operators, lack of contextual and situational awareness and, as a result, a mismatch between expected and performed operator actions. We aim to develop an objective approach based on mobile eye tracking technology that can be used to characterize operator situation awareness, decision making and task performance, and to validate HMI designs in specific mobility and industry applications. The first step of our method is a HAZOP analysis of the human-machine events and operator tasks, which results in a set of use cases for the eye-tracking experiments. In the experiments, we use wearable eye-tracking glasses combined with AI-based computer vision algorithms. Wearable eye tracking enables us to conduct studies in real-world scenarios, while AI-based computer vision helps us to automatically identify relevant events and streamline the eye-tracking data analysis workflow. With the glasses, we collect hotspot analyses, eye-movement sequence analyses, time to capture alarms, and other parameters. Finally, we use an AI (and open AI) component in the glasses to mark the event of interest and track when the eye interacts with an area or an event of interest. We process the gained data to quantify event engagement, mistakes in responses, and missed information, and to explain the root causes. In the past period, we conducted a pilot study to validate the quality of data collected with the openeye eye-tracking equipment (https://kexxu.com/). In the next step, we will validate our method in a full-scale experiment. We are convinced that our insights will help to bring significant improvements to current research approaches for human factors studies on the comfort, safety and effectiveness of human-machine interaction. We also aim to apply our method in training and upskilling operators.
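One of the parameters mentioned, time to capture alarms, is straightforward to derive from AOI-tagged fixations. A minimal sketch; the fixation tuple format and AOI names are assumptions for illustration, not the export format of the equipment used:

```python
def time_to_first_fixation_s(fixations, alarm_onset_s, alarm_aoi):
    """Seconds from alarm onset until gaze first lands on the alarm's AOI.

    fixations: iterable of (start_s, end_s, aoi_name) tuples as exported
    from the eye-tracking glasses. Returns None if the alarm was missed.
    """
    for start_s, _end_s, aoi in sorted(fixations):
        if aoi == alarm_aoi and start_s >= alarm_onset_s:
            return start_s - alarm_onset_s
    return None

fixations = [(0.2, 0.8, "speed_dial"), (1.1, 1.6, "route_map"), (2.3, 2.9, "alarm_panel")]
t = time_to_first_fixation_s(fixations, alarm_onset_s=1.0, alarm_aoi="alarm_panel")
print(f"time to capture alarm: {t:.1f} s")
```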
4

Meyer, Johannes, Thomas Schlebusch, Thomas Kuebler, and Enkelejda Kasneci. "Low Power Scanned Laser Eye Tracking for Retinal Projection AR Glasses." In ETRA '20: 2020 Symposium on Eye Tracking Research and Applications. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3379157.3391995.

5

Hong, Injoon, Hoi-Jun Yoo, and Kyeongryeol Bong. "Challenges of eye tracking systems for mobile XR glasses." In Applications of Digital Image Processing XLI, edited by Andrew G. Tescher. SPIE, 2018. http://dx.doi.org/10.1117/12.2322657.

6

Gao, Xiang-Yu, Yu-Fei Zhang, Wei-Long Zheng, and Bao-Liang Lu. "Evaluating driving fatigue detection algorithms using eye tracking glasses." In 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER). IEEE, 2015. http://dx.doi.org/10.1109/ner.2015.7146736.

7

Mulvey, Fiona Bríd, Marek Mikitovic, Mateusz Sadowski, Baosheng Hou, Nils David Rasamoel, John Paulin Hansen, and Per Bækgaard. "Gaze Interactive and Attention Aware Low Vision Aids as Future Smart Glasses." In ETRA '21: 2021 Symposium on Eye Tracking Research and Applications. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3450341.3460769.

8

Yip, Hiu Man, David Navarro-Alarcon, and Yun-hui Liu. "Development of an eye-gaze controlled interface for surgical manipulators using eye-tracking glasses." In 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, 2016. http://dx.doi.org/10.1109/robio.2016.7866606.

9

Shi, Zhen-Feng, Chang Zhou, Wei-Long Zheng, and Bao-Liang Lu. "Attention evaluation with eye tracking glasses for EEG-based emotion recognition." In 2017 8th International IEEE/EMBS Conference on Neural Engineering (NER). IEEE, 2017. http://dx.doi.org/10.1109/ner.2017.8008298.

10

Wang, Yanxin, Hong Zeng, and Jia Liu. "Low-cost eye-tracking glasses with real-time head rotation compensation." In 2016 10th International Conference on Sensing Technology (ICST). IEEE, 2016. http://dx.doi.org/10.1109/icsenst.2016.7796336.
