Journal articles on the topic 'Eye tracking glasses'

Consult the top 50 journal articles for your research on the topic 'Eye tracking glasses.'

You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

OGASAWARA, Tomohito, Ryogo HORIUCHI, Yasuto TANAKA, and Norihisa MIKI. "Eye-tracking system for reverse glasses." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2017 (2017): 1A1-K12. http://dx.doi.org/10.1299/jsmermd.2017.1a1-k12.

2

Ehinger, Benedikt V., Katharina Groß, Inga Ibs, and Peter König. "A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000." PeerJ 7 (July 9, 2019): e7086. http://dx.doi.org/10.7717/peerj.7086.

Abstract:
Eye-tracking experiments rely heavily on good data quality of eye-trackers. Unfortunately, it is often the case that only the spatial accuracy and precision values are available from the manufacturers. These two values alone are not sufficient to serve as a benchmark for an eye-tracker: Eye-tracking quality deteriorates during an experimental session due to head movements, changing illumination or calibration decay. Additionally, different experimental paradigms require the analysis of different types of eye movements; for instance, smooth pursuit movements, blinks or microsaccades, which themselves cannot readily be evaluated by using spatial accuracy or precision alone. To obtain a more comprehensive description of properties, we developed an extensive eye-tracking test battery. In 10 different tasks, we evaluated eye-tracking related measures such as: the decay of accuracy, fixation durations, pupil dilation, smooth pursuit movement, microsaccade classification, blink classification, or the influence of head motion. For some measures, true theoretical values exist. For others, a relative comparison to a reference eye-tracker is needed. Therefore, we collected our gaze data simultaneously from a remote EyeLink 1000 eye-tracker as the reference and compared it with the mobile Pupil Labs glasses. As expected, the average spatial accuracy of 0.57° for the EyeLink 1000 eye-tracker was better than the 0.82° for the Pupil Labs glasses (N= 15). Furthermore, we classified less fixations and shorter saccade durations for the Pupil Labs glasses. Similarly, we found fewer microsaccades using the Pupil Labs glasses. The accuracy over time decayed only slightly for the EyeLink 1000, but strongly for the Pupil Labs glasses. Finally, we observed that the measured pupil diameters differed between eye-trackers on the individual subject level but not on the group level. To conclude, our eye-tracking test battery offers 10 tasks that allow us to benchmark the many parameters of interest in stereotypical eye-tracking situations and addresses a common source of confounds in measurement errors (e.g., yaw and roll head movements). All recorded eye-tracking data (including Pupil Labs’ eye videos), the stimulus code for the test battery, and the modular analysis pipeline are freely available (https://github.com/behinger/etcomp).
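Accuracy and precision figures like the 0.57° and 0.82° reported here are typically computed from validation fixations on known targets. The following sketch assumes a simple (N, 2) array of gaze angles and a single target, rather than the authors' actual pipeline, and illustrates the two standard definitions (mean angular offset for accuracy, RMS sample-to-sample deviation for precision):

```python
import numpy as np

def angular_offset(gaze_deg, target_deg):
    """Angular distance (deg) between each gaze sample and a target.

    gaze_deg:   (N, 2) array of horizontal/vertical gaze angles in degrees.
    target_deg: (2,) array with the target's angular position.
    Small-angle approximation: treats the two components as orthogonal.
    """
    return np.linalg.norm(gaze_deg - target_deg, axis=1)

def spatial_accuracy(gaze_deg, target_deg):
    """Accuracy = mean angular offset from the target during a fixation."""
    return angular_offset(gaze_deg, target_deg).mean()

def rms_s2s_precision(gaze_deg):
    """Precision = root-mean-square of sample-to-sample angular distances."""
    deltas = np.diff(gaze_deg, axis=0)
    return np.sqrt((np.linalg.norm(deltas, axis=1) ** 2).mean())

# Hypothetical validation fixation: gaze wobbling around a target at (5, 0) deg.
rng = np.random.default_rng(0)
gaze = np.array([5.3, 0.2]) + 0.1 * rng.standard_normal((500, 2))
print(f"accuracy  {spatial_accuracy(gaze, np.array([5.0, 0.0])):.2f} deg")
print(f"precision {rms_s2s_precision(gaze):.3f} deg RMS-S2S")
```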
3

Behe, Bridget K., R. Thomas Fernandez, Patricia T. Huddleston, Stella Minahan, Kristin L. Getter, Lynnell Sage, and Allison M. Jones. "Practical Field Use of Eye-tracking Devices for Consumer Research in the Retail Environment." HortTechnology 23, no. 4 (August 2013): 517–24. http://dx.doi.org/10.21273/horttech.23.4.517.

Abstract:
Eye-tracking equipment is now affordable and portable, making it a practical instrument for consumer research. Engineered to best analyze gaze on a plane (e.g., a retail shelf), both portable eye-tracking glasses and computer monitor–mounted hardware can play key roles in analyzing merchandise displays to better understand what consumers view. Researchers and practitioners can use that information to improve the sales efficacy of displays. Eye-tracking hardware was nearly exclusively used to investigate the reading process but can now be used for a broader range of study, namely in retail settings. This article presents an approach to using glasses eye tracker (GET) and light eye tracker (LET) eye-tracking hardware for applied consumer research in the field. We outline equipment use, study construction, data extraction as well as benefits and limitations of the technology collected from several pilot studies.
4

Al-Haddad, Sara, Matthew Sears, Omar Alruwaythi, and Paul M. Goodrum. "Complexity, Performance, and Search Efficiency: An Eye-Tracking Study on Assembly-Based Tasks among Construction Workers (Pipefitters)." Buildings 12, no. 12 (December 8, 2022): 2174. http://dx.doi.org/10.3390/buildings12122174.

Abstract:
Past studies have used eye-tracking glasses to analyze people’s perception of visual stimuli, usually regarding wayfinding, safety, or visual appeal. Some industries, such as the automotive industry, studied the effects of visual stimuli on task completion. However, the architecture and construction industries have mainly conducted eye-tracking experiments with surveys or search tasks instead of performing a task. This paper uses eye-tracking glasses to analyze people’s perception of visual stimuli while completing tangible tasks that simulate real-world applications. This research studies how people look at visual stimuli that influence their ability to interpret drawings with varying degrees of complexity, assess task completion performance, and inspect how people search for information. Twenty pipefitters wore eye-tracking glasses to record their eye movement patterns while completing a model pipe spool assembly. The eye-tracking glasses and Visual Eyes software measured visit metrics, fixations, fixation durations, convex hull coverage, assembly time, rework, and errors. Unlike previous studies, convex hull areas are calculated and used to measure search efficiency. This research found that people interacted more frequently with more complex visual stimuli but did not necessarily require more time to complete a task. People with lower search efficiency visited the drawings more frequently than people with higher search efficiency. People with higher search efficiency made fewer mistakes, redid less work, and completed tasks quicker than those with lower search efficiency. Search efficiency was found to be a good predictor of task performance.
5

Thibeault, Mark, Monica Jesteen, and Andrew Beitman. "Improved Accuracy Test Method for Mobile Eye Tracking in Usability Scenarios." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 63, no. 1 (November 2019): 2226–30. http://dx.doi.org/10.1177/1071181319631083.

Abstract:
Eye tracking has been used in usability testing for many years to gain objective measurements that inform label, instruction, and product design. With many different testing environments, participants, hardware, and software, key metrics can vary greatly. These metrics can also vary between studies, so a standardized test method, set of metrics, and calculation method are proposed in this study. The Tobii Pro Glasses 2 is a mobile eye tracker that does not significantly affect user mobility compared with other eye trackers. This study aims to build a testing method that can be modified to better fit the varying conditions found in usability testing. The study was performed with the Tobii Pro Glasses 2; however, the test method can be used with any mobile eye-tracking unit. Even under poor testing conditions, the method yields reliable metrics that can be used to inform expectations and decisions with eye tracking. These methods are recommended to be performed prior to eye-tracking testing to determine testing-specific performance.
6

Katz, Trixie A., Danielle D. Weinberg, Claire E. Fishman, Vinay Nadkarni, Patrice Tremoulet, Arjan B. te Pas, Aleksandra Sarcevic, and Elizabeth E. Foglia. "Visual attention on a respiratory function monitor during simulated neonatal resuscitation: an eye-tracking study." Archives of Disease in Childhood - Fetal and Neonatal Edition 104, no. 3 (June 14, 2018): F259–F264. http://dx.doi.org/10.1136/archdischild-2017-314449.

Abstract:
Objective: A respiratory function monitor (RFM) may improve positive pressure ventilation (PPV) technique, but many providers do not use RFM data appropriately during delivery room resuscitation. We sought to use eye-tracking technology to identify RFM parameters that neonatal providers view most commonly during simulated PPV. Design: Mixed methods study. Neonatal providers performed RFM-guided PPV on a neonatal manikin while wearing eye-tracking glasses to quantify visual attention on displayed RFM parameters (ie, exhaled tidal volume, flow, leak). Participants subsequently provided qualitative feedback on the eye-tracking glasses. Setting: Level 3 academic neonatal intensive care unit. Participants: Twenty neonatal resuscitation providers. Main outcome measures: Visual attention: overall gaze sample percentage; total gaze duration, visit count and average visit duration for each displayed RFM parameter. Qualitative feedback: willingness to wear eye-tracking glasses during clinical resuscitation. Results: Twenty providers participated in this study. The mean gaze sample captured was 93% (SD 4%). Exhaled tidal volume waveform was the RFM parameter with the highest total gaze duration (median 23%, IQR 13–51%), highest visit count (median 5.17 per 10 s, IQR 2.82–6.16) and longest visit duration (median 0.48 s, IQR 0.38–0.81 s). All participants were willing to wear the glasses during clinical resuscitation. Conclusion: Wearable eye-tracking technology is feasible to identify gaze fixation on the RFM display and is well accepted by providers. Neonatal providers look at exhaled tidal volume more than any other RFM parameter. Future applications of eye-tracking technology include use during clinical resuscitation.
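The visual-attention measures used in this study (visit count, total gaze duration and average visit duration per RFM parameter) can be derived from AOI-labelled gaze samples. The sketch below is a simplified illustration assuming a plain list of per-sample AOI labels at a fixed sampling rate; it is not the authors' analysis code, and the label names are invented:

```python
from collections import defaultdict
from itertools import groupby

def aoi_metrics(samples, sample_rate_hz):
    """Per-AOI visit count, total dwell time and mean visit duration.

    samples: sequence of AOI labels (one per gaze sample, None = off-AOI),
             recorded at a fixed sampling rate.
    Returns {aoi: {"visits": int, "total_s": float, "mean_visit_s": float}}.
    """
    dt = 1.0 / sample_rate_hz
    stats = defaultdict(lambda: {"visits": 0, "total_s": 0.0})
    # A "visit" is an unbroken run of consecutive samples on the same AOI.
    for label, run in groupby(samples):
        if label is None:
            continue
        duration = sum(1 for _ in run) * dt
        stats[label]["visits"] += 1
        stats[label]["total_s"] += duration
    for s in stats.values():
        s["mean_visit_s"] = s["total_s"] / s["visits"]
    return dict(stats)

# Hypothetical 1-second recording at 50 Hz alternating between two AOIs.
labels = ["tidal_volume"] * 20 + [None] * 5 + ["leak"] * 10 + ["tidal_volume"] * 15
print(aoi_metrics(labels, sample_rate_hz=50))
```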
7

Niehorster, Diederick C., Thiago Santini, Roy S. Hessels, Ignace T. C. Hooge, Enkelejda Kasneci, and Marcus Nyström. "The impact of slippage on the data quality of head-worn eye trackers." Behavior Research Methods 52, no. 3 (January 2, 2020): 1140–60. http://dx.doi.org/10.3758/s13428-019-01307-0.

Abstract:
Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant's head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs' Pupil in 3D mode, and (iv) Pupil-Labs' Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.
8

Golard, Andre, and Sachin S. Talathi. "Ultrasound for Gaze Estimation—A Modeling and Empirical Study." Sensors 21, no. 13 (June 30, 2021): 4502. http://dx.doi.org/10.3390/s21134502.

Abstract:
Most eye tracking methods are light-based. As such, they can suffer from ambient light changes when used outdoors, especially for use cases where eye trackers are embedded in Augmented Reality glasses. It has been recently suggested that ultrasound could provide a low power, fast, light-insensitive alternative to camera-based sensors for eye tracking. Here, we report on our work on modeling ultrasound sensor integration into a glasses form factor AR device to evaluate the feasibility of estimating eye-gaze in various configurations. Next, we designed a benchtop experimental setup to collect empirical data on time of flight and amplitude signals for reflected ultrasound waves for a range of gaze angles of a model eye. We used this data as input for a low-complexity gradient-boosted tree machine learning regression model and demonstrate that we can effectively estimate gaze (gaze RMSE error of 0.965 ± 0.178 degrees with an adjusted R2 score of 90.2 ± 4.6).
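As a rough illustration of the regression step described above, the sketch below fits a gradient-boosted tree regressor to synthetic stand-in features (the real study used time-of-flight and amplitude signals) and reports RMSE and adjusted R²; the data, feature layout and hyperparameters are assumptions, not the authors' configuration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in: 8 reflected-signal features per sample, with gaze angle
# (deg) a noisy nonlinear function of them.
X = rng.uniform(0, 1, size=(2000, 8))
y = 20 * X[:, 0] - 10 * X[:, 1] ** 2 + 5 * np.sin(6 * X[:, 2]) + rng.normal(0, 0.5, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
rmse = mean_squared_error(y_te, pred) ** 0.5
r2 = r2_score(y_te, pred)
n, p = X_te.shape
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)   # adjusted R^2
print(f"RMSE {rmse:.3f} deg, adjusted R^2 {adj_r2:.3f}")
```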
9

Kortman, Brenton. "Assessing for Hemi-Spatial Neglect Using Eye Tracking Glasses." Archives of Physical Medicine and Rehabilitation 96, no. 10 (October 2015): e24. http://dx.doi.org/10.1016/j.apmr.2015.08.074.

10

Law, Brenda Hiu Yan, Po-Yin Cheung, Michael Wagner, Sylvia van Os, Bin Zheng, and Georg Schmölzer. "Analysis of neonatal resuscitation using eye tracking: a pilot study." Archives of Disease in Childhood - Fetal and Neonatal Edition 103, no. 1 (August 19, 2017): F82–F84. http://dx.doi.org/10.1136/archdischild-2017-313114.

Abstract:
Background: Visual attention (VA) is important for situation awareness and decision-making. Eye tracking can be used to analyse the VA of healthcare providers. No study has examined eye tracking during neonatal resuscitation. Objective: To test the use of eye tracking to examine VA during neonatal resuscitation. Methods: Six video recordings were obtained using eye tracking glasses worn by resuscitators during the first 5 min of neonatal resuscitation. Videos were analysed to obtain (i) areas of interest (AOIs), (ii) time spent on each AOI and (iii) frequency of saccades between AOIs. Results: Five videos were of acceptable quality and analysed. Only 35% of VA was directed at the infant, with 33% at patient monitors and gauges. There were frequent saccades (0.45/s) and most involved patient monitors. Conclusion: During neonatal resuscitation, VA is often directed away from the infant towards patient monitors. Eye tracking can be used to analyse human performance during neonatal resuscitation.
11

Niehorster, Diederick C., Roy S. Hessels, and Jeroen S. Benjamins. "GlassesViewer: Open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker." Behavior Research Methods 52, no. 3 (January 2, 2020): 1244–53. http://dx.doi.org/10.3758/s13428-019-01314-1.

Abstract:
We present GlassesViewer, open-source software for viewing and analyzing eye-tracking data of the Tobii Pro Glasses 2 head-mounted eye tracker as well as the scene and eye videos and other data streams (pupil size, gyroscope, accelerometer, and TTL input) that this headset can record. The software provides the following functionality written in MATLAB: (1) a graphical interface for navigating the study- and recording structure produced by the Tobii Glasses 2; (2) functionality to unpack, parse, and synchronize the various data and video streams comprising a Glasses 2 recording; and (3) a graphical interface for viewing the Glasses 2's gaze direction, pupil size, gyroscope and accelerometer time-series data, along with the recorded scene and eye camera videos. In this latter interface, segments of data can furthermore be labeled through user-provided event classification algorithms or by means of manual annotation. Lastly, the toolbox provides integration with the GazeCode tool by Benjamins et al. (2018), enabling a completely open-source workflow for analyzing Tobii Pro Glasses 2 recordings.
12

Merino, Giselle Schmidt A. D., Carmen Elena Martinez Riascos, Angelina Dias Leão Costa, Gleice Virginia Medeiros de Azambuja Elali, and Eugenio Merino. "The focus of visual attention in people with motor disabilities through Eye tracking." Gestão & Tecnologia de Projetos 13, no. 3 (December 26, 2018): 7–20. http://dx.doi.org/10.11606/gtp.v13i3.146091.

Abstract:
Making environments that can be reached, used and experienced by anyone, including people with reduced mobility, is an increasingly important need for professionals. Since eye tracking is an assistive technology that makes it possible to identify visual perception objectively, an experiment was conducted to analyse the difficulties people face in visual identification inside buildings. The goal of the article is to identify the focus of visual attention of people with motor disabilities using eye tracking glasses. The experiment used Senso Motoric Instruments (SMI) eye tracking glasses, and the analyses were performed with the BeGaze software, version 3.6. The results indicate that a lack of visual information makes it difficult for people to locate and identify the correct route while moving inside a building, and that such data can reduce the subjectivity of decisions aimed at making environments accessible. The tests show that participants did not fix their gaze on specific points, because they kept searching the building for visual information, which generated disorientation and difficulties in defining the right route while moving. The experiment made it possible to validate an application of the device that can contribute to professionals' decision-making when designing accessible environments. In addition, it highlighted the particularities of using this assistive technology, the glasses eye tracker, and the possibility of using it in the analysis of various tasks, contributing to design, architecture and engineering.
13

Li, Ting-Hao, Hiromasa Suzuki, and Yutaka Ohtake. "Visualization of user’s attention on objects in 3D environment using only eye tracking glasses." Journal of Computational Design and Engineering 7, no. 2 (March 30, 2020): 228–37. http://dx.doi.org/10.1093/jcde/qwaa019.

Abstract:
Abstract Eye tracking technology is widely applied to detect user’s attention in a 2D field, such as web page design, package design, and shooting games. However, because our surroundings primarily consist of 3D objects, applications will be expanded if there is an effective method to obtain and display user’s 3D gaze fixation. In this research, a methodology is proposed to demonstrate the user’s 3D gaze fixation on a digital model of a scene using only a pair of eye tracking glasses. The eye tracking glasses record user’s gaze data and scene video. Thus, using image-based 3D reconstruction, a 3D model of the scene can be reconstructed from the frame images; simultaneously, the transformation matrix of each frame image can be evaluated to find 3D gaze fixation on the 3D model. In addition, a method that demonstrates multiple users’ 3D gaze fixation on the same digital model is presented to analyze gaze distinction between different subjects. With this preliminary development, this approach shows potential to be applied to a larger environment and conduct a more reliable investigation.
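The geometric core of such a method is mapping a 2D gaze point from the scene-camera frame onto the reconstructed 3D model: the gaze pixel is back-projected into a world-space ray using the camera intrinsics and the frame's estimated pose, and the ray is then intersected with the reconstructed geometry. The sketch below shows only the ray construction under assumed intrinsics and pose (mesh intersection is omitted) and is not the authors' implementation:

```python
import numpy as np

def gaze_ray_in_world(gaze_px, K, R, t):
    """Turn a 2D gaze point (pixels) into a 3D ray in world coordinates.

    K: 3x3 scene-camera intrinsics.
    R, t: world-to-camera rotation (3x3) and translation (3,) for this frame,
          e.g. recovered by image-based 3D reconstruction.
    Returns (origin, direction) of the ray; intersecting it with the
    reconstructed mesh gives the 3D gaze fixation.
    """
    uv1 = np.array([gaze_px[0], gaze_px[1], 1.0])
    d_cam = np.linalg.inv(K) @ uv1          # viewing direction in camera frame
    origin = -R.T @ t                       # camera centre in world frame
    direction = R.T @ d_cam
    return origin, direction / np.linalg.norm(direction)

# Illustrative numbers only.
K = np.array([[900.0, 0.0, 640.0], [0.0, 900.0, 360.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, -2.0])
print(gaze_ray_in_world((700, 400), K, R, t))
```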
14

Nag, Anish, Nick Haber, Catalin Voss, Serena Tamura, Jena Daniels, Jeffrey Ma, Bryan Chiang, et al. "Toward Continuous Social Phenotyping: Analyzing Gaze Patterns in an Emotion Recognition Task for Children With Autism Through Wearable Smart Glasses." Journal of Medical Internet Research 22, no. 4 (April 22, 2020): e13810. http://dx.doi.org/10.2196/13810.

Abstract:
Background Several studies have shown that facial attention differs in children with autism. Measuring eye gaze and emotion recognition in children with autism is challenging, as standard clinical assessments must be delivered in clinical settings by a trained clinician. Wearable technologies may be able to bring eye gaze and emotion recognition into natural social interactions and settings. Objective This study aimed to test: (1) the feasibility of tracking gaze using wearable smart glasses during a facial expression recognition task and (2) the ability of these gaze-tracking data, together with facial expression recognition responses, to distinguish children with autism from neurotypical controls (NCs). Methods We compared the eye gaze and emotion recognition patterns of 16 children with autism spectrum disorder (ASD) and 17 children without ASD via wearable smart glasses fitted with a custom eye tracker. Children identified static facial expressions of images presented on a computer screen along with nonsocial distractors while wearing Google Glass and the eye tracker. Faces were presented in three trials, during one of which children received feedback in the form of the correct classification. We employed hybrid human-labeling and computer vision–enabled methods for pupil tracking and world–gaze translation calibration. We analyzed the impact of gaze and emotion recognition features in a prediction task aiming to distinguish children with ASD from NC participants. Results Gaze and emotion recognition patterns enabled the training of a classifier that distinguished ASD and NC groups. However, it was unable to significantly outperform other classifiers that used only age and gender features, suggesting that further work is necessary to disentangle these effects. Conclusions Although wearable smart glasses show promise in identifying subtle differences in gaze tracking and emotion recognition patterns in children with and without ASD, the present form factor and data do not allow for these differences to be reliably exploited by machine learning systems. Resolving these challenges will be an important step toward continuous tracking of the ASD phenotype.
15

Takahashi, Ryo, Hiromasa Suzuki, Jouh Yeong Chew, Yutaka Ohtake, Yukie Nagai, and Koichi Ohtomi. "A system for three-dimensional gaze fixation analysis using eye tracking glasses." Journal of Computational Design and Engineering 5, no. 4 (December 30, 2017): 449–57. http://dx.doi.org/10.1016/j.jcde.2017.12.007.

Abstract:
Eye tracking is a technology that has quickly become a commonplace tool for evaluating package and webpage design. In such design processes, static two-dimensional images are shown on a computer screen while a subject's gaze (where he or she looks) is measured via an eye tracking device. The collected gaze fixation data are then visualized and analyzed via gaze plots and heat maps. Such evaluations using two-dimensional images are often too limited to analyze gaze on three-dimensional physical objects such as products, because users look at them not from a single point of view but rather from various angles. Therefore, in this study we propose methods for collecting gaze fixation data for a three-dimensional model of a given product and visualizing the corresponding gaze plots and heat maps also in three dimensions. To achieve our goals, we used a wearable eye-tracking device, i.e., eye-tracking glasses. Further, we implemented a prototype system to demonstrate its advantages in comparison with two-dimensional gaze fixation methods. Highlights: a method for collecting gaze fixation data for a three-dimensional model of a given product; two visualization methods for three-dimensional gaze data, gaze plots and heat maps; application of the proposed system to two practical examples, a hair dryer and a car interior.
16

Hareide, Odd Sveinung, and Runar Ostnes. "Maritime Usability Study by Analysing Eye Tracking Data." Journal of Navigation 70, no. 5 (April 17, 2017): 927–43. http://dx.doi.org/10.1017/s0373463317000182.

Abstract:
The aim of the Integrated Navigation System (INS) on a ship bridge should be to provide the navigator with added value and aid in the complex task of conducting a safe and efficient passage at high speeds in demanding waters. This article presents a method for analysing eye tracking data to reveal sub-optimal design in the bridge layout and in the software graphical user interface on a maritime navigation display. The analysis of eye tracking data with a focus on scan path events indicates sub-optimal design, and the paper provides suggestions for improvement in design and interfaces. Pros and cons of using Eye Tracking Glasses in a maritime environment are presented. The importance of not affecting the navigator's normal behaviour while collecting data is stressed, as is the need for the software to provide good visualisation and interpretation of the eye tracking data.
17

Gordieiev, Oleksandr, Vyacheslav Kharchenko, Oleg Illiashenko, Olga Morozova, and Magomediemin Gasanov. "Concept of Using Eye Tracking Technology to Assess and Ensure Cybersecurity, Functional Safety and Usability." International Journal of Safety and Security Engineering 11, no. 4 (August 31, 2021): 361–67. http://dx.doi.org/10.18280/ijsse.110409.

Abstract:
Eye tracking technology is based on tracking the trajectory of human eye movement. As a rule, it is implemented as an additional device attached under the monitor or in the form of glasses. On the basis of a mathematical model, the focus of a person's attention is calculated and, accordingly, the user's visual route is built. Eye tracking technology is used to solve various problems, e.g. for marketing research, assessing the quality of user interfaces, developing simulators for operators, etc. The article discusses the concept of using eye tracking technology to assess and ensure cybersecurity, functional safety and usability. The possibility of using eye tracking technology (ETT) to solve the problem of personal identification is considered separately. Identification is achieved by having a person reproduce a certain trajectory with their gaze. This technique can be used as a basic or additional method of identifying a person. The article also analyzes the results of using eye tracking to study the interface of an automated information system for operator support based on algorithms for symptom-oriented emergency instructions (ASOEI), which is used at nuclear power plants (NPP).
18

Nourrit, Vincent, Rémi Poilane, and Jean-Louis de Bougrenet IMT Atlantique. "Custom on-axis head-mounted eye tracker for 3D active glasses." Electronic Imaging 2021, no. 2 (January 18, 2021): 55–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.2.sda-055.

Abstract:
Currently, no low cost commercial 3D active glasses with embedded eye tracker are available despite the importance of 3D and eye tracking for numerous applications. In this context, a simple low cost eye tracker for 3D glasses with liquid crystal shutters is presented and tested for orthoptics applications. By using a beam splitter to better align the camera with the line of sight when the subject looks at a target in front of him at far range, the new design allows recording high quality images with limited pupil deformation when compared to other commercial eye trackers where the cameras can be far from this axis (head mounted or fixed). Such a design could be useful for various applications from orthoptics to virtual reality
19

Pentus, Kristian, Kerli Ploom, Tanel Mehine, Madli Koiv, Age Tempel, and Andres Kuusik. "Mobile and stationary eye tracking comparison – package design and in-store results." Journal of Consumer Marketing 37, no. 3 (February 10, 2020): 259–69. http://dx.doi.org/10.1108/jcm-04-2019-3190.

Abstract:
Purpose This paper aims to test the similarity of the results of on-screen eye tracking compared to mobile eye tracking in the context of first fixation location on stimuli. Design/methodology/approach Three studies were conducted altogether with 117 participants, where the authors compared both methods: stationary eye tracking (Tobii Pro X2-60) and mobile eye tracking (Tobii Pro Glasses 2). Findings The studies revealed that the reported average first fixation locations from stationary and mobile eye tracking are different. Stationary eye tracking is more affected by a centre fixation bias. Based on the research, it can be concluded that stationary eye tracking is not always suitable for studying consumer perception and behaviour because of the centre viewing bias. Research limitations/implications When interpreting the results, researchers should take into account that stationary eye tracking results are affected by a centre fixation bias. Previous stationary eye tracking research should be interpreted with the centre fixation bias in mind. Some of this previous work should be retested using mobile eye tracking. If possible small-scale pilot studies should be included in papers to show that the more appropriate method, less affected by attention biases, was chosen. Practical implications Managers should trust research where the ability of package design to attract attention on a shelf is tested using mobile eye tracking. The authors suggest using mobile eye tracking to optimise store shelf planograms, point-of-purchase materials, and shelf layouts. In package design, interpretations of research using stationary eye tracking should consider its centre fixation bias. Managers should also be cautious when interpreting previous stationary eye tracking research (both applied and scientific), knowing that stationary eye tracking is more prone to a centre fixation bias. Originality/value While eye tracking research has become more and more popular as a marketing research method, the limitations of the method have not been fully understood by the field. This paper shows that the chosen eye tracking method can influence the results. No such comparative paper about mobile and stationary eye tracking research has been done in the marketing field.
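One simple way to quantify the centre fixation bias discussed in these findings is to compare how far first fixations land from the stimulus centre under each method. The snippet below is an illustrative calculation on made-up normalised coordinates, not the authors' analysis:

```python
import numpy as np

def mean_distance_from_centre(first_fixations):
    """Mean Euclidean distance of first fixations from the stimulus centre.

    first_fixations: (N, 2) array in normalised coordinates, where
    (0.5, 0.5) is the centre of the stimulus/screen.
    """
    return np.linalg.norm(np.asarray(first_fixations) - 0.5, axis=1).mean()

# Hypothetical data: stationary-tracker fixations cluster near the centre,
# mobile-tracker fixations spread towards the actual package locations.
stationary = [(0.52, 0.48), (0.55, 0.51), (0.47, 0.53)]
mobile = [(0.70, 0.30), (0.25, 0.60), (0.80, 0.75)]
print("stationary spread:", round(mean_distance_from_centre(stationary), 3))
print("mobile spread:    ", round(mean_distance_from_centre(mobile), 3))
```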
20

Bozkir, Efe, Onur Günlü, Wolfgang Fuhl, Rafael F. Schaefer, and Enkelejda Kasneci. "Differential privacy for eye tracking with temporal correlations." PLOS ONE 16, no. 8 (August 17, 2021): e0255979. http://dx.doi.org/10.1371/journal.pone.0255979.

Abstract:
New generation head-mounted displays, such as VR and AR glasses, are coming into the market with already integrated eye tracking and are expected to enable novel ways of human-computer interaction in numerous applications. However, since eye movement properties contain biometric information, privacy concerns have to be handled properly. Privacy-preservation techniques such as differential privacy mechanisms have recently been applied to eye movement data obtained from such displays. Standard differential privacy mechanisms, however, are vulnerable due to temporal correlations between the eye movement observations. In this work, we propose a novel transform-coding based differential privacy mechanism to further adapt it to the statistics of eye movement feature data and compare various low-complexity methods. We extend the Fourier perturbation algorithm, which is a differential privacy mechanism, and correct a scaling mistake in its proof. Furthermore, we illustrate significant reductions in sample correlations in addition to query sensitivities, which provide the best utility-privacy trade-off in the eye tracking literature. Our results provide significantly high privacy without any essential loss in classification accuracies while hiding personal identifiers.
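For readers unfamiliar with the underlying mechanism, the sketch below shows a heavily simplified version of the basic Fourier perturbation idea (perturbing a truncated DFT of the feature series with Laplace noise); it deliberately omits the corrections and extensions proposed in the paper, and the noise calibration shown is an assumption for illustration only:

```python
import numpy as np

def fourier_perturbation(series, k, epsilon, l2_sensitivity):
    """Simplified Fourier-perturbation sketch for a 1D feature series.

    Keeps the first k DFT coefficients, perturbs them with Laplace noise
    calibrated to an L2 sensitivity bound, zero-pads the rest and inverts.
    """
    n = len(series)
    coeffs = np.fft.rfft(series)
    kept = coeffs[:k].copy()
    # Noise scale for the retained coefficients (sqrt(k)-inflated sensitivity).
    scale = np.sqrt(k) * l2_sensitivity / epsilon
    rng = np.random.default_rng()
    kept += rng.laplace(0.0, scale, k) + 1j * rng.laplace(0.0, scale, k)
    noisy = np.zeros_like(coeffs)
    noisy[:k] = kept
    return np.fft.irfft(noisy, n=n)

# Hypothetical eye-movement feature series (e.g. per-window saccade rate).
signal = np.sin(np.linspace(0, 6 * np.pi, 120)) + 0.1 * np.random.randn(120)
private = fourier_perturbation(signal, k=10, epsilon=1.0, l2_sensitivity=1.0)
print(private[:5])
```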
21

Wu, Tunhua, Ping Wang, Shengnan Yin, and Yezhi Lin. "A New Human Eye Tracking Algorithm of Optimized TLD Based on Improved Mean-Shift." International Journal of Pattern Recognition and Artificial Intelligence 31, no. 03 (February 2017): 1755007. http://dx.doi.org/10.1142/s0218001417550072.

Abstract:
In this paper, an improved Mean-shift algorithm was integrated with a standard tracking–learning–detection (TLD) model tracker to improve the tracking performance of the standard TLD model and enhance its anti-occlusion capability and its ability to distinguish similar targets. The target region obtained by the improved Mean-shift algorithm and the target region obtained by the TLD model tracker are integrated to achieve favorable tracking effects. The optimized TLD tracking system was then applied to human eye tracking. In the tests, the model adapted to partial occlusion, such as eye-glasses, closed eyes and hand occlusion. The roll angle can approach 90°, the yaw angle can approach 45° and the pitch angle can approach 60°. In addition, the model never mistakenly transfers the tracking region to the other eye (a similar target on the same face) during long-term tracking. Experimental results indicate that: (1) the optimized TLD model shows sound tracking stability even when targets are partially occluded or rotated; (2) tracking speed and accuracy are superior to those of the standard TLD and some mainstream tracking methods. In summary, the optimized TLD model shows higher robustness and stability and responds better to complex eye tracking requirements.
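For orientation, the snippet below shows a plain colour-histogram Mean-shift tracker built from standard OpenCV calls; it is only the baseline ingredient, not the improved Mean-shift/TLD fusion proposed in the paper, and the video path and initial window are placeholders:

```python
import cv2

def track_eye_meanshift(video_path, init_window):
    """Minimal colour-histogram Mean-shift tracker.

    init_window: (x, y, w, h) bounding box around the eye in the first frame.
    Yields the tracked window for each subsequent frame.
    """
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("could not read video")
    x, y, w, h = init_window
    roi = frame[y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    # Hue histogram of the eye region, used for back-projection.
    roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
    cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    window = init_window
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        # Mean-shift moves the window towards the mode of the back-projection.
        _, window = cv2.meanShift(back_proj, window, term)
        yield window
    cap.release()
```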
22

Kuo, Yung-Lung, Jiann-Shu Lee, and Min-Chai Hsieh. "Video-Based Eye Tracking to Detect the Attention Shift." International Journal of Distance Education Technologies 12, no. 4 (October 2014): 66–81. http://dx.doi.org/10.4018/ijdet.2014100105.

Abstract:
Eye and head movements are evoked in response to obvious shifts of visual attention. However, there has been little progress so far on the causes of absent-mindedness. The paper proposes an attention awareness system that captures the interaction of eye gaze and head pose under various kinds of attentional switching in a computer classroom. Via algorithms for skin-colour area detection, eye location and eye tracking, the system detects shifts of the subject's attention, records them and sends a notification to the class teacher. In five variant experiments on attentional shift, the authors identify the cues subjects give when turning to other directions, moving forwards or backwards, and closing their eyes. The approach applies even if students wear glasses or have a hair fringe. The findings on absent-mindedness are very important for psychologists or educators conducting post-hoc analysis. These experimental results show that eye tracking is useful for detecting shifts of attention.
23

Miranda, Antonio Miguel, Eduardo J. Nunes-Pereira, Karthikeyan Baskaran, and Antonio Filipe Macedo. "Eye movements, convergence distance and pupil-size when reading from smartphone, computer, print and tablet." Scandinavian Journal of Optometry and Visual Science 11, no. 1 (July 22, 2018): 1–5. http://dx.doi.org/10.5384/sjovs.vol11i1p1-5.

Abstract:
This study investigated the use of eye-tracking glasses to monitor visual behaviour when reading from electronic devices and paper in free-viewing conditions. The Tobii-Pro-Glasses were used to monitor 20 subjects with normal vision during reading tasks. Reading was performed in a smartphone, computer, paper and tablet. Texts from the IReST-test were read in devices in a random order. Participants read one text in each device and then repeated the same task 1 hour later; in total each participant read eight different texts. The sequence for the devices was randomized. We found differences between devices for saccade amplitude, fixation duration, convergence distance and pupil size. Reading speed between computer and tablet was slightly different (8 words-per-minute) and pupil size reduced up to 20% in electronic devices compared to print. Behavioural changes observed whilst reading from different devices may reflect an attempt from readers to optimize performance. The need to maintain visual performance under different visual condition may lead to increased visual symptoms. Eye-tracking glasses could be a valuable tool to investigate visual aspects of digital strain.
24

Kang, Dongwoo, Jin-Ho Choi, and Hyoseok Hwang. "Autostereoscopic 3D Display System for 3D Medical Images." Applied Sciences 12, no. 9 (April 24, 2022): 4288. http://dx.doi.org/10.3390/app12094288.

Abstract:
Recent advances in autostereoscopic three-dimensional (3D) display systems have led to innovations in consumer electronics and vehicle systems (e.g., head-up displays). However, medical images with stereoscopic depth provided by 3D displays have yet to be developed sufficiently for widespread adoption in diagnostics. Indeed, many stereoscopic 3D displays necessitate special 3D glasses that are unsuitable for clinical environments. This paper proposes a novel glasses-free 3D autostereoscopic display system based on an eye tracking algorithm and explores its viability as a 3D navigator for cardiac computed tomography (CT) images. The proposed method uses a slit-barrier with a backlight unit, which is combined with an eye tracking method that exploits multiple machine learning techniques to display 3D images. To obtain high-quality 3D images with minimal crosstalk, the light field 3D directional subpixel rendering method combined with the eye tracking module is applied using a user’s 3D eye positions. Three-dimensional coronary CT angiography images were volume rendered to investigate the performance of the autostereoscopic 3D display systems. The proposed system was trialed by expert readers, who identified key artery structures faster than with a conventional two-dimensional display without reporting any discomfort or 3D fatigue. With the proposed autostereoscopic 3D display systems, the 3D medical image navigator system has the potential to facilitate faster diagnoses with improved accuracy.
25

Mansor, Aida Azlina, and Salmi Mohd Isa. "Areas of Interest (AOI) on marketing mix elements of green and non-green products in customer decision making." Neuroscience Research Notes 5, no. 3 (September 30, 2022): 174. http://dx.doi.org/10.31117/neuroscirn.v5i3.174.

Abstract:
Technological advancements in eye-tracking have enabled the development of interactive experimental setups for studying consumer behaviour. A common method for examining gaze data is Area of Interest (AOI). Therefore, this study fully utilised eye-tracking tools to measure participants' allocation of visual attention to the marketing mix elements in green and non-green products. This is because the product, price, place, and promotion are still the most crucial factors that customers consider when purchasing. The primary objective of this study is to discover and understand the primary marketing function that directly influences customer decision-making from a neuromarketing perspective. Their eye movements were simultaneously registered using SMI Eye Tracking Glasses 2 Wireless, and the gaze locations of participants were measured from AOI. The findings of this study have a significant impact on the importance of eye movements in decision-making, particularly when choosing important marketing elements before purchasing green and non-green products.
26

Rupi and Krizek. "Visual Eye Gaze While Cycling: Analyzing Eye Tracking at Signalized Intersections in Urban Conditions." Sustainability 11, no. 21 (November 1, 2019): 6089. http://dx.doi.org/10.3390/su11216089.

Abstract:
The manner in which cyclists visually perceive elements of the urban environment plays an important role in bicycle crashes, which have been increasing in recent years. Yet, how visual information is processed by the user while riding a bike is still poorly analyzed by researchers. This study investigates cyclists’ eye gaze behavior at signalized intersections taking into account a set of gaze characteristics. Recording cyclist’s visual fixations by mobile-eye glasses in a real outdoor environment, a total of 13 field tests have been analyzed along a three-kilometer route in the urban center of Bologna, Italy. Findings reveal key differences in gaze behavior by experience level of the cyclist and type of intersection.
27

Karp, Emily, Andrew Scott, Katherine Martin, Hanan Zavala, Siva Chinnadurai, and Brianne Roby. "Developing an Eye-Tracking Protocol to Determine Children’s Visual Perception of Secondary Cleft Lip Deformity." Cleft Palate-Craniofacial Journal 57, no. 3 (August 12, 2019): 321–26. http://dx.doi.org/10.1177/1055665619868332.

Abstract:
Objective: To develop a protocol that will be used to measure children’s perception of secondary cleft lip deformity (SCLD) using objective eye-tracking technology. Design: Cross-sectional study. Data collection May and June of 2018. Setting: Single tertiary care pediatric hospital with a well-established cleft team. Participants: Participants were recruited from a general pediatric otolaryngology clinic. Sixty participants from 4 age groups (5-6, 10, 13, and 16 years) were enrolled on a voluntary basis. Intervention: Pediatric participants viewed images of children’s faces while wearing eye-tracking glasses. Ten images with unilateral SCLD and 2 control images with no facial scarring were viewed as gaze was assessed. Main Outcome and Measure: Successful gaze fixation was recorded across all age groups. Results: This article illustrates the types of data generated from glasses-based eye tracking in children. All children, regardless of age, spent more time with their gaze on a SCLD images (mean = 4.23 seconds; standard deviation [SD] = 1.41 seconds) compared to control images (mean = 3.97 seconds; SD = 1.42). Younger age groups spent less time looking at specific areas of interest in SCLD images. Conclusion: In this pilot study, we were able to successfully use eye-tracking technology in children to demonstrate gaze preference and a trend toward visual perception of SCLD changing with age. This protocol will allow for a future study, with larger and more diverse populations. Better understanding of how SCLD is perceived among children and adolescents has the potential to guide future interventions for SCLD and other facial deformities in pediatric patients.
28

Faiella, Filomena, Emiliana Mannese, Giulia Savarese, Antonina Plutino, and Maria Grazia Lombardi. "Eye-tracking glasses for improving teacher education: the e-Teach project." Research on Education and Media 11, no. 1 (June 1, 2019): 85–92. http://dx.doi.org/10.2478/rem-2019-0012.

Abstract:
Abstract This paper is about “Improvement of teaching techniques by eye tracking in technology enhanced classrooms” (e-Teach), an innovative project funded by the Erasmus Plus Programme (KA2 - Strategic Partnership in the field of School Education). The project aims to study teachers’ eye movements in real teaching situation using eye-tracking glasses and compares the teachers’ use of digital technologies between novices and experts teaching the same school subject. The purpose of this study was to provide indicators of skill gaps between novices and experts which can be addressed appropriately with highly targeted teacher education. The first part of the paper reviews recent developments in conceptual frameworks for digital competence and in digital competence descriptors. The second part describes the project status, the methods and its phases. In conclusion, the paper gives a brief overview of initial findings of ongoing research, focusing largely on the Italian experience, and development tasks for the next project phases. The initial findings suggest that teachers valued the benefits of using digital technologies in classrooms and recognized the necessity of professional development. They also provided specific insights for the purpose of developing an online course for teacher education in four languages: English, Turkish, Italian and Lithuanian.
29

Azimi Sotudeh, Mohammad Ali, Hasan Ziafat, and Said Ghafari. "Pupil Detection in Facial Images with Using Bag of Pixels." Advanced Materials Research 468-471 (February 2012): 2941–48. http://dx.doi.org/10.4028/www.scientific.net/amr.468-471.2941.

Abstract:
To detect and track eyes in images, distinctive features of the user's eyes are used. Generally, an eye detection and tracking system can be divided into four steps: face detection, eye region detection, pupil detection and eye tracking. To find the position of the pupil, the face region must first be separated from the rest of the image using a bag-of-pixels representation, so that the image background does not affect the subsequent steps. Horizontal projection is then used to separate a region containing the eyes and eyebrows, which reduces the computational complexity and allows factors such as a beard to be ignored. Finally, in the proposed method, points with the highest values are selected as eye candidates, and the eye region is reliably detected among these points. Colour entropy in the eye region is used to eliminate irrelevant candidates. From a pixel of the iris or pupil, the centre of the pupil can be obtained; a line intersection method can be used to find it. In the next step, eye tracking is performed. The proposed method achieves a correct eye detection rate of 97.3% on a test set gathered from different face images. Moreover, in the case of glasses the performance is still acceptable.
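The horizontal-projection step described here can be illustrated compactly: rows covering the eyes and eyebrows are darker than the surrounding skin, so the row-wise intensity profile of the face crop dips there. The sketch below is a minimal stand-in for that idea (synthetic image, assumed band width), not the authors' implementation:

```python
import numpy as np

def eye_band_by_horizontal_projection(face_gray):
    """Locate the eye/eyebrow band of a face crop via horizontal projection.

    face_gray: 2D uint8 array (grayscale face region, background removed).
    Returns (top, bottom) row indices of a band around the strongest dark dip
    in the upper half of the face.
    """
    profile = face_gray.astype(np.float32).mean(axis=1)   # one value per row
    upper = profile[: face_gray.shape[0] // 2]
    row = int(np.argmin(upper))                           # darkest upper row
    half_band = max(2, face_gray.shape[0] // 10)
    return max(0, row - half_band), row + half_band

# Hypothetical face crop with a dark horizontal stripe standing in for the eyes.
face = np.full((100, 80), 200, dtype=np.uint8)
face[30:38, :] = 60
print(eye_band_by_horizontal_projection(face))   # band containing rows 30-37
```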
30

Kortman, Brenton, and Kate Nicholls. "No. 14 Assessing for Unilateral Spatial Neglect Using Eye Tracking Glasses." PM&R 6, no. 8 (August 2014): S84. http://dx.doi.org/10.1016/j.pmrj.2014.08.353.

31

Gomolka, Zbigniew, Damian Kordos, and Ewa Zeslawska. "The Application of Flexible Areas of Interest to Pilot Mobile Eye Tracking." Sensors 20, no. 4 (February 12, 2020): 986. http://dx.doi.org/10.3390/s20040986.

Abstract:
Recent progress in the development of mobile Eye Tracking (ET) systems shows that there is a demand for modern flexible solutions that would allow for dynamic tracking of objects in the video stream. The paper describes a newly developed tool for work with ET glasses, and its advantages are outlined with the example of a pilot study. A flight task is performed on the FNTP II MCC simulator, and the pilots are equipped with the Mobile Tobii Glasses. The proposed Smart Trainer tool performs dynamic object tracking in a registered video stream, allowing for an interactive definition of Area of Interest (AOI) with blurred contours for the individual cockpit instruments and for the construction of corresponding histograms of pilot attention. The studies are carried out on a group of experienced pilots with a professional pilot CPL(A) license with instrumental flight (Instrument Rating (IR)) certification and a group of pilots without instrumental training. The experimental section shows the differences in the perception of the flight process between two distinct groups of pilots with varying levels in flight training for the ATPL(A) line pilot license. The proposed Smart Trainer tool might be exploited in order to assess and improve the process of training operators of advanced systems with human machine interfaces.
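Dynamic AOIs of the kind described here ultimately reduce to a per-frame hit test between the gaze point and a tracked instrument outline. The sketch below illustrates that reduction with simple polygon containment tests; the AOI names, coordinates and data layout are illustrative assumptions, not the Smart Trainer implementation:

```python
from matplotlib.path import Path

def aoi_hit_counts(gaze_points, aoi_polygons_per_frame):
    """Count gaze samples falling inside per-frame (dynamic) AOI polygons.

    gaze_points: list of (x, y) gaze positions, one per video frame.
    aoi_polygons_per_frame: list of {aoi_name: [(x, y), ...]} dicts giving the
    instrument outlines tracked in that frame (they move as the pilot moves).
    """
    counts = {}
    for gaze, polygons in zip(gaze_points, aoi_polygons_per_frame):
        for name, vertices in polygons.items():
            if Path(vertices).contains_point(gaze):
                counts[name] = counts.get(name, 0) + 1
    return counts

# Two illustrative frames in which the "attitude_indicator" AOI drifts slightly.
frames = [
    {"attitude_indicator": [(100, 100), (200, 100), (200, 180), (100, 180)]},
    {"attitude_indicator": [(105, 102), (205, 102), (205, 182), (105, 182)]},
]
print(aoi_hit_counts([(150, 140), (300, 300)], frames))  # {'attitude_indicator': 1}
```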
32

OZAWA, Masataka, Akira OIKAWA, and Norihisa MIKI. "2P1-N04 See-Through-Type Eye-Gaze Tracking System of Eye Glasses (VR and Interface)." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2012 (2012): _2P1-N04_1-_2P1-N04_2. http://dx.doi.org/10.1299/jsmermd.2012._2p1-n04_1.

33

Kang, Dongwoo, and Hyun Sung Chang. "Low-Complexity Pupil Tracking for Sunglasses-Wearing Faces for Glasses-Free 3D HUDs." Applied Sciences 11, no. 10 (May 11, 2021): 4366. http://dx.doi.org/10.3390/app11104366.

Abstract:
This study proposes a pupil-tracking method applicable to drivers both with and without sunglasses on, which has greater compatibility with augmented reality (AR) three-dimensional (3D) head-up displays (HUDs). Performing real-time pupil localization and tracking is complicated by drivers wearing facial accessories such as masks, caps, or sunglasses. The proposed method fulfills two key requirements: low complexity and algorithm performance. Our system assesses both bare and sunglasses-wearing faces by first classifying images according to these modes and then assigning the appropriate eye tracker. For bare faces with unobstructed eyes, we applied our previous regression-algorithm-based method that uses scale-invariant feature transform features. For eyes occluded by sunglasses, we propose an eye position estimation method: our eye tracker uses nonoccluded face area tracking and a supervised regression-based pupil position estimation method to locate pupil centers. Experiments showed that the proposed method achieved high accuracy and speed, with a precision error of <10 mm in <5 ms for bare and sunglasses-wearing faces for both a 2.5 GHz CPU and a commercial 2.0 GHz CPU vehicle-embedded system. Coupled with its performance, the low CPU consumption (10%) demonstrated by the proposed algorithm highlights its promise for implementation in AR 3D HUD systems.
34

Kortman, Brenton, and Kate Nicholls. "Assessing for Unilateral Spatial Neglect Using Eye-Tracking Glasses: A Feasibility Study." Occupational Therapy In Health Care 30, no. 4 (August 5, 2016): 344–55. http://dx.doi.org/10.1080/07380577.2016.1208858.

35

Law, Brenda Hiu Yan, Po-Yin Cheung, Sylvia van Os, Caroline Fray, and Georg M. Schmölzer. "Effect of monitor positioning on visual attention and situation awareness during neonatal resuscitation: a randomised simulation study." Archives of Disease in Childhood - Fetal and Neonatal Edition 105, no. 3 (August 2, 2019): 285–91. http://dx.doi.org/10.1136/archdischild-2019-316992.

Abstract:
Objectives: To compare situation awareness (SA), visual attention (VA) and protocol adherence in simulated neonatal resuscitations using two different monitor positions. Design: Randomised controlled simulation study. Settings: Simulation lab at the Royal Alexandra Hospital, Edmonton, Canada. Participants: Healthcare providers (HCPs) with Neonatal Resuscitation Program (NRP) certification within the last 2 years and trained in neonatal endotracheal intubations. Intervention: HCPs were randomised to either central (eye-level on the radiant warmer) or peripheral (above eye-level, wall-mounted) monitor positions. Each led a complex resuscitation with a high-fidelity mannequin and a standardised assistant. To measure SA, the situation awareness global assessment tool (SAGAT) was used, where simulations were paused at three predetermined points, with five questions asked each pause. Videos were analysed for SAGAT and adherence to a NRP checklist. Eye-tracking glasses recorded participants' VA. Main outcome measure: The main outcome was SA as measured by composite SAGAT score. Secondary outcomes included VA and adherence to the NRP checklist. Results: Thirty simulations were performed; 29 were completed per protocol and analysed. Twenty-two eye-tracking recordings were of sufficient quality and analysed. Median composite SAGAT was 11.5/15 central versus 11/15 peripheral, p=0.56. Checklist scores were 46/50 central versus 46/50 peripheral, p=0.75. Most VA was directed at the mannequin (30.6% central vs 34.1% peripheral, p=0.76) and the monitor (28.7% central vs 20.5% peripheral, p=0.06). Conclusions: Simulation, SAGAT and eye-tracking can be used to evaluate human factors of neonatal resuscitation. During simulated neonatal resuscitation, monitor position did not affect SA, VA or protocol adherence.
36

Popelka, Stanislav, and Jiří Komínek. "Visual Inspection of Geological Maps: an eye-tracking Study." Abstracts of the ICA 2 (October 8, 2020): 1. http://dx.doi.org/10.5194/ica-abs-2-22-2020.

Abstract:
The paper describes the analysis of a visual inspection of paper geological maps by three groups of participants: geologists (GEOL), geographers (GEO) and geoinformaticians (GIS). The aim of the study was to identify the differences in how different groups of participants visually inspected geological maps. Geological maps show the distribution of different types of rock at the earth's surface and are a fundamental tool for geologists. Geology as a distinct discipline is relatively young, and its origins date to the eighteenth century. In recent years, this otherwise relatively stable field has begun to incorporate new technology into its tools and methods. No study as yet has been published to assess geological maps using eye-tracking or cognitive cartography. Eye-tracking glasses SMI Eye Tracking Glasses 2 with a recording frequency of 60 Hz were used to record the eye-movements of participants during the experiment. Two maps at a scale of 1 : 25 000 produced by the Czech Geological Survey were used for the eye-tracking experiment. In the first part of the experiment, free viewing was analyzed. In the second part, participants solved six tasks with a map. The tasks were selected based on consultation with employees of the Czech Geological Survey. In the free viewing section, noticeable differences between groups were observed. The free viewing section revealed that the geoinformaticians group concentrated much less on the map itself and spent more time on the surrounding elements. The geographers and geologists mainly focused on the map field. The second part of the experiment comprised six tasks. The first task was the simplest and involved finding the coordinate system used in the map. The task caused no problems in any group, and completion times were balanced. The GIS group demonstrated the greatest experience in reading maps and was also the quickest to solve the task. The second task was to identify the geological units depicted in the map. The correct answer could be obtained either from the legend or by using the scheme of geological units. No statistically significant differences in time between the groups were recorded. In the third task, respondents were required to identify and mark the boundary between two geological units found in the previous task. The group of geologists was quickest to solve this task, and all other respondents found a solution with no great difficulty. The fourth task was to identify the predominant rock and determine its type according to the legend. The results suggest that ten respondents from GEO and GIS groups who concentrated on the lithostratigraphic scheme did not know where to look for the correct answer. In the fifth task, participants were required to mark an area with multiple landslides. The aim in this task was to find the landslide symbol in the legend and then identify the landslide area on the map. The differences between groups were most apparent in this task. The geologists were significantly quicker in finding the symbol in the legend. The other groups needed a much longer time to identify the symbol in the legend. In the final task, respondents were instructed to identify the predominant rock in the area with the highest amplitude of geomagnetic anomalies. To solve this task, using the diagram in the section at the bottom left was necessary. The GEOL group spent the least amount of time completing this task, indicating the respondents' knowledge.
The experiment and subsequent interviews revealed a different color reading strategy. When identifying a rock, the GEOL group compared colors mainly for quick orientation. However, the decisive factor for identification was the index, which was given for the rock in the map and legend. To conclude, the geologists group was quickest in solving the tasks and recorded the least wrong answers. The GIS and GEO groups achieved similar results in the experiment.
APA, Harvard, Vancouver, ISO, and other styles
37

Volkov, A. K., and V. V. Ionov. "THE IMPROVEMENT OF PROFESSIONAL TRAINING ORGANIZATION OF THE X-RAY SCREENING SYSTEMS OPERATORS BY USING THE EYE MOVEMENTS REGISTRATION SYSTEM AND METHODS OF CLUSTER AND DISCRIMINANT ANALYSIS." Civil Aviation High TECHNOLOGIES 21, no. 3 (July 3, 2018): 45–36. http://dx.doi.org/10.26467/2079-0619-2018-21-3-45-36.

Full text
Abstract:
The professional training of X-ray screening system operators is based on the CBT (computer-based training) principle, which uses adaptive training algorithms. In existing computer simulators, these algorithms include feedback mechanisms based on performance indicators such as the frequency of detecting dangerous objects, the frequency of false alarms, and detection time. Further improvement of the effectiveness of operators' simulator training is associated with the integration of psychophysiological mechanisms that monitor their functional state. Based on an analysis of the particularities of X-ray screening operators' professional training, which is associated with the formation of competence in the visual search for dangerous objects, the most promising method is eye-tracking technology. Studies of eye-movement characteristics during the solution of professional tasks in training are actively being developed in various areas abroad; in contrast, there are no domestic studies of visual-search peculiarities. This research considers the use of eye-tracking technology in the training of X-ray screening system operators. As a result of an experimental study using the mobile eye tracker SensoMotoric Instruments Eye Tracking Glasses 2.0, statistical data on the eye-movement parameters of two groups of subjects with different levels of training were obtained. The application of cluster and discriminant analysis methods made it possible to identify general classes of these parameters and to obtain discriminant functions for each group under examination. The theoretical significance of studying the operators' eye movements lies in identifying the patterns of visual search for prohibited items. The practical importance of implementing eye-tracking technology and statistical analysis methods lies in increasing the reliability of assessing the visual-search competence formed in X-ray screening system operators, as well as in developing a potential system for monitoring operators' state and assessing their visual fatigue.
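As an illustration of the kind of analysis described in this abstract (not the authors' actual pipeline), the minimal Python sketch below clusters hypothetical eye-movement parameters with k-means and then fits a linear discriminant function separating two training-level groups. The feature names and the synthetic data are assumptions made for illustration only.

```python
# Minimal sketch: clustering and discriminant analysis of eye-movement
# parameters for two operator groups (synthetic data, illustrative only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical features per trial: [mean fixation duration (ms),
# fixation count, mean saccade amplitude (deg)]
novices = rng.normal([320, 45, 4.5], [40, 8, 0.8], size=(30, 3))
experts = rng.normal([260, 30, 6.0], [35, 6, 0.9], size=(30, 3))
X = np.vstack([novices, experts])
y = np.array([0] * 30 + [1] * 30)  # 0 = novice, 1 = expert

# Unsupervised step: identify general classes of eye-movement parameters.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Supervised step: discriminant function separating the two groups.
lda = LinearDiscriminantAnalysis().fit(X, y)
print("Cluster sizes:", np.bincount(clusters))
print("LDA coefficients:", lda.coef_)
print("Training accuracy:", lda.score(X, y))
```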
APA, Harvard, Vancouver, ISO, and other styles
38

McNaughten, Ben, Caroline Hart, Stephen Gallagher, Carol Junk, Patricia Coulter, Andrew Thompson, and Thomas Bourke. "Clinicians’ gaze behaviour in simulated paediatric emergencies." Archives of Disease in Childhood 103, no. 12 (March 7, 2018): 1146–49. http://dx.doi.org/10.1136/archdischild-2017-314119.

Full text
Abstract:
Aim: Differences in the gaze behaviour of experts and novices are described in aviation and surgery. This study sought to describe the gaze behaviour of clinicians from different training backgrounds during a simulated paediatric emergency. Methods: Clinicians from four clinical areas undertook a simulated emergency. Participants wore SMI (SensoMotoric Instruments) eye-tracking glasses. We measured the fixation count and dwell time on predefined areas of interest and the time taken to key clinical interventions. Results: Paediatric intensive care unit (PICU) consultants performed best and focused longer on the chest and airway. Paediatric consultants and trainees spent longer looking at the defibrillator and algorithm (51 180 ms and 50 551 ms, respectively) than the PICU and paediatric emergency medicine consultants. Conclusions: This study is the first to describe differences in gaze behaviour between experts and novices in a resuscitation. These differences mirror those described in aviation and surgery. Further research is needed to evaluate the potential use of eye tracking as an educational tool.
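To make the reported measures concrete, here is a minimal, hypothetical sketch (not the study's actual analysis) of how fixation count and dwell time per predefined area of interest (AOI) can be computed from a table of fixations; the column names and example data are assumptions.

```python
# Sketch: fixation count and dwell time per area of interest (AOI)
# from a fixation table (synthetic example data).
import pandas as pd

fixations = pd.DataFrame({
    "aoi": ["chest", "airway", "defibrillator", "chest", "algorithm"],
    "duration_ms": [420, 310, 880, 510, 640],
})

# Fixation count = number of fixations per AOI; dwell time = summed durations.
aoi_stats = fixations.groupby("aoi")["duration_ms"].agg(
    fixation_count="count",
    dwell_time_ms="sum",
)
print(aoi_stats)
```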
APA, Harvard, Vancouver, ISO, and other styles
39

Madleňák, Radovan, Jaroslav Mašek, and Lucia Madleňáková. "An experimental analysis of the driver’s attention during train driving." Open Engineering 10, no. 1 (March 10, 2020): 64–73. http://dx.doi.org/10.1515/eng-2020-0011.

Full text
Abstract:
The article deals with the experimental monitoring of the driver's attention during train operation. SMI eye-tracking technology and eye-tracking glasses were used to measure the train driver's attention. We performed this unique experiment on the Slovak railway (ŽSR) line no. 120 Bratislava - Žilina, in the section Žilina - Púchov. The measurement took place in the spring of 2017, and the passenger train was operated by a ZSSK electric unit of the 671 series. The article analyses in detail the monitoring of the driver's attention during train operation. We analysed two typical processes in the train driver's work: driving along an open section (without stopping) and driving through a train station (with stopping). The analysis can lead to the identification of critical points on the line and to a better understanding of how the driver works, and can contribute to increasing railway safety. The unique measurement procedure and the technology used did not affect the safety of the train operation.
APA, Harvard, Vancouver, ISO, and other styles
40

Gomes Paiva, Ana Flavia, Gregorio Sorrentino, Blaise Bignami, Claire Kemlin, Sofia Gueorguieva, Pascale Pradat-Diehl, Philippe Thoumie, and Eleonore Bayen. "Feasibility of assessing post-stroke neglect with eye-tracking glasses during a locomotion task." Annals of Physical and Rehabilitation Medicine 64, no. 5 (September 2021): 101436. http://dx.doi.org/10.1016/j.rehab.2020.09.004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Drew, Trafton, Lauren H. Williams, Booth Aldred, Marta E. Heilbrun, and Satoshi Minoshima. "Quantifying the costs of interruption during diagnostic radiology interpretation using mobile eye-tracking glasses." Journal of Medical Imaging 5, no. 03 (March 2, 2018): 1. http://dx.doi.org/10.1117/1.jmi.5.3.031406.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Schindler, Maike, and Achim J. Lilienthal. "Students’ collaborative creative process and its phases in mathematics: an explorative study using dual eye tracking and stimulated recall interviews." ZDM – Mathematics Education 54, no. 1 (January 22, 2022): 163–78. http://dx.doi.org/10.1007/s11858-022-01327-9.

Full text
Abstract:
In the age of artificial intelligence, where standard problems are increasingly processed by computers, creative problem solving, that is, the ability to think outside the box, is in high demand. Collaboration is also increasingly significant, which makes creative collaboration an important twenty-first-century skill. In the research described in this paper, we investigated students' collaborative creative process in mathematics and explored this process in its phases. Since little is known about the collaborative creative process, we conducted an explorative case study in which two students jointly worked on a multiple solution task. For in-depth insight into the dyad's collaborative creative process, we used a novel research design in mathematics education, DUET SRI: both students wore eye-tracking glasses during their collaborative work for dual eye tracking (DUET), and each participated in a subsequent stimulated recall interview (SRI) in which eye-tracking videos from their joint work served as the stimulus. Using an inductive data analysis method, we then identified the phases of the students' collaborative creative process. We found that the collaborative creative process and its phases had similarities to those previously found for solo creative work, yet the process was more complex and volatile and involved different branches. Based on our findings, we present a tentative model of the dyad's collaborative process in its phases, which can help researchers and educators trace and foster the collaborative creative process more effectively.
APA, Harvard, Vancouver, ISO, and other styles
43

Sano, Mina. "Quantitative Analysis of Eye Movements during Singing of Early Childhood Children." International Journal of Educational Studies 4, no. 3 (December 9, 2021): 95–112. http://dx.doi.org/10.53935/2641-533x.v4i3.161.

Full text
Abstract:
Early childhood children tend to make musical expressions while watching other children or the teacher's piano accompaniment. However, it has not yet been examined how eye movements are affected by music. To provide an optimized procedure for capturing the characteristics of eye movements that reflect music, statistical techniques were used to evaluate effective parameters. In this study, eye trackers (Tobii Glasses 2) were used to acquire eye-movement data during the musical expression of early childhood children and to conduct a quantitative analysis. Three-, four-, and five-year-old children in two early childhood facilities (n = 58) participated in eye tracking while singing multiple songs in major and minor keys. This paper focuses on the saccade (rapid eye movement) and gaze behaviors of early childhood children and mainly reports a three-way analysis of variance (ANOVA) on the acquired data (age × facility × tonality). As a result, it was found that the number of saccades and the total saccade distance showed statistically significant differences between means with respect to the tonality (major/minor) of the songs and to the childcare form.
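For readers unfamiliar with the design, the following minimal sketch (not the author's actual analysis) shows how a three-way ANOVA of a saccade measure over age, facility, and tonality could be run in Python with statsmodels; the column names and the synthetic data are assumptions.

```python
# Sketch: three-way ANOVA (age x facility x tonality) on a saccade measure,
# using synthetic data for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 240
df = pd.DataFrame({
    "age": rng.choice([3, 4, 5], n),
    "facility": rng.choice(["A", "B"], n),
    "tonality": rng.choice(["major", "minor"], n),
})
# Synthetic dependent variable: number of saccades per song.
df["saccade_count"] = (
    20 + 2 * df["age"]
    + np.where(df["tonality"] == "minor", 3, 0)
    + rng.normal(0, 4, n)
)

model = smf.ols(
    "saccade_count ~ C(age) * C(facility) * C(tonality)", data=df
).fit()
print(anova_lm(model, typ=2))  # main effects and interactions
```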
APA, Harvard, Vancouver, ISO, and other styles
44

Sano, Mina. "Quantitative Analysis of Eye Movements during Singing of Early Childhood Children." Research Journal of Education, no. 73 (September 2, 2021): 125–40. http://dx.doi.org/10.32861/rje.73.125.140.

Full text
Abstract:
Early childhood children tend to make musical expressions while watching other children or the teacher's piano accompaniment. However, it has not yet been examined how eye movements are affected by music. To provide an optimized procedure for capturing the characteristics of eye movements that reflect music, statistical techniques were used to evaluate effective parameters. In this study, eye trackers (Tobii Glasses 2) were used to acquire eye-movement data during the musical expression of early childhood children and to conduct a quantitative analysis. Three-, four-, and five-year-old children in two early childhood facilities (n = 58) participated in eye tracking while singing multiple songs in major and minor keys. This paper focuses on the saccade (rapid eye movement) and gaze behaviors of early childhood children and mainly reports a three-way analysis of variance (ANOVA) on the acquired data (age × facility × tonality). As a result, it was found that the number of saccades and the total saccade distance showed statistically significant differences between means with respect to the tonality (major/minor) of the songs and to the childcare form.
APA, Harvard, Vancouver, ISO, and other styles
45

Wertli, Jason, Andreas Schötzau, S. Trauzettel-Klosinski, and Anja Palmowski-Wolfe. "Feasibility of Eye Movement Recordings with the SMI Tracking Bar in 10- to 11-Year-Old Children Performing a Reading Task." Klinische Monatsblätter für Augenheilkunde 237, no. 04 (April 2020): 510–16. http://dx.doi.org/10.1055/a-1101-9204.

Full text
Abstract:
Introduction: Eye movements during reading can be impaired in amblyopia, developmental dyslexia, reduced visual acuity, or visual field defects. To detect pathology, normative values are important for comparison. In healthy children, there is sparse data on eye movements during reading. Therefore, the aim of this study was, as a first step, to explore the feasibility of applying the SMI RED eye tracker bar to record eye movements in 10- and 11-year-old children while reading a text. Materials and Methods: Thirty-three normally sighted children (19 aged 10 years, 14 aged 11 years) attending a primary school in Switzerland participated in our study. Visual acuity, the Lang test, and the cover test were performed as a screening for ophthalmologic pathology that might influence the results. Eye movements were recorded with the SMI RED eye tracker bar while the child read aloud two texts from the International Reading Speed Test (IReST), presented on a laptop. Both texts were in German, with an equal level of difficulty, and were presented in a randomized order. Reading speed (words/minute), number of saccades, number of fixations, and reading errors (mistakes while reading) were evaluated. Results: Screening did not reveal pathology other than refractive errors, and children had fully corrected visual acuity. Eye movements could be obtained in all but six children, in whom reflections from the glasses worn prevented good pupil recording by the tracker. Younger children performed more saccades per word, with a mean of 1.41 (SD 0.39) at 10 years of age versus 1.10 (SD 0.21) at 11 years of age. The number of fixations per word was also higher in younger children (mean: 1.63 [SD 0.37]) than in 11-year-old children (mean: 1.32 [SD 0.33]). Ten-year-old children seem to analyze a text in smaller units than 11-year-olds. Thus, 10-year-old children took more time to complete the reading task than the 11-year-olds (mean: 88.8 s [SD 24.1] versus 84.4 s [SD 15.1]). In addition, 10-year-old children made more reading errors than 11-year-olds (mean: 4.47 [SD 2.95] versus 2.28 [SD 1.72]). Conclusion: It is feasible to record eye movements in children aged 10 to 11, albeit more difficult when glasses are worn. As parameters change with age, further data is needed for a representative evaluation of eye movements during reading in children of different age groups. The information gained may help in recognizing reading difficulties and in monitoring treatment effects.
APA, Harvard, Vancouver, ISO, and other styles
46

Berni, A., S. Altavilla, L. Ruiz-Pastor, C. Nezzi, and Y. Borgianni. "An Eye-Tracking Study to Identify the Most Observed Features in a Physical Prototype of a Tiny House." Proceedings of the Design Society 2 (May 2022): 841–50. http://dx.doi.org/10.1017/pds.2022.86.

Full text
Abstract:
This exploratory work aims to understand which elements of a building most attract visitors' attention. An experiment was conducted in which participants visited a prototype tiny house while wearing eye-tracking glasses. Gazed elements of the prototype were identified, and the corresponding dwell times were used as variables. The limited dwell times on structural elements show that they can easily be overshadowed by other features present in the building. This poses a design problem when the novelty and quality of a new product, notably a building, reside in the materials used.
APA, Harvard, Vancouver, ISO, and other styles
47

Song, Li, Yechan Lim, Pengyu Chang, Yingnan Guo, Mingyu Zhang, Xi Wang, Xueying Yu, Mark R. Lehto, and Hua Cai. "Ecolabel's role in informing sustainable consumption: A naturalistic decision making study using eye tracking glasses." Journal of Cleaner Production 218 (May 2019): 685–95. http://dx.doi.org/10.1016/j.jclepro.2019.01.283.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Albuquerque, Maicon R., Leandro F. Malloy-Diniz, Marco A. Romano-Silva, Jonas J. de Paula, Maila de Castro Neves, and Guilherme M. Lage. "Can Eye Fixation During the Grooved Pegboard Test Distinguish Between Planning and Online Correction?" Perceptual and Motor Skills 124, no. 2 (December 27, 2016): 380–92. http://dx.doi.org/10.1177/0031512516685000.

Full text
Abstract:
The Grooved Pegboard Test, in its standard use, has well-documented utility. However, a revised methodology needs further study, leading us to investigate whether the duration of eye fixation could predict performance on different task conditions of the Grooved Pegboard Test (placing and removing pegs) with the preferred and nonpreferred hands. Fifty-two right-handed undergraduate students (33 male and 19 female), with a mean age of 22.22 (±3.57) years, performed the Grooved Pegboard Test. SensoMotoric Instruments eye-tracking glasses with a binocular temporal resolution of 30 Hz were used to measure eye fixation. The videos were recorded in iView software, and the data were analyzed using BeGaze software. The number and duration of eye fixations differed statistically between the preferred and nonpreferred hands and also differed across tasks. Simple linear regression showed that eye fixation duration predicted movement time in the place task (preferred hand: R² = 31%; nonpreferred hand: R² = 41%) and in the remove task (preferred hand: R² = 11%; nonpreferred hand: R² = 25%). Thus, the duration of eye fixation during the Grooved Pegboard Test differentially predicted performance with each hand and across the different subtests of this instrument.
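As an illustration of the reported analysis (with hypothetical data, not the study's), the sketch below fits a simple linear regression of movement time on mean fixation duration and reports R²; the variable names and values are assumptions.

```python
# Sketch: simple linear regression predicting movement time from
# mean eye-fixation duration (synthetic data, illustrative only).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
fixation_duration = rng.normal(350, 60, size=(52, 1))  # ms per fixation
movement_time = 30 + 0.05 * fixation_duration[:, 0] + rng.normal(0, 5, 52)  # s

model = LinearRegression().fit(fixation_duration, movement_time)
r_squared = model.score(fixation_duration, movement_time)  # coefficient of determination
print(f"slope = {model.coef_[0]:.3f}, R^2 = {r_squared:.2f}")
```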
APA, Harvard, Vancouver, ISO, and other styles
49

Grima-Murcia, M. D., Francisco Sanchez-Ferrer, Jose Manuel Ramos-Rincón, and Eduardo Fernández. "Use of Eye-Tracking Technology by Medical Students Taking the Objective Structured Clinical Examination: Descriptive Study." Journal of Medical Internet Research 22, no. 8 (August 21, 2020): e17719. http://dx.doi.org/10.2196/17719.

Full text
Abstract:
Background: The objective structured clinical examination (OSCE) is a test used throughout Spain to evaluate the clinical competencies, decision making, problem solving, and other skills of sixth-year medical students. Objective: The main goal of this study is to explore the possible applications and utility of portable eye-tracking systems in the setting of the OSCE, particularly for questions associated with attention and engagement. Methods: We used a portable Tobii Glasses 2 eye tracker, which allows real-time monitoring of where students are looking and records voice and ambient sounds. We then performed qualitative and quantitative analyses of the fields of vision, the gaze points attracting attention, and the visual itinerary. Results: Eye-tracking technology was used in the OSCE with no major issues. The portable system was of greatest value in the patient-simulator and mannequin stations, where interaction with the simulated patient or with areas of interest on the mannequin can be quantified. The technology also proved useful for better identifying areas of interest in the medical images provided. Conclusions: Portable eye trackers offer the opportunity to improve the objective evaluation of candidates and examiners' self-evaluation of the stations used, as well as of medical simulations. We suggest that this technology has enough resolution to identify where a student is looking and could be useful for developing new approaches to evaluating specific aspects of clinical competencies.
APA, Harvard, Vancouver, ISO, and other styles
50

Hosp, Benedikt W., Florian Schultz, Oliver Höner, and Enkelejda Kasneci. "Soccer goalkeeper expertise identification based on eye movements." PLOS ONE 16, no. 5 (May 19, 2021): e0251070. http://dx.doi.org/10.1371/journal.pone.0251070.

Full text
Abstract:
By focusing on high experimental control and realistic presentation, the latest research in the expertise assessment of soccer players demonstrates the importance of perceptual skills, especially in decision making. Our work captured omnidirectional in-field scenes and displayed them through virtual reality glasses to 12 expert players (selected by the DFB), 10 regional-league intermediate players, and 13 novice soccer goalkeepers in order to assess the perceptual skills of athletes in an optimized manner. All scenes were shown from the perspective of the same natural goalkeeper and ended after the return pass to that goalkeeper. Based on the gaze behavior of each player, we classified their expertise with common machine learning techniques. Our results show that eye movements contain highly informative features and thus enable the classification of goalkeepers into three stages of expertise, namely elite youth player, regional-league player, and novice, at a high accuracy of 78.2%. This research underscores the importance of eye tracking and machine learning in perceptual expertise research and paves the way for perceptual-cognitive diagnosis as well as future training systems.
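To illustrate the kind of gaze-based classification reported here (not the authors' actual model or feature set), the following sketch trains a standard classifier on hypothetical gaze features for three expertise groups and estimates accuracy with cross-validation; all feature names and data are assumptions.

```python
# Sketch: classifying goalkeeper expertise (novice / regional / elite)
# from gaze features, using synthetic data for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# Hypothetical group means for: mean fixation duration (ms),
# saccade rate (per s), number of AOIs visited per scene.
groups = {"novice": [300, 1.5, 8], "regional": [260, 2.2, 6], "elite": [230, 3.0, 5]}

X, y = [], []
for label, (fix_dur, sacc_rate, n_aoi) in enumerate(groups.values()):
    X.append(rng.normal([fix_dur, sacc_rate, n_aoi], [25, 0.4, 1.5], size=(40, 3)))
    y.append(np.full(40, label))
X, y = np.vstack(X), np.concatenate(y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```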
APA, Harvard, Vancouver, ISO, and other styles