Academic literature on the topic 'EEG, eye-tracking, Human Computer Interface'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'EEG, eye-tracking, Human Computer Interface.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "EEG, eye-tracking, Human Computer Interface"

1. Dekihara, Hiroyuki, and Tatsuya Iwaki. "Development of human computer interface based on eye movement using Emotiv EEG headset." International Journal of Psychophysiology 94, no. 2 (November 2014): 188. http://dx.doi.org/10.1016/j.ijpsycho.2014.08.787.

2. Sotnikov, P. I. "Feature Construction Methods for the Electroencephalogram Signal Analysis in Hybrid “Eye-Brain-Computer” Interface." Mathematics and Mathematical Modeling, no. 2 (May 21, 2018): 33–52. http://dx.doi.org/10.24108/mathm.0218.0000118.

Abstract:
The hybrid “eye-brain-computer” interface is a new approach to human-machine interaction. It allows the user to select an object of interest on a screen by tracking the user’s gaze direction. At the same time, the user’s intent to give a command is determined by registering and decoding brain activity. The interface operation is based on the fact that control gaze fixations can be distinguished from spontaneous fixations using the electroencephalogram (EEG) signal. The article discusses the recognition of EEG patterns that correspond to spontaneous and control gaze fixations. To improve the classification accuracy, we suggest using relatively new feature construction methods for time series analysis. These methods include a selection of optimal frequency bands of the multivariate EEG signal and a modified method of shapelets. The first method constructs the optimal feature space using prior information on the difference in frequency components of the multivariate signal between classes. The second method uses a genetic algorithm to select those fragments of the multivariate time series that best reflect the properties of one or more classes of such time series. Calculating the distances between a time series and a set of the k best shapelets then provides its feature description. The article consists of five sections. The first provides a mathematical formulation of the multivariate time-series classification problem. The second section gives a formal description of the proposed feature construction methods. The third section describes the test data, which include EEG records from six users of the hybrid “eye-brain-computer” interface. In the fourth section, we evaluate the efficiency of the proposed methods in comparison with other known feature extraction techniques, which include: 1) calculation of the average EEG amplitude values in overlapping windows; 2) estimation of the power spectral density in specified frequency bands; 3) selection of the most informative features using a genetic algorithm. In the fifth section, we conduct a statistical analysis of the results obtained. It is shown that the feature construction method based on the selection of optimal frequency bands of the EEG signal significantly outperforms the other techniques considered and opens up the possibility of reducing the number of false positives of the hybrid interface.
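To illustrate the shapelet-based feature description the abstract refers to, here is a minimal Python sketch. It is not the authors' implementation: the candidate shapelets below are random stand-ins, whereas the paper selects them from multivariate EEG with a genetic algorithm.

```python
import numpy as np

def shapelet_distance(series, shapelet):
    """Minimum Euclidean distance between a shapelet and any
    equal-length window of a 1-D series."""
    m = len(shapelet)
    windows = np.lib.stride_tricks.sliding_window_view(series, m)
    return np.sqrt(((windows - shapelet) ** 2).sum(axis=1)).min()

def shapelet_features(series, shapelets):
    """Feature vector: distances from the series to each of the k top shapelets."""
    return np.array([shapelet_distance(series, s) for s in shapelets])

# Example: describe one EEG channel epoch by its distances to 3 candidate shapelets.
rng = np.random.default_rng(0)
eeg_channel = rng.standard_normal(500)                    # one fixation epoch
candidates = [rng.standard_normal(50) for _ in range(3)]  # k = 3 "top" shapelets
print(shapelet_features(eeg_channel, candidates))
```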
3. Narejo, Sanam, Eros Pasero, and Farzana Kulsoom. "EEG Based Eye State Classification using Deep Belief Network and Stacked AutoEncoder." International Journal of Electrical and Computer Engineering (IJECE) 6, no. 6 (December 1, 2016): 3131. http://dx.doi.org/10.11591/ijece.v6i6.12967.

Abstract:
A Brain-Computer Interface (BCI) provides an alternative communication interface between the human brain and a computer. The electroencephalogram (EEG) signals are acquired and processed, and machine learning algorithms are then applied to extract useful information. During EEG acquisition, artifacts are induced by involuntary eye movements or eye blinks, with adverse effects on system performance. The aim of this research is to predict eye states from EEG signals using deep learning architectures and to present improved classifier models. Recent studies show that deep neural networks are state-of-the-art machine learning approaches. The current work therefore presents implementations of a Deep Belief Network (DBN) and Stacked AutoEncoders (SAE) as classifiers with encouraging accuracy. One of the designed SAE models outperforms the DBN and the models presented in existing research, with an error rate of 1.1% on the test set (98.9% accuracy). The findings of this study may contribute towards state-of-the-art performance on the problem of EEG-based eye state classification.
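A minimal sketch of a stacked-autoencoder classifier of the kind the abstract describes, using greedy layer-wise pretraining in tf.keras. The layer sizes, epochs, and toy data are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy stand-ins for the EEG eye-state data: 14 channel features, binary label.
rng = np.random.default_rng(0)
X = rng.random((1000, 14)).astype("float32")
y = rng.integers(0, 2, size=1000)

def train_autoencoder(inputs, n_hidden):
    """Train one autoencoder on its inputs and return the encoder half."""
    dim = inputs.shape[1]
    inp = keras.Input(shape=(dim,))
    code = layers.Dense(n_hidden, activation="sigmoid")(inp)
    out = layers.Dense(dim)(code)
    ae = keras.Model(inp, out)
    ae.compile(optimizer="adam", loss="mse")
    ae.fit(inputs, inputs, epochs=10, batch_size=32, verbose=0)
    return keras.Model(inp, code)

# Greedy layer-wise pretraining: each encoder learns to reconstruct
# the codes produced by the previous one.
enc1 = train_autoencoder(X, 10)
enc2 = train_autoencoder(enc1.predict(X, verbose=0), 6)

# Stack the pretrained encoders, add a softmax read-out, and fine-tune end to end.
clf = keras.Sequential([enc1, enc2, layers.Dense(2, activation="softmax")])
clf.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
clf.fit(X, y, epochs=10, batch_size=32, verbose=0)
```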
4. Antoniou, Evangelos, Pavlos Bozios, Vasileios Christou, Katerina D. Tzimourta, Konstantinos Kalafatakis, Markos G. Tsipouras, Nikolaos Giannakeas, and Alexandros T. Tzallas. "EEG-Based Eye Movement Recognition Using Brain–Computer Interface and Random Forests." Sensors 21, no. 7 (March 27, 2021): 2339. http://dx.doi.org/10.3390/s21072339.

Abstract:
Discrimination of eye movements and visual states is a flourishing field of research, and there is an urgent need for non-manual EEG-based wheelchair control and navigation systems. This paper presents a novel system that utilizes a brain–computer interface (BCI) to capture electroencephalographic (EEG) signals from human subjects during eye movements and subsequently classify them into six categories by applying a random forests (RF) classification algorithm. RF is an ensemble learning method that constructs a series of decision trees where each tree gives a class prediction, and the class with the highest number of predictions becomes the model’s prediction. The categories of the proposed random forests brain–computer interface (RF-BCI) are defined according to the position of the subject’s eyes: open, closed, left, right, up, and down. The purpose of RF-BCI is to be utilized as an EEG-based control system for driving an electromechanical wheelchair (rehabilitation device). The proposed approach has been tested using a dataset containing 219 records taken from 10 different patients. The BCI used the EPOC Flex head cap system, which includes 32 saline felt sensors for capturing the subjects’ EEG signals. Each sensor captured four different brain waves (delta, theta, alpha, and beta) per second. These signals were then split into 4-second windows, resulting in 512 samples per record, and the band energy was extracted for each EEG rhythm. The proposed system was compared with naïve Bayes, Bayes Network, k-nearest neighbors (K-NN), multilayer perceptron (MLP), support vector machine (SVM), J48-C4.5 decision tree, and Bagging classification algorithms. The experimental results showed that the RF algorithm outperformed the other approaches, and high accuracy (85.39%) was obtained for a 6-class classification. This method exploits the high spatial information acquired from the Emotiv EPOC Flex wearable EEG recording device and successfully examines the potential of this device to be used for BCI wheelchair technology.
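A minimal sketch of the pipeline described, per-rhythm band energies fed to a random forest. The sampling rate, toy windows, and labels are stand-ins, not the study's dataset.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 128  # assumed sampling rate (Hz); illustrative only
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_energies(window):
    """Per-channel energy in each EEG rhythm for one (channels x samples) window."""
    freqs, psd = welch(window, fs=FS, nperseg=window.shape[-1])
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.trapz(psd[:, mask], freqs[mask], axis=-1))
    return np.concatenate(feats)

# Toy data: 219 records, 32 channels, 4-second windows.
rng = np.random.default_rng(1)
windows = rng.standard_normal((219, 32, 4 * FS))
labels = rng.integers(0, 6, size=219)  # open/closed/left/right/up/down

X = np.array([band_energies(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
```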
5. Tran, Dang-Khoa, Thanh-Hai Nguyen, and Thanh-Nghia Nguyen. "Detection of EEG-Based Eye-Blinks Using A Thresholding Algorithm." European Journal of Engineering and Technology Research 6, no. 4 (May 11, 2021): 6–12. http://dx.doi.org/10.24018/ejeng.2021.6.4.2438.

Abstract:
In electroencephalography (EEG) studies, eye blinks are a commonly known type of ocular artifact that appears in virtually any EEG measurement. The artifact can be seen as spiking electrical potentials whose time-frequency properties vary across individuals. Their presence can negatively impact medical or scientific research, or be helpful when applied to brain-computer interface applications. Hence, detecting eye-blink signals is beneficial for determining the correlation between the human brain and eye movement. This paper presents a simple, fast, and automated eye-blink detection algorithm that requires no user training before execution. EEG signals were smoothed and filtered before eye-blink detection. We conducted experiments with ten volunteers and collected three different eye-blink datasets over three trials using the Emotiv EPOC+ headset. The proposed method performed consistently and successfully detected the spiking activity of eye blinks with a mean accuracy of over 96%.
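One plausible smoothing-plus-thresholding blink detector of the kind described, as a minimal sketch; the filter order, cutoff, and threshold rule are assumptions, not the authors' exact parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # Emotiv EPOC+ sampling rate (Hz)

def detect_blinks(frontal_channel, k=3.0, min_gap=0.25):
    """Return sample indices of blink onsets in one frontal EEG channel:
    low-pass smooth, then threshold at mean + k * std."""
    b, a = butter(4, 10 / (FS / 2), btype="low")  # keep the slow blink transient
    smooth = filtfilt(b, a, frontal_channel)
    above = smooth > smooth.mean() + k * smooth.std()
    onsets = np.flatnonzero(above & ~np.roll(above, 1))  # first sample of each run
    blinks, last = [], -np.inf
    for i in onsets:                      # enforce a refractory gap between blinks
        if (i - last) / FS >= min_gap:
            blinks.append(i)
            last = i
    return np.array(blinks)
```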
6. Wang, Xiashuang, Guanghong Gong, and Ni Li. "Multimodal fusion of EEG and fMRI for epilepsy detection." International Journal of Modeling, Simulation, and Scientific Computing 09, no. 02 (March 20, 2018): 1850010. http://dx.doi.org/10.1142/s1793962318500101.

Abstract:
Brain–computer interface (BCI) technology provides a new way of communication and control without language or physical action. Brain signal tracking and positioning is the basis of BCI research, while brain modeling directly affects the analysis of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data. This paper proposes a human ellipsoid brain modeling method. We then use a non-parametric spectral estimation method for time–frequency analysis to process simulated and real EEG from epilepsy patients, utilizing both high spatial and high temporal resolution to improve the physician’s diagnostic efficiency.
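As a small illustration of the general technique named here, non-parametric time-frequency analysis of EEG, and not the paper's specific pipeline:

```python
import numpy as np
from scipy.signal import spectrogram

FS = 256  # assumed EEG sampling rate (Hz)
rng = np.random.default_rng(2)
eeg = rng.standard_normal(60 * FS)  # one channel, 60 s of toy EEG

# Short-time, Welch-style spectrogram: power over (frequency, time).
freqs, times, power = spectrogram(eeg, fs=FS, nperseg=FS, noverlap=FS // 2)
print(power.shape)  # (frequency bins, time frames)
```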
7. Sahat, Norasyimah, Afishah Alias, and Fouziah Md Yassin. "Wheelchair controlled by human brainwave using brain-computer interface system for paralyzed patient." Bulletin of Electrical Engineering and Informatics 10, no. 6 (December 1, 2021): 3032–41. http://dx.doi.org/10.11591/eei.v10i6.3200.

Abstract:
An integrated wheelchair controlled by human brainwaves using a brain-computer interface (BCI) system was designed to help disabled people. The invention aims to improve the development of an integrated wheelchair using a BCI system, based on the individual's brain attention level. An electroencephalography (EEG) device called Mindwave Mobile Plus (MW+) was employed to obtain the attention value for wheelchair movement, with eye blinks used to change the mode of the wheelchair among forward (F), right (R), backward (B), and left (L). Stop mode (S) is selected by an eyebrow movement, which produces a signal quality value of 26 or 51. The development of this brainwave-controlled wheelchair for helping paralyzed patients demonstrates the efficiency of the integrated wheelchair, improved through the human attention value, eye blink detection, and eyebrow movement. An analysis of the human attention value across gender and age categories was also performed to improve the accuracy of the brainwave-integrated wheelchair. The threshold values are 60 for male children, 70 for male teenagers, and 40 for male adults; 50 for female children, 50 for female teenagers, and 30 for female adults.
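A hypothetical sketch of the mode logic this abstract describes, using the reported per-user attention thresholds; the exact control flow is an interpretation, not the authors' code.

```python
MODES = ["F", "R", "B", "L"]  # forward, right, backward, left

class BrainwaveWheelchair:
    def __init__(self, attention_threshold=60):  # e.g. 60 for male children
        self.threshold = attention_threshold
        self.mode_index = 0

    def update(self, attention, blink, signal_quality):
        """Map one MW+ reading to a wheelchair command."""
        if signal_quality in (26, 51):   # eyebrow movement -> stop mode
            return "S"
        if blink:                        # eye blink -> cycle movement mode
            self.mode_index = (self.mode_index + 1) % len(MODES)
        if attention >= self.threshold:  # attention drives the chair
            return MODES[self.mode_index]
        return None                      # below threshold: no movement

chair = BrainwaveWheelchair(attention_threshold=40)  # male adult threshold
print(chair.update(attention=55, blink=False, signal_quality=0))  # -> "F"
```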
8. Kubacki, Arkadiusz. "Use of Force Feedback Device in a Hybrid Brain-Computer Interface Based on SSVEP, EOG and Eye Tracking for Sorting Items." Sensors 21, no. 21 (October 30, 2021): 7244. http://dx.doi.org/10.3390/s21217244.

Abstract:
Research focused on signals derived from the human organism is becoming increasingly popular. In this field, a special role is played by brain-computer interfaces based on brainwaves, which are growing in popularity due to the downsizing of EEG recording devices and ever-lower prices. Unfortunately, such systems are substantially limited in the number of commands they can generate, especially for sets that are not medical devices. This article proposes a hybrid brain-computer system based on the steady-state visual evoked potential (SSVEP), EOG, eye tracking, and a force feedback system. Such an expanded system eliminates many of the shortcomings of the individual subsystems and provides much better results. The first part of the paper presents the methods applied in the hybrid brain-computer system. The system was tested in terms of the operator's ability to place the robot's tip at a designated position. A virtual model of an industrial robot was proposed and used in the testing; the tests were then repeated on a real-life industrial robot. The positioning accuracy of the system was verified with the force feedback both enabled and disabled. The results of tests conducted both on the model and on the real object clearly demonstrate that force feedback improves the positioning accuracy of the robot's tip when controlled by the operator. In addition, the results for the model and the real-life industrial robot are very similar. In the next stage, the possibility of sorting items using the BCI system was investigated, again on both the model and the real robot. The results show that it is possible to sort items using biosignals from the human body.
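The abstract does not say how the SSVEP targets are decoded; a common approach is canonical correlation analysis (CCA) against sinusoidal references, sketched below with assumed flicker frequencies and sampling rate.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 256                      # assumed sampling rate (Hz)
STIM_FREQS = [8, 10, 12, 15]  # hypothetical flicker frequencies (Hz)

def reference_signals(freq, n_samples, n_harmonics=2):
    """Sine/cosine references at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / FS
    refs = []
    for h in range(1, n_harmonics + 1):
        refs += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    return np.column_stack(refs)

def classify_ssvep(eeg_window):
    """Pick the stimulus whose references correlate best with the EEG
    (eeg_window: samples x channels, e.g. occipital electrodes)."""
    scores = []
    for f in STIM_FREQS:
        xs, ys = CCA(n_components=1).fit_transform(
            eeg_window, reference_signals(f, len(eeg_window)))
        scores.append(np.corrcoef(xs[:, 0], ys[:, 0])[0, 1])
    return STIM_FREQS[int(np.argmax(scores))]
```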

Dissertations / Theses on the topic "EEG, eye-tracking, Human Computer Interface"

1. Khan, Mubasher Hassan, and Tayyab Laique. "An Evaluation of Gaze and EEG-Based Control of a Mobile Robot." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4625.

Abstract:
Context: Patients with diseases such as locked-in syndrome or motor neuron disease are paralyzed and need special care. To reduce the cost of their care, systems need to be designed in which human involvement is minimal and affected people can perform their daily life activities independently. To assess the feasibility and robustness of combinations of input modalities, navigation of a mobile robot (Spinosaurus) is controlled by a combination of eye gaze tracking and other input modalities. Objectives: Our aim is to control the robot using EEG brain signals and eye gaze tracking simultaneously. Different combinations of input modalities are used to control the robot and turret movement, and we then determine which combination of control technique mapped to control command is most effective. Methods: The method includes developing the interface and control software. An experiment involving 15 participants was conducted to evaluate control of the mobile robot using a combination of an eye tracker and other input modalities. Subjects were required to drive the mobile robot from a starting point to a goal along a pre-defined path. At the end of the experiment, a sense-of-presence questionnaire was distributed among the participants for their feedback. Finally, a qualitative pilot study was performed to find out how a low-cost commercial EEG headset, the Emotiv EPOC™, can be used for motion control of a mobile robot. Results: Our study showed that the mouse/keyboard combination was the most effective for controlling the robot motion and the turret-mounted camera, respectively. In the experimental evaluation, the keyboard/eye tracker combination improved performance by 9%. 86% of participants found that the turret-mounted camera was useful and provided great assistance in robot navigation. Our qualitative pilot study of the Emotiv EPOC™ demonstrated different ways to train the headset for different actions. Conclusions: In this study, we concluded that different combinations of control techniques can be used to control devices such as a mobile robot or a powered wheelchair. Gaze-based control was found to be comparable with the use of a mouse and keyboard; EEG-based control was found to need a lot of training time and was difficult to train. Our pilot study suggested that using facial expressions to train the Emotiv EPOC™ was an efficient and effective way to train it.
2. Shaw, Daniel. "An Eye-Tracking Evaluation of Multicultural Interface Designs." Thesis, Boston College, 2005. http://hdl.handle.net/2345/390.

Abstract:
Thesis advisor: James Gips
This paper examines the impact of a multicultural approach on the usability of web and software interface designs. Through the use of an eye-tracking system, the study compares the ability of American users to navigate traditional American and Japanese websites. The ASL R6 eye-tracking system recorded user search latency and the visual scan path in locating specific items on the American and Japanese pages. Experimental results found statistically significant latency differences when searching for left- or right-oriented navigation menus. Among the participants, visual observations of scan paths indicated a strong preference for initial movements toward the left. These results demonstrate the importance of adapting web layouts and navigation menus for American and Japanese users. The paper further discusses the potential strengths of modifying interface designs to correspond with such cultural search tendencies, and offers suggestions for further research.
Thesis (BA) — Boston College, 2005
Submitted to: Boston College. College of Arts and Sciences
Discipline: Computer Science
Discipline: College Honors Program
3. Petrini, Alexander, and Henrik Forslin. "Evaluation of Player Performance with a Brain Computer Interface and Eye Tracking Control in an Entertainment Game Application." Thesis, Blekinge Tekniska Högskola, Institutionen för kreativa teknologier, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-12928.

4. Ahmed, Zaheer, and Aamir Shahzad. "Mobile Robot Navigation using Gaze Contingent Dynamic Interface." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3098.

Abstract:
Using the eyes as an input modality for different control environments is a great area of interest for enhancing the bandwidth of human-machine interaction and providing interaction functions when the use of the hands is not possible. Interface design requirements in such implementations are quite different from conventional application areas. Both command-execution and feedback-observation tasks may be performed by the human eyes simultaneously. In order to control the motion of a mobile robot by operator gaze interaction, gaze-contingent regions in the operator interface are used to execute robot movement commands, with different screen areas controlling specific directions. Dwell time is one of the most established techniques for performing an eye-click analogous to a mouse click, but repeated dwell time while switching between gaze-contingent regions and feedback regions decreases the performance of the application. We have developed a dynamic gaze-contingent interface in which we merge gaze-contingent regions with feedback regions dynamically. This technique has two advantages: first, it improves the overall performance of the system by eliminating repeated dwell time; second, it reduces operator fatigue by providing a bigger area to fixate in. The operator can monitor feedback with more ease while sending commands at the same time.
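A minimal sketch of dwell-time selection over a gaze-contingent screen region, the mechanism this abstract builds on; the region geometry and the 0.8 s dwell are illustrative.

```python
import time

class DwellRegion:
    """A screen rectangle that fires a command once gaze has stayed
    inside it continuously for `dwell` seconds."""
    def __init__(self, x, y, w, h, command, dwell=0.8):
        self.rect, self.command, self.dwell = (x, y, w, h), command, dwell
        self.enter_time = None

    def update(self, gx, gy, now=None):
        """Feed one gaze sample; return the command when the dwell completes."""
        now = time.monotonic() if now is None else now
        x, y, w, h = self.rect
        if x <= gx < x + w and y <= gy < y + h:
            if self.enter_time is None:
                self.enter_time = now                  # gaze just entered
            elif now - self.enter_time >= self.dwell:
                self.enter_time = None                 # re-arm after firing
                return self.command
        else:
            self.enter_time = None                     # gaze left: reset timer
        return None

forward = DwellRegion(300, 0, 200, 100, command="MOVE_FORWARD")
```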
5. Farokhian, Suzana. "Human-computer interaction using eye-gaze: Formation of user interface design guidelines from a cognitive science perspective." Thesis, Södertörns högskola, Medieteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-38506.

Abstract:
Motor and communication disabilities are common conditions that may impose restrictions on daily life. With the development of eye tracking technology, a solution referred to as eye-gaze interaction has emerged to help people with such limiting conditions solve communication and computer access issues. By using eye tracking technology, which calculates the user's gaze location on a computer screen, users are able to control computers with their eyes as an input. This interaction method is quite unique and complex since the eyes serve both as an input and output source. Usability aspects involving human information processing are therefore important to consider when designing user interfaces. In collaboration with Tobii AB, the study evaluated two separate eye-gaze interaction systems for controlling computers. Seven participants conducted user tests, one for each application, and answered interview questions during the tests regarding their usability experience. Based on the collected data, 17 design guidelines were established with the purpose of enhancing usability for eye-gaze interaction systems.
6. Yang, Zhen. "Infrastructure development for a mind attention interface." Master's thesis, 2007. http://hdl.handle.net/1885/6966.

Abstract:
The “Mind Attention Interface” (MAI) enables sensor measurements of a person’s mind states, and attention states, to be used in a virtual environment. As well as serving as a platform for Human Computer Interface and Brain-Computer Interface research, the MAI aspires to build artistic installations, particularly in computer-generated music. This thesis describes the development of the MAI System: the enabling software infrastructure of the Mind Attention Interface. It discusses the investigation and purchase of hardware and the design of the MAI System architecture. The requirements, design, and implementation of the MAI System, accompanied by a set of profiling tests, demonstrate the effectiveness of the design architecture and the overall usefulness of the system.

Books on the topic "EEG, eye-tracking, Human Computer Interface"

1. Kompatsiaris, Ioannis, Chandan Kumar, and Spiros Nikolopoulos. Signal Processing to Drive Human-Computer Interaction: EEG and Eye-Controlled Interfaces. Institution of Engineering & Technology, 2020.

2. Nikolopoulos, Spiros, Chandan Kumar, and Ioannis Kompatsiaris, eds. Signal Processing to Drive Human-Computer Interaction: EEG and eye-controlled interfaces. Institution of Engineering and Technology, 2020. http://dx.doi.org/10.1049/pbce129e.


Book chapters on the topic "EEG, eye-tracking, Human Computer Interface"

1. Schmidt, Holger, and Gottfried Zimmermann. "Using Eye Tracking as Human Computer Interaction Interface." In Communications in Computer and Information Science, 523–27. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21380-4_89.

2. Lakshmi Pavani, M., A. V. Bhanu Prakash, M. S. Shwetha Koushik, J. Amudha, and C. Jyotsna. "Navigation Through Eye-Tracking for Human–Computer Interface." In Information and Communication Technology for Intelligent Systems, 575–86. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-1747-7_56.

3. Akiyoshi, Masanori, and Hidetoshi Takeno. "An Estimation Framework of a User Learning Curve on Web-Based Interface Using Eye Tracking Equipment." In Human-Computer Interaction. Human-Centred Design Approaches, Methods, Tools, and Environments, 159–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39232-0_18.

4. Matsuda, Keiji, Takeshi Nagami, Yasuko Sugase, Aya Takemura, and Kenji Kawano. "A Widely Applicable Real-Time Mono/Binocular Eye Tracking System Using a High Frame-Rate Digital Camera." In Human-Computer Interaction. User Interface Design, Development and Multimodality, 593–608. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-58071-5_45.

5. Zhang, Lan, Guorui Ma, Jian Zhou, and Fang Jia. "Human-Computer Interface Design of Intelligent Spinning Factory Monitoring System Based on Eye Tracking Technology." In Lecture Notes in Networks and Systems, 579–86. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-80091-8_69.

6. "Eye tracking for interaction: evaluation methods." In Signal Processing to Drive Human-Computer Interaction: EEG and eye-controlled interfaces, 117–44. Institution of Engineering and Technology, 2020. http://dx.doi.org/10.1049/pbce129e_ch6.

7. "Eye tracking for interaction: adapting multimedia interfaces." In Signal Processing to Drive Human-Computer Interaction: EEG and eye-controlled interfaces, 83–116. Institution of Engineering and Technology, 2020. http://dx.doi.org/10.1049/pbce129e_ch5.

8. "Video-based eye tracking." In Human–Computer Interface Technologies for the Motor Impaired, 117–34. CRC Press, 2015. http://dx.doi.org/10.1201/b19274-9.

9. Poole, Alex, and Linden J. Ball. "Eye Tracking in HCI and Usability Research." In Encyclopedia of Human Computer Interaction, 211–19. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-562-7.ch034.

Abstract:
Eye tracking is a technique whereby an individual’s eye movements are measured so that the researcher knows both where a person is looking at any given time and the sequence in which the person’s eyes are shifting from one location to another. Tracking people’s eye movements can help HCI researchers to understand visual and display-based information processing and the factors that may impact the usability of system interfaces. In this way, eye-movement recordings can provide an objective source of interface-evaluation data that can inform the design of improved interfaces. Eye movements also can be captured and used as control signals to enable people to interact with interfaces directly without the need for mouse or keyboard input, which can be a major advantage for certain populations of users, such as disabled individuals. We begin this article with an overview of eye-tracking technology and progress toward a detailed discussion of the use of eye tracking in HCI and usability research. A key element of this discussion is to provide a practical guide to inform researchers of the various eye-movement measures that can be taken and the way in which these metrics can address questions about system usability. We conclude by considering the future prospects for eye-tracking research in HCI and usability testing.
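Several of the usability metrics surveyed here (fixation count, mean fixation duration) presuppose fixation detection; a common dispersion-threshold (I-DT) algorithm is sketched below with illustrative thresholds.

```python
import numpy as np

def idt_fixations(gx, gy, fs, max_disp=1.0, min_dur=0.1):
    """Dispersion-threshold (I-DT) fixation detection: grow windows whose
    x/y dispersion stays under max_disp (e.g. degrees) for at least
    min_dur seconds; returns (start, end) sample indices."""
    gx, gy = np.asarray(gx), np.asarray(gy)

    def disp(a, b):
        return (gx[a:b].max() - gx[a:b].min()) + (gy[a:b].max() - gy[a:b].min())

    win = int(min_dur * fs)
    fixations, i, n = [], 0, len(gx)
    while i + win <= n:
        if disp(i, i + win) <= max_disp:
            j = i + win
            while j < n and disp(i, j + 1) <= max_disp:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1
    return fixations

# Usability metrics follow directly, e.g. fixation count len(fixations)
# and mean duration np.mean([(e - s) / fs for s, e in fixations]).
```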
10. Jacob, Robert J. K. "Eye Tracking in Advanced Interface Design." In Virtual Environments and Advanced Interface Design. Oxford University Press, 1995. http://dx.doi.org/10.1093/oso/9780195075557.003.0015.

Abstract:
The problem of human-computer interaction can be viewed as two powerful information processors (human and computer) attempting to communicate with each other via a narrow-bandwidth, highly constrained interface (Tufte, 1989). To address it, we seek faster, more natural, and more convenient means for users and computers to exchange information. The user’s side is constrained by the nature of human communication organs and abilities; the computer’s is constrained only by input/output devices and interaction techniques that we can invent. Current technology has been stronger in the computer-to-user direction than the user-to-computer, hence today’s user-computer dialogues are rather one-sided, with the bandwidth from the computer to the user far greater than that from user to computer. Using eye movements as a user-to-computer communication medium can help redress this imbalance. This chapter describes the relevant characteristics of the human eye, eye-tracking technology, how to design interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way, and the relationship between eye-movement interfaces and virtual environments. As with other areas of research and design in human-computer interaction, it is helpful to build on the equipment and skills humans have acquired through evolution and experience and search for ways to apply them to communicating with a computer. Direct manipulation interfaces have enjoyed great success largely because they draw on analogies to existing human skills (pointing, grabbing, moving objects in space), rather than trained behaviors. Similarly, we try to make use of natural eye movements in designing interaction techniques for the eye. Because eye movements are so different from conventional computer inputs, our overall approach in designing interaction techniques is, wherever possible, to obtain information from a user’s natural eye movements while viewing the screen, rather than requiring the user to make specific trained eye movements to actuate the system. This requires careful attention to issues of human design, as will any successful work in virtual environments. The goal is for human-computer interaction to start with studies of the characteristics of human communication channels and skills and then develop devices, interaction techniques, and interfaces that communicate effectively to and from those channels.

Conference papers on the topic "EEG, eye-tracking, Human Computer Interface"

1. Liu, Chang, Dingguo Yu, Jiefang Zhang, and Songyun Xie. "A Utility Human Machine Interface Using Low Cost EEG Cap and Eye Tracker." In 2021 9th International Winter Conference on Brain-Computer Interface (BCI). IEEE, 2021. http://dx.doi.org/10.1109/bci51272.2021.9385304.

2. Mizuno, F., A. Harada, and T. Yamaguchi. "Development of an Input Interface Using Ocular Potential for Handicapped Users of Health Care Supporting Computer." In ASME 2001 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/imece2001/bed-23098.

Abstract:
Numerous attempts to adapt multimedia communication to medical care have been reported recently. It is our view that spiritual support is more important in medical care, while so-called high technology may be necessary for medical practice. Therefore, we proposed the concept of the Hyper Hospital [1–3], to offer patients a means of effective human communication during medical care. The Hyper Hospital is a medical system constructed on a computer- and multimedia-based network, which patients use to participate in medical and care activities through improved communication media. It is sometimes difficult for physically handicapped patients, such as PMD (progressive muscular dystrophy), ALS (amyotrophic lateral sclerosis), and traumatic cervical injury patients, to operate a computer because of their disabilities. Therefore, there is a serious digital divide between physically disabled patients and healthy people. To remedy this, various communication devices, such as those using winking, eye gaze, voice, and electrical biological signals (event-related potential [4–5], electrooculogram, etc.), have been proposed and tested. These are designed to enable seriously handicapped patients to use a computer without the usual mechanical input devices, such as a keyboard, mouse, or joystick. Although the EEG (electroencephalogram) offers one source of such electrical biological signals, it produces a very weak signal that contaminating noise makes difficult to process. On the other hand, the ocular potential generated by the dipolar potential of the eyeball has a much larger gain than the EEG. Moreover, the ocular potential can be easily controlled by the user, and eye-movement ability remains largely intact even after neurological diseases progress to a very advanced stage. Therefore, this report describes the development of an input interface for computers using the electrooculogram.
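A minimal sketch of how an EOG-based interface can map ocular potentials to commands; the channels, polarities, and thresholds are illustrative assumptions, since the paper does not give them here.

```python
# Illustrative thresholds in microvolts; real EOG gains and polarities
# depend on electrode placement around the eyes.
H_THRESH = 150.0  # horizontal channel: left/right gaze
V_THRESH = 200.0  # vertical channel: up/down gaze (blinks also deflect it)

def eog_command(h_sample, v_sample):
    """Map one pair of baseline-corrected EOG samples to a gaze command."""
    if h_sample > H_THRESH:
        return "RIGHT"
    if h_sample < -H_THRESH:
        return "LEFT"
    if v_sample > V_THRESH:
        return "UP"
    if v_sample < -V_THRESH:
        return "DOWN"
    return None  # within the dead zone: no command
```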
3. Zhou, Feng, and Jianxin (Roger) Jiao. "An Augmented Affective-Cognition Framework for Usability Studies of In-Vehicle System User Interface." In ASME 2013 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/detc2013-13694.

Abstract:
Vehicles with better usability have become increasingly popular due to their ease of operation and safety for driving. However, the way the usability of in-vehicle system user interfaces is studied still needs improvement. This paper concerns how to use advanced computational, neurophysiology- and psychology-based tools and methodologies to determine the affective (emotional) states and behavioral data of an individual in real time, and in turn how to adapt the human-vehicle interaction to meet the user's cognitive needs based on this real-time assessment. Specifically, we set up a set of neurophysiological equipment that collects EEG, facial EMG (electromyography), skin conductance response, and respiration data, and a set of motion sensing and tracking equipment that captures eyeball movement and the objects the user interacts with. All hardware components and software are integrated into a cohesive augmented sensor platform that performs as "one coherent system" to enable multi-modal data processing and information inference for context-aware analysis of affective and cognitive states based on a rough set inference engine. Meanwhile, subjective data are also recorded for comparison. A usability study of an in-vehicle system UI is presented to demonstrate the potential of the proposed methodology.
4. Fei Li, Yu, Sun Woh Lye, Terry Teo, and Alden Tan Shi Yin. "Design Issues in Setting Up of a Highly Synchronous Human-in-the-loop System." In Human Interaction and Emerging Technologies (IHIET-AI 2022) Artificial Intelligence and Future Applications. AHFE International, 2022. http://dx.doi.org/10.54941/ahfe100860.

Abstract:
In human-machine interaction, information is communicated between a human and the machine via user interfaces. Consequently, the measures and level of insight that can be derived from such studies depend upon the context for which the experimental setup was designed. This paper describes a successful, highly synchronous human-in-the-loop (HITL) experimental system that can be used to measure situational awareness in air traffic monitoring. The system setup consists of a NARSIM radar interface that displays the relevant aircraft information; a secondary control computer for the recording, processing, and storage of the captured neurophysiological inputs; a remote eye tracker; and an electroencephalogram (EEG), the latter two of which provide the input. In setting up such a system, key design considerations are discussed concerning system compatibility and information transferability, spatial and resolution accuracy, data input sources and synchronization, software selection, and the ease of results validation.
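One common way to handle the synchronization consideration raised here is Lab Streaming Layer; below is a minimal pylsl sketch under the assumption that the EEG amplifier and eye tracker each publish an LSL stream (the paper does not state its actual mechanism).

```python
from pylsl import StreamInlet, resolve_byprop

# Resolve one EEG stream and one gaze stream on the local network.
eeg_inlet = StreamInlet(resolve_byprop("type", "EEG")[0])
gaze_inlet = StreamInlet(resolve_byprop("type", "Gaze")[0])

for _ in range(1000):
    eeg, t_eeg = eeg_inlet.pull_sample(timeout=1.0)
    gaze, t_gaze = gaze_inlet.pull_sample(timeout=1.0)
    if eeg and gaze:
        # time_correction() maps each stream's clock onto the local clock,
        # so both modalities can be stored on one common timeline.
        t_eeg += eeg_inlet.time_correction()
        t_gaze += gaze_inlet.time_correction()
        print(t_eeg, t_gaze)
```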
5. Antoine Moinnereau, Marc, Tiago Henrique Falk, and Alcyr Alves De Oliveira. "Measuring Human Influential Factors During VR Gaming at Home: Towards Optimized Per-User Gaming Experiences." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002056.

Abstract:
It is known that human influential factors (HIFs, e.g., sense of presence/immersion; attention, stress, and engagement levels; fun factors) play a crucial role in the gamer’s perceived immersive media experience [1]. To this end, recent research has explored the use of affective brain-/body-computer interfaces to monitor such factors [2, 3]. Typically, studies have been conducted in laboratory settings and have relied on research-grade neurophysiological sensors. Transferring the obtained knowledge to everyday settings, however, is not straightforward, especially since it requires cumbersome and long preparation times (e.g., placing electroencephalography caps, gel, test impedances) which could be overwhelming for gamers. To overcome this limitation, we have recently developed an instrumented “plug-and-play” virtual reality head-mounted display (termed iHMD) [4] which directly embeds a number of dry ExG sensors (electroencephalography, EEG; electrocardiography, ECG; electromyography, EMG; and electrooculography, EoG) into the HMD. A portable bioamplifier is used to collect, stream, and/or store the biosignals in real-time. Moreover, a software suite has been developed to automatically measure signal quality [5], enhance the biosignals [6, 7, 8], infer breathing rate from the ECG [9], and extract relevant HIFs from the post-processed signals [3, 10, 11]. More recently, we have also developed companion software to allow for use and monitoring of the device at the gamer’s home with minimal experimental supervision, hence exploring its potential use truly “in the wild”. The iHMD, VR controllers, and a laptop, along with a copy of the Half-Life: Alyx videogame, were dropped off at the homes of 10 gamers who consented to participate in the study. All public health COVID-19 protocols were followed, including sanitizing the iHMD in a UV-C light chamber and with sanitizing wipes 48h prior to dropping the equipment off. Instructions on how to set up the equipment and the game, as well as a google form with a multi-part questionnaire [12] to be answered after the game, were provided via videoconference. The researcher remained available remotely in case any participant questions arose, but otherwise, interventions were minimal. Participants were asked to play the game for around one hour and none of the participants reported cybersickness. This paper details the obtained results from this study and shows the potential of measuring HIFs from ExG signals collected “in the wild,” as well as their use in remote gaming experience monitoring. In particular, we will show the potential of measuring gamer engagement and sense of presence from the collected signals and their influence on overall experience. The next steps will be to use these signals and inferred HIFs to adjust the game in real-time, thus maximizing the experience for each individual gamer.
References:
[1] Perkis, A., et al, 2020. QUALINET white paper on definitions of immersive media experience (IMEx). arXiv preprint arXiv:2007.07032.
[2] Gupta, R., et al, 2016. Using affective BCIs to characterize human influential factors for speech QoE perception modelling. Human-centric Computing and Information Sciences, 6(1):1-19.
[3] Clerico, A., et al, 2016. Biometrics and classifier fusion to predict the fun-factor in video gaming. In IEEE Conf Comp Intell and Games (pp. 1-8).
[4] Cassani, R., et al, 2020. Neural interface instrumented virtual reality headsets: Toward next-generation immersive applications. IEEE SMC Mag, 6(3):20-28.
[5] Tobón, D., et al, 2014. MS-QI: A modulation spectrum-based ECG quality index for telehealth applications. IEEE TBE, 63(8):1613-1622.
[6] Tobón, D. and Falk, T.H., 2016. Adaptive spectro-temporal filtering for electrocardiogram signal enhancement. IEEE JBHI, 22(2):421-428.
[7] dos Santos, E., et al, 2020. Improved motor imagery BCI performance via adaptive modulation filtering and two-stage classification. Biomed Signal Proc Control, Vol. 57.
[8] Rosanne, O., et al, 2021. Adaptive filtering for improved EEG-based mental workload assessment of ambulant users. Front. Neurosci, Vol. 15.
[9] Cassani, R., et al, 2018. Respiration rate estimation from noisy electrocardiograms based on modulation spectral analysis. CMBES Proc., Vol. 41.
[10] Tiwari, A. and Falk, T.H., 2021. New Measures of Heart Rate Variability based on Subband Tachogram Complexity and Spectral Characteristics for Improved Stress and Anxiety Monitoring in Highly Ecological Settings. Front Signal Proc, Vol. 7.
[11] Moinnereau, M.A., 2020. Saccadic Eye Movement Classification Using ExG Sensors Embedded into a Virtual Reality Headset. In IEEE Conf SMC, pp. 3494-3498.
[12] Tcha-Tokey, K., et al, 2016. Proposition and Validation of a Questionnaire to Measure the User Experience in Immersive Virtual Environments. Intl J Virtual Reality, 16:33-48.
6. Shafiei, Somayeh B., Khurshid A. Guru, and Ehsan T. Esfahani. "Using Two-Third Power Law for Segmentation of Hand Movement in Robotic Assisted Surgery." In ASME 2015 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/detc2015-47813.

Abstract:
In this study, we have developed a robust and accurate algorithm based on the concept of the two-thirds power law in human motor control to segment the hand trajectories of robotic surgeons into smaller segments. We hypothesize that tracking a longer trajectory is subject to a higher cognitive workload that may lead to imperfect CNS performance in programming muscle activation, which in turn leads to a larger number of trajectory segments and pause points in hand movements. To test our hypothesis, after segmenting the trajectory, we determine the correlation between affine velocity and workload extracted from the surgeon's electroencephalography (EEG) features. EEG features are extracted from brain waves recorded by a wireless brain-computer interface (B-Alert X-10 system). In our experimental study, two groups of participants, three "experts" and five "competent and proficient" surgeons, performed urethro-vesical anastomosis on an inanimate model using the da Vinci Surgical System® (Sunnyvale, CA).
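The two-thirds power law says tangential velocity scales with curvature as v = K·κ^(−1/3); since planar curvature is κ = |ẋÿ − ẏẍ| / v³, the affine velocity v·κ^(1/3) = |ẋÿ − ẏẍ|^(1/3) is constant within a lawful movement segment, and its drops toward zero mark pauses and candidate segment boundaries. A minimal sketch (the tolerance is illustrative, not the authors' algorithm):

```python
import numpy as np

def affine_velocity(x, y, dt):
    """|x'y'' - y'x''|**(1/3) for a planar trajectory sampled every dt seconds."""
    dx, dy = np.gradient(x, dt), np.gradient(y, dt)
    ddx, ddy = np.gradient(dx, dt), np.gradient(dy, dt)
    return np.abs(dx * ddy - dy * ddx) ** (1.0 / 3.0)

def segment_boundaries(x, y, dt, eps=1e-3):
    """Candidate segment points: samples where affine velocity nearly vanishes."""
    va = affine_velocity(x, y, dt)
    return np.flatnonzero(va < eps * va.max())
```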
7. Lobo, Jesus L., Javier Del Ser, Flavia De Simone, Roberta Presta, Simona Collina, and Zdenek Moravek. "Cognitive workload classification using eye-tracking and EEG data." In HCI-Aero '16: International Conference on Human-Computer Interaction in Aerospace 2016. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2950112.2964585.

8. Panev, Stanislav, and Agata Manolova. "Improved multi-camera 3D Eye Tracking for human-computer interface." In 2015 IEEE 8th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS). IEEE, 2015. http://dx.doi.org/10.1109/idaacs.2015.7340743.

9. Hiley, Jonathon B., Andrew H. Redekopp, and Reza Fazel-Rezai. "A Low Cost Human Computer Interface based on Eye Tracking." In Conference Proceedings. Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE, 2006. http://dx.doi.org/10.1109/iembs.2006.260774.

