
Journal articles on the topic "GESTURE CONTROLLING"


Consult the top 50 journal articles for your research on the topic "GESTURE CONTROLLING".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract online, when such data are available in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography correctly.

1

Vijaya, V. Krishna, Puvvala Harsha, Sricharan Murukutla, Kurra Eswar, and Nannapaneni Sravan Kuma. "Hand Gesture Controlling System." International Journal for Research in Applied Science and Engineering Technology 11, no. 1 (January 31, 2023): 669–74. http://dx.doi.org/10.22214/ijraset.2023.48653.

Abstract:
As a result of the Industry 4.0 revolution, hand gestures are becoming more and more significant in the disciplines of robotics and IoT. Hand gestures are often used in the IoT field in applications such as smart homes, wearable technology, vehicles, and virtual reality. In this work we try to add some originality of our own: combining Python and Arduino for laptop/computer gesture control. We use two ultrasonic sensors to determine the position of our hand relative to the screen and control a media player (VLC) accordingly. The two sensors are mounted on top of the monitor, and an Arduino measures the distance between each sensor and the hand; particular actions are taken in response to this measurement. Computer operations are performed using the Python PyAutoGUI module, with the computer receiving commands from the Arduino via the serial port. Currently, scientists are working to develop a computer that runs entirely on hand gestures and sensors instead of conventional input hardware. A few researchers have already shown that a video player, web browser, and text document can be controlled with hand movements using an Arduino and ultrasonic sensors.
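The control loop this abstract describes (Arduino measuring hand distance, Python issuing media-player keystrokes over the serial port) can be sketched in a few lines. The port name, baud rate, "left,right" line format, and distance thresholds below are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical sketch: read the hand-distance readings an Arduino prints over
# serial and map them to VLC hotkeys with PyAutoGUI.
import serial          # pyserial
import pyautogui

PORT, BAUD = "/dev/ttyUSB0", 9600   # adjust for your machine (assumed)
NEAR = 15                           # "hand is close" threshold in cm (assumed)

with serial.Serial(PORT, BAUD, timeout=1) as arduino:
    while True:
        line = arduino.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            left, right = (int(v) for v in line.split(","))
        except ValueError:
            continue                        # skip malformed lines
        if left < NEAR and right < NEAR:
            pyautogui.press("space")        # VLC play/pause
        elif left < NEAR:
            pyautogui.hotkey("ctrl", "up")  # VLC volume up
        elif right < NEAR:
            pyautogui.hotkey("ctrl", "down")  # VLC volume down
```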
2

Varshika, DSS. "Media-Player Controlling by Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 9, no. VI (June 20, 2021): 2022–30. http://dx.doi.org/10.22214/ijraset.2021.348515421.

Abstract:
In this project we try to control a media player using hand gestures with the help of OpenCV and Python. Computer applications require interaction between human and computer. This interaction needs to be unrestricted, which makes traditional input devices such as the keyboard, mouse, and pen limiting. Hand gesture is an important component of body language in linguistics, and human-computer interaction becomes easy with the use of the hand as a device. Using hand gestures to operate machines makes interaction interesting, and gesture recognition has gained a lot of importance. Hand gestures are used to control various applications like Windows Media Player, robot control, gaming, etc. The use of gestures makes interaction easy and convenient and does not require any extra device. Vision and audio recognition can be used together, but audio commands may not work in noisy environments.
3

Alyamani, Hasan J. "Gesture Vocabularies for Hand Gestures for Controlling Air Conditioners in Home and Vehicle Environments." Electronics 12, no. 7 (March 23, 2023): 1513. http://dx.doi.org/10.3390/electronics12071513.

Abstract:
With the growing prevalence of modern technologies as part of everyday life, mid-air gestures have become a promising input method in the field of human–computer interaction. This paper analyses the gestures of actual users to define a preliminary gesture vocabulary for home air conditioning (AC) systems and suggests a gesture vocabulary for controlling the AC that applies to both home and vehicle environments. In this study, a user elicitation experiment was conducted. A total of 36 participants were filmed while employing their preferred hand gestures to manipulate a home air conditioning system. Comparisons were drawn between our proposed gesture vocabulary (HomeG) and a previously proposed gesture vocabulary which was designed to identify the preferred hand gestures for in-vehicle air conditioners. The findings indicate that HomeG successfully identifies and describes the employed gestures in detail. To gain a gesture taxonomy that is suitable for manipulating the AC at home and in a vehicle, some modifications were applied to HomeG based on suggestions from other studies. The modified gesture vocabulary (CrossG) can identify the gestures of our study, although CrossG has a less detailed gesture pattern. Our results will help designers to understand user preferences and behaviour prior to designing and implementing a gesture-based user interface.
4

SHAIK, Dr ABDUL NABI, E. SAI PRIYA, G. NIKITHA, K. PRACHEEN KUMAR, and N. SHRAVYA SHREE. "CONTROLLING VIDEOLAN CLIENT MEDIA USING LUCAS KANADE OPTIMAL FLOW ALGORITHM AND OPENCV." YMER Digital 21, no. 05 (May 8, 2022): 246–55. http://dx.doi.org/10.37896/ymer21.05/29.

Abstract:
In this project we discuss a system which uses a dynamic hand gesture recognition technique to manage media players such as the VLC media player. The project consists of modules which segment the foreground part of the frame using skin-colour detection and the approximate median technique. Hand gesture recognition is done by creating a decision tree that uses certain features extracted from the segmented part. This hand gesture recognition technique introduces a new, natural way to interact with computers and is useful in day-to-day life. The project is used for controlling certain operations (pause, play, volume up, volume down, mute) on a video player by mere hand gestures, without the rigmarole of pressing buttons or tapping the screen; this relates directly to everyday situations such as presentations. We consider hand gestures whose directional motion defines a gesture for the application. In this application, image retrieval is done using a webcam. Some functions in VLC media players are used most often, and these functions use predefined gestures that correspond to the VLC player control operations. We created a VLC media player controller using a hand gesture recognition system to make "human life easy and better". The project is implemented in two steps: (1) creation of the hand gesture recognition system, done with image processing using the OpenCV library; and (2) controlling the VLC media player using hand gestures, where the player is controlled with shell commands invoked from Python through the OS library.
5

Chakradhar, K. S., Prasanna Rejinthala Lakshmi, salla chowdary Sree Rama Brunda, Bharathi Pola, and Bhargava Petlu. "Controlling Media Player Using Hand Gestures." Electrical and Automation Engineering 2, no. 1 (April 1, 2023): 45–54. http://dx.doi.org/10.46632/eae/2/1/7.

Abstract:
Computer usage is increasing rapidly day by day, but input devices are limited, and to access them we need to be near the screen. To overcome this problem and control the screen, we can use hand gestures, with a different hand gesture for every operation. We propose a Python program to control the media player through hand gestures. In this method, we use libraries like OpenCV, MediaPipe, and PyAutoGUI to capture the video, provide ready-to-use ML solutions, and programmatically control the keyboard and mouse. Hand gestures are used as the input, providing natural interaction and reducing external hardware interaction. The whole process is divided into two steps. First, gesture recognition through the camera is done by OpenCV, MediaPipe helps to identify the gesture by its position, and the respective command is executed. Second, PyAutoGUI is used to automate the keyboard and control the media player.
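A minimal sketch of the OpenCV + MediaPipe + PyAutoGUI pipeline this abstract outlines; the finger-count-to-hotkey mapping and the absence of debouncing are simplifications of my own, not the authors' code:

```python
# Count extended fingers with MediaPipe Hands and fire media hotkeys.
import cv2
import mediapipe as mp
import pyautogui

hands = mp.solutions.hands.Hands(max_num_hands=1)
TIPS, PIPS = [8, 12, 16, 20], [6, 10, 14, 18]   # fingertip / middle-joint ids

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        # a finger counts as extended when its tip sits above its middle joint
        # (thumb ignored here; a real controller would also debounce presses)
        fingers = sum(lm[t].y < lm[p].y for t, p in zip(TIPS, PIPS))
        if fingers == 0:
            pyautogui.press("space")        # fist: play / pause
        elif fingers == 2:
            pyautogui.hotkey("ctrl", "up")  # two fingers: volume up
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```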
6

Chakradhar, K. S., Lakshmi Prasanna Rejinthala, chowdary Sree Rama Brunda salla, Bharathi Pola, and Bhargava Petlu. "Controlling Media Player Using Hand Gestures." Electrical and Automation Engineering 2, no. 1 (April 1, 2023): 45–54. http://dx.doi.org/10.46632/ese/2/1/7.

Abstract:
Computer usage is increasing rapidly day by day, but input devices are limited, and to access them we need to be near the screen. To overcome this problem and control the screen, we can use hand gestures, with a different hand gesture for every operation. We propose a Python program to control the media player through hand gestures. In this method, we use libraries like OpenCV, MediaPipe, and PyAutoGUI to capture the video, provide ready-to-use ML solutions, and programmatically control the keyboard and mouse. Hand gestures are used as the input, providing natural interaction and reducing external hardware interaction. The whole process is divided into two steps. First, gesture recognition through the camera is done by OpenCV, MediaPipe helps to identify the gesture by its position, and the respective command is executed. Second, PyAutoGUI is used to automate the keyboard and control the media player.
7

Labhane, Nishant M., Prashant Harsh, and Meghan Kulkarni. "Multipoint Hand Gesture Recognition For Controlling Bot." International Journal of Scientific Research 1, no. 1 (June 1, 2012): 46–48. http://dx.doi.org/10.15373/22778179/jun2012/16.

8

Monisha Sampath, Priyadarshini Velraj, Vaishnavii Raghavendran, and M Sumithra. "Controlling media player using hand gestures with VLC media player." World Journal of Advanced Research and Reviews 14, no. 3 (June 30, 2022): 466–72. http://dx.doi.org/10.30574/wjarr.2022.14.3.0565.

Abstract:
In today's world, everyone opts for instant interaction with complex systems that ensure a quick response. Thus, with increasing improvement in technology, response time and ease of operation are the issues. This is where human-computer interaction comes into play. This interaction is unrestricted and challenges the usual input devices such as the keyboard and mouse. Gesture recognition has been gaining much attention. Gestures are instinctive and are often used in everyday interactions; therefore, communicating with computer systems using gestures creates a whole new trend of interaction. In this project, with the help of computer vision and deep learning techniques, user hand movements (gestures) are used in real time to control the media player. Seven gestures are defined to control media players with hand gestures. The proposed web application permits users to use their local device camera to identify their gesture and execute control over the media player and similar applications (with no extra hardware). It increases performance and makes interaction convenient by letting users control their PC or laptop from a distance.
9

Holder, Sherrie, and Leia Stirling. "Effect of Gesture Interface Mapping on Controlling a Multi-degree-of-freedom Robotic Arm in a Complex Environment." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 64, no. 1 (December 2020): 183–87. http://dx.doi.org/10.1177/1071181320641045.

Abstract:
There are many robotic scenarios that require real-time function in large or unconstrained environments, for example, the robotic arm on the International Space Station (ISS). Fully wearable gesture control systems are well suited to human-robot interaction scenarios where users are mobile and must have their hands free. A human study examined operation of a simulated ISS robotic arm using three different gesture input mappings compared to the traditional joystick interface. Two gesture mappings permitted multiple simultaneous inputs (multi-input), while the third was a single-input method. Experimental results support performance advantages of multi-input gesture methods over single input. Differences between the two multi-input methods in task completion and workload display an effect of user-directed attention on interface success. Mappings based on natural human arm movement are promising for gesture interfaces in mobile robotic applications. This study also highlights challenges in gesture mapping, including how users align gestures with their body and environment.
10

Lorang, Emily, Audra Sterling, and Bianca Schroeder. "Maternal Responsiveness to Gestures in Children With Down Syndrome." American Journal of Speech-Language Pathology 27, no. 3 (August 6, 2018): 1018–29. http://dx.doi.org/10.1044/2018_ajslp-17-0138.

Abstract:
Purpose This study compared gesture use in young children with Down syndrome (DS) and typical development (TD) as well as how mothers respond to child gestures based on child age and diagnosis. Method Twenty-two mother–child dyads with DS and 22 mother–child dyads with TD participated. The child participants were between 22 and 63 months and were matched on chronological age. We coded child gesture use and whether mothers recoded child gestures (i.e., provided a verbal translation) during naturalistic interactions. Results The children with DS used more gestures than peers with TD. After controlling for expressive language ability, the two groups were not significantly different on child gesture use. Regardless of child diagnosis, mothers recoded approximately the same percentage of child gestures. There was a significant interaction between child diagnosis and child age when predicting the percentage of maternal gesture recodes; mothers of children with DS did not demonstrate differences in the percentage of maternal gesture recodes based on child age, but there was a negative relationship between the percentage of maternal gesture recodes and child age for the children with TD. Conclusions Young children with DS gesture more than chronological age–matched children with TD, therefore providing numerous opportunities for caregivers to recode child gestures and support language development. Early intervention should focus on increasing parent responsiveness to child gestures earlier in life in order to provide additional word-learning opportunities for children with DS.
11

Mahna, Shivanku, Ketan Sethi, and Sravan Ch. "Controlling Mouse using Hand Gesture Recognition." International Journal of Computer Applications 113, no. 5 (March 18, 2015): 1–4. http://dx.doi.org/10.5120/19819-1652.

12

Yu, Hengcheng, and Zhengyu Chen. "Research on contactless control of elevator based on machine vision." Highlights in Science, Engineering and Technology 7 (August 3, 2022): 89–94. http://dx.doi.org/10.54097/hset.v7i.1022.

Abstract:
Aiming at the problem of cross-infection caused by shared elevator buttons during the COVID-19 epidemic, a non-contact elevator button control gesture recognition system based on machine vision is designed. To improve the detection speed of gesture recognition, an improved YOLOv5_shff algorithm is proposed that combines Spatial Pyramid Pooling (SPP) and replaces the Backbone in YOLOv5 with the lightweight model ShuffleNetV2. After testing, in the task of recognizing gestures, the detection speed of the YOLOv5_shff algorithm is 14% higher than the original model, and the detection accuracy is 0.1% higher. Taking the improved YOLOv5_shff algorithm as the core, a gesture recognition system that can be applied to elevator button control is designed. The experimental data show that the gesture recognition accuracy for controlling elevator buttons reaches 99.3%, which meets the requirements of non-contact control of public elevators.
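For orientation, consuming detections from a custom-trained YOLOv5-family model could look roughly like this; the weights file, class names, confidence threshold, and floor mapping are assumptions, and the paper's YOLOv5_shff modifications (SPP, ShuffleNetV2 backbone) are not reproduced here:

```python
# Map the most confident detected gesture to an elevator floor request.
import torch

model = torch.hub.load("ultralytics/yolov5", "custom", path="gestures.pt")
GESTURE_TO_FLOOR = {"one": 1, "two": 2, "three": 3}  # assumed class names

def floor_from_frame(frame):
    """Return the requested floor, or None if no confident gesture is seen."""
    det = model(frame).pandas().xyxy[0]          # one DataFrame per image
    det = det[det["confidence"] > 0.8]           # assumed threshold
    if det.empty:
        return None
    name = det.sort_values("confidence").iloc[-1]["name"]
    return GESTURE_TO_FLOOR.get(name)
```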
13

Thorat, Sakshi. "HCI Based Virtual Controlling System." International Journal for Research in Applied Science and Engineering Technology 10, no. 6 (June 30, 2022): 630–35. http://dx.doi.org/10.22214/ijraset.2022.43645.

Abstract:
In this research project, researchers around the globe are working on making devices more interactive and able to function with minimal physical contact. The proposed system is an interactive computer system that can operate without a physical keyboard or mouse. This system will benefit everyone, particularly immobilized people with special needs who have trouble operating a physical keyboard and mouse. The system provides an interface that uses visual hand-gesture analysis; these gestures are used to assist those who have trouble controlling or operating computers or gadgets. The model is developed in such a way that it can assist in recognizing gestures and acting on them. Regarding hand gestures, the interface uses OpenCV, Python, and computer vision algorithms that can detect different finger orientations, distinguish the user's hand from the background, and distinguish significant hand movements from unwanted ones. Keywords: Machine Learning, Computer Vision, Image Processing, OpenCV, Human-Computer Interaction (HCI)
14

Valle, Chelsea La, Karen Chenausky, and Helen Tager-Flusberg. "How do minimally verbal children and adolescents with autism spectrum disorder use communicative gestures to complement their spoken language abilities?" Autism & Developmental Language Impairments 6 (January 2021): 239694152110350. http://dx.doi.org/10.1177/23969415211035065.

Abstract:
Background and aims Prior work has examined how children and adolescents with autism spectrum disorder who are minimally verbal use their spoken language abilities during interactions with others. However, social communication includes other aspects beyond speech. To our knowledge, no studies have examined how minimally verbal children and adolescents with autism spectrum disorder are using their gestural communication during social interactions. Such work can provide important insights into how gestures may complement their spoken language abilities. Methods Fifty minimally verbal children and adolescents with autism spectrum disorder participated (mean age = 12.41 years; 38 males). Gestural communication was coded from the Autism Diagnostic Observation Schedule. Children (n = 25) and adolescents (n = 25) were compared on their production of gestures, gesture–speech combinations, and communicative functions. Communicative functions were also assessed by the type of communication modality: gesture, speech, and gesture–speech, to examine the range of communicative functions across different modalities of communication. To explore the role gestures may play, the relation between speech utterances and gestural production was investigated. Results Analyses revealed that (1) minimally verbal children and adolescents with autism spectrum disorder did not differ in their total number of gestures. The most frequently produced gesture across children and adolescents was a reach gesture, followed by a point gesture (deictic gesture), and then conventional gestures. However, adolescents produced more gesture–speech combinations (reinforcing gesture–speech combinations) and displayed a wider range of communicative functions. (2) Overlap was found in the types of communicative functions expressed across different communication modalities. However, requests were conveyed via gesture more frequently compared to speech or gesture–speech. In contrast, dis/agree/acknowledging and responding to a question posed by the conversational partner was expressed more frequently via speech compared to gesture or gesture–speech. (3) The total number of gestures was negatively associated with total speech utterances after controlling for chronological age, receptive communication ability, and nonverbal IQ. Conclusions Adolescents may be employing different communication strategies to maintain the conversational exchange and to further clarify the message they want to convey to the conversational partner. Although overlap occurred in communicative functions across gesture, speech, and gesture–speech, nuanced differences emerged in how often they were expressed across different modalities of communication. Given their speech production abilities, gestures may play a compensatory role for some individuals with autism spectrum disorder who are minimally verbal. Implications Findings underscore the importance of assessing multiple modalities of communication to provide a fuller picture of their social communication abilities. Our results identified specific communicative strengths and areas for growth that can be targeted and expanded upon within gesture and speech to optimize social communication development.
15

KALLIO, SANNA, JUHA KELA, PANU KORPIPÄÄ, and JANI MÄNTYJÄRVI. "USER INDEPENDENT GESTURE INTERACTION FOR SMALL HANDHELD DEVICES." International Journal of Pattern Recognition and Artificial Intelligence 20, no. 04 (June 2006): 505–24. http://dx.doi.org/10.1142/s0218001406004776.

Abstract:
Accelerometer-based gesture recognition facilitates a complementary interaction modality for controlling mobile devices and home appliances. Using gestures for the task of home appliance control requires use of the same device and gestures by different persons, i.e. user independent gesture recognition. The practical application in small embedded low-resource devices also requires high computational performance. The user independent gesture recognition accuracy was evaluated with a set of eight gestures and seven users, with a total of 1120 gestures in the dataset. Twenty-state continuous HMM yielded an average of 96.9% user independent recognition accuracy, which was cross-validated by leaving one user in turn out of the training set. Continuous and discrete five-state HMM computational performances were compared with a reference test in a PC environment, indicating that discrete HMM is 20% faster. Computational performance of discrete five-state HMM was evaluated in an embedded hardware environment with a 104 MHz ARM-9 processor and Symbian OS. The average recognition time per gesture calculated from 1120 gesture repetitions was 8.3 ms. With this result, the computational performance difference between the compared methods is considered insignificant in terms of practical application. Continuous HMM is hence recommended as a preferred method due to its better suitability for a continuous-valued signal, and better recognition accuracy. The results suggest that, according to both evaluation criteria, HMM is feasible for practical user independent gesture control applications in mobile low-resource embedded environments.
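A toy version of the per-gesture continuous-HMM scheme evaluated here, using hmmlearn: one 20-state Gaussian HMM is trained per gesture class, and the model with the highest log-likelihood wins. The data shapes and all hyperparameters other than the 20 states are assumptions:

```python
# One GaussianHMM per gesture; classify a new sequence by max log-likelihood.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(gesture_data, n_states=20):
    """gesture_data: dict mapping name -> list of (T, 3) accelerometer arrays."""
    models = {}
    for name, seqs in gesture_data.items():
        X = np.vstack(seqs)                       # stack all training sequences
        lengths = [len(s) for s in seqs]          # per-sequence lengths for fit()
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=20)
        m.fit(X, lengths)
        models[name] = m
    return models

def classify(models, seq):
    """seq: one (T, 3) accelerometer sequence; returns best-scoring gesture name."""
    return max(models, key=lambda name: models[name].score(seq))
```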
16

Purushothaman, Amritha, and Suja Palaniswamy. "Development of Smart Home Using Gesture Recognition for Elderly and Disabled." Journal of Computational and Theoretical Nanoscience 17, no. 1 (January 1, 2020): 177–81. http://dx.doi.org/10.1166/jctn.2020.8647.

Abstract:
Smart home has gained popularity not only as a luxury but also due to its numerous advantages. It is especially useful for senior citizens and children with disabilities. In this work, home automation is achieved using gestures for controlling appliances. Gesture recognition is an area in which a lot of research and innovation is blooming. This paper discusses the development of a wearable device which captures hand gestures. The wearable device uses an accelerometer and gyroscopes to sense and capture tilting, rotation and acceleration of the hand movement. Four different hand gestures are captured using this wearable device, and a machine learning algorithm, namely the Support Vector Machine, has been used for classification of gestures to control the ON/OFF state of appliances.
17

Huu, Phat Nguyen, and Tan Phung Ngoc. "Hand Gesture Recognition Algorithm Using SVM and HOG Model for Control of Robotic System." Journal of Robotics 2021 (June 16, 2021): 1–13. http://dx.doi.org/10.1155/2021/3986497.

Abstract:
In this study, we propose a gesture recognition algorithm using support vector machines (SVM) and the histogram of oriented gradients (HOG). We also use a CNN model to classify gestures, and we select techniques suited to applying gesture control to a robotic system. The goal of the algorithm is to detect gestures with real-time processing speed, minimize interference, and reduce the ability to capture unintentional gestures. Static gesture controls used in this study include on, off, increasing, and decreasing. Motion gestures are also used, including toggling the status switch and increasing and decreasing the volume. Results show that the algorithm reaches up to 99% accuracy with a 70-millisecond execution time per frame, which is suitable for industrial applications.
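The HOG + SVM half of such a pipeline can be approximated with scikit-image and scikit-learn as below; the image size and HOG parameters are assumptions rather than the paper's settings:

```python
# HOG features + RBF-kernel SVM for static gesture classification.
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import SVC

def hog_features(img, size=(64, 64)):
    gray = resize(img, size)        # expects a 2-D grayscale array
    return hog(gray, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def train(images, labels):
    X = np.array([hog_features(im) for im in images])
    return SVC(kernel="rbf", C=10.0).fit(X, labels)

# usage: predicted = train(train_imgs, y).predict([hog_features(test_img)])
```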
18

Lu, Cong, Haoyang Zhang, Yu Pei, Liang Xie, Ye Yan, Erwei Yin, and Jing Jin. "Online Hand Gesture Detection and Recognition for UAV Motion Planning." Machines 11, no. 2 (February 1, 2023): 210. http://dx.doi.org/10.3390/machines11020210.

Abstract:
Recent advances in hand gesture recognition have produced more natural and intuitive methods of controlling unmanned aerial vehicles (UAVs). However, in unknown and cluttered environments, UAV motion planning requires the assistance of hand gesture interaction in complex flight tasks, which remains a significant challenge. In this paper, a novel framework based on hand gesture interaction is proposed to support efficient and robust UAV flight. A cascading structure, which includes Gaussian Naive Bayes (GNB) and Random Forest (RF), was designed to classify hand gestures based on the Six Degrees of Freedom (6DoF) inertial measurement units (IMUs) of the data glove. The hand gestures were mapped onto the UAV's flight commands, which corresponded to the direction of the UAV flight. The experimental results, which tested the 10 evaluated hand gestures, revealed the high accuracy of online hand gesture recognition under asynchronous detection (92%), and relatively low latency for interaction (average recognition time of 7.5 ms; average total time of 3 s). The average time of the UAV's complex flight task was about 8 s shorter than that of synchronous hand gesture detection and recognition. The proposed framework was validated as efficient and robust, with extensive benchmark comparisons in various complex real-world environments.
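One plausible reading of the GNB + RF cascading structure is a fast Naive Bayes gate that rejects low-confidence IMU windows before a Random Forest makes the final call; the threshold, feature layout, and this exact division of labour are my assumptions, not the paper's published design:

```python
# Two-stage cascade: GaussianNB gate, RandomForest final classifier.
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

class GestureCascade:
    def __init__(self, gate_threshold=0.6):
        self.gate = GaussianNB()
        self.forest = RandomForestClassifier(n_estimators=100)
        self.threshold = gate_threshold

    def fit(self, X, y):
        self.gate.fit(X, y)
        self.forest.fit(X, y)
        return self

    def predict_one(self, x):
        """x: 1-D feature vector from a 6-DoF IMU window; None = no gesture."""
        proba = self.gate.predict_proba([x])[0]
        if proba.max() < self.threshold:
            return None                    # reject: likely not a gesture
        return self.forest.predict([x])[0]
```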
19

RAUTARAY, SIDDHARTH S., and ANUPAM AGRAWAL. "VISION-BASED APPLICATION-ADAPTIVE HAND GESTURE RECOGNITION SYSTEM." International Journal of Information Acquisition 09, no. 01 (March 2013): 1350007. http://dx.doi.org/10.1142/s0219878913500071.

Abstract:
With the increasing role of computing devices, facilitating natural human-computer interaction (HCI) will have a positive impact on their usage and acceptance as a whole. For a long time, research on HCI was restricted to techniques based on the use of the keyboard, mouse, etc. Recently, this paradigm has changed: techniques such as vision, sound, and speech recognition allow for a much richer form of interaction between the user and machine, with the emphasis on providing a natural form of interface. Gestures are one of the natural forms of interaction between humans, and as gesture commands are found to be natural for humans, the development of gesture control systems for controlling devices has become a popular research topic in recent years. Researchers have proposed different gesture recognition systems which act as an interface for controlling applications. One drawback of present gesture recognition systems is application dependence, which makes it difficult to transfer one gesture control interface to different applications. This paper focuses on designing a vision-based hand gesture recognition system which is adaptive to different applications, thus making the gesture recognition system application-adaptive. The designed system comprises different processing steps, like detection, segmentation, tracking, and recognition. To make the system application-adaptive, different quantitative and qualitative parameters have been taken into consideration. The quantitative parameters include the gesture recognition rate, the features extracted, and the root mean square error of the system, while the qualitative parameters include intuitiveness, accuracy, stress/comfort, computational efficiency, user's tolerance, and real-time performance. These parameters have a vital impact on the performance of the proposed application-adaptive hand gesture recognition system.
20

Narayanpethkar, Sangamesh. "Computer Vision based Media Control using Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 11, no. 5 (May 31, 2023): 6642–46. http://dx.doi.org/10.22214/ijraset.2023.52881.

Abstract:
Hand gestures are a form of nonverbal communication that can be used in several fields such as communication between deaf-mute people, robot control, human-computer interaction (HCI), home automation and medical applications. In this day and age, working with a computer in some capacity is a common task, and in most situations the keyboard and mouse are the primary input devices. However, there are several problems associated with excessive usage of the same interaction medium, such as health problems brought on by continuous use of input devices. Humans basically communicate using gestures, and it is indeed one of the best ways to communicate. Gesture-based real-time recognition systems have received great attention in recent years because of their ability to interact with systems efficiently through human-computer interaction. This project implements computer vision and gesture recognition techniques and develops a vision-based, low-cost input software for controlling the media player through gestures.
21

Moldovan, Constantin Catalin, and Ionel Staretu. "Real-Time Gesture Recognition for Controlling a Virtual Hand." Advanced Materials Research 463-464 (February 2012): 1147–50. http://dx.doi.org/10.4028/www.scientific.net/amr.463-464.1147.

Abstract:
Object tracking in three-dimensional environments is an area of research that has attracted a lot of attention lately for its potential regarding the interaction between man and machine. Hand gesture detection and recognition, in real time, from a video stream plays a significant role in human-computer interaction and, for current digital image processing applications, represents a difficult task. This paper aims to present a new method for human hand control in virtual environments, eliminating the need for the external devices currently used for hand motion capture and digitization. A first step in this direction is the detection of the human hand, followed by the detection of gestures and their use to control a virtual hand in a virtual environment.
22

Tawfeeq, Mohammed, and Ayam Abbass. "Control of Robot Directions Based on Online Hand Gestures." Iraqi Journal for Electrical and Electronic Engineering 14, no. 1 (June 1, 2018): 41–50. http://dx.doi.org/10.37917/ijeee.14.1.5.

Abstract:
The evolution of wireless communication technology increases human-machine interaction capabilities, especially in controlling robotic systems. This paper introduces an effective wireless system for controlling the directions of a wheeled robot based on online hand gestures. The hand gesture images are captured and processed to be recognized and classified using a neural network (NN). The NN is trained using extracted features to distinguish five different gestures; accordingly, it produces five different signals. These signals are transmitted to control the directions of the cited robot. The main contribution of this paper is that the technique used to recognize hand gestures requires only two features, and these features can be extracted in a very short time using a quite simple methodology, which makes the proposed technique well suited for online interaction. In this methodology, the preprocessed image is partitioned column-wise into two half segments; from each half one feature is extracted. This feature represents the ratio of white to black pixels of the segment histogram. The NN showed very high accuracy in recognizing all of the proposed gesture classes. The NN output signals are transmitted to the robot microcontroller wirelessly using Bluetooth; accordingly, the microcontroller guides the robot in the desired direction. The overall system showed high performance in controlling the robot's movement directions.
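The two-feature extraction step is simple enough to reconstruct directly: split the binarized hand image into left and right halves and take each half's white-to-black pixel ratio. The MLP below stands in for the paper's unspecified NN architecture, so treat it as a sketch under those assumptions:

```python
# Two histogram-ratio features per binarized hand image, fed to a small NN.
import numpy as np
from sklearn.neural_network import MLPClassifier

def two_features(binary_img):
    """binary_img: 2-D array of 0s (black) and 1s (white)."""
    h, w = binary_img.shape
    feats = []
    for half in (binary_img[:, : w // 2], binary_img[:, w // 2 :]):
        white = half.sum()
        black = half.size - white
        feats.append(white / max(black, 1))   # avoid division by zero
    return np.array(feats)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000)
# clf.fit(np.array([two_features(im) for im in train_imgs]), train_labels)
```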
23

Oppenheim, Matthew. "HeadBanger: controlling switchable software with head gesture." Journal of Assistive Technologies 10, no. 1 (March 21, 2016): 2–10. http://dx.doi.org/10.1108/jat-04-2015-0015.

Abstract:
Purpose – The purpose of this paper is to present a novel non-contact method of using head movement to control software without the need for wearable devices. Design/methodology/approach – A webcam and software are used to track head position. When the head is moved through a virtual target, a keystroke is simulated. The system was assessed by participants with impaired mobility using Sensory Software’s Grid 2 software as a test platform. Findings – The target user group could effectively use this system to interact with switchable software. Practical implications – Physical head switches could be replaced with virtual devices, reducing fatigue and dissatisfaction. Originality/value – Using a webcam to control software using head gestures where the participant does not have to wear any specialised technology or a marker. This system is shown to be of benefit to motor impaired participants for operating switchable software.
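A rough sketch of such a webcam "virtual switch": when the tracked head/face centre enters a target region, a keystroke is simulated. The Haar-cascade face detector and the hard-coded target rectangle are stand-ins of my own; the paper's actual head-tracking method is not reproduced here:

```python
# Fire a single keystroke when the face centre enters a virtual target region.
import cv2
import pyautogui

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
TARGET = (400, 100, 600, 300)          # x1, y1, x2, y2 of the virtual switch

cap = cv2.VideoCapture(0)
inside = False                          # edge-trigger so we press only once
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        cx, cy = x + w // 2, y + h // 2
        hit = TARGET[0] < cx < TARGET[2] and TARGET[1] < cy < TARGET[3]
        if hit and not inside:
            pyautogui.press("enter")   # simulate the switch keystroke
        inside = hit
    if cv2.waitKey(1) & 0xFF == 27:    # Esc to quit
        break
cap.release()
```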
24

Gentilucci, Maurizio, Elisa De Stefani, and Alessandro Innocenti. "From Gesture to Speech." Biolinguistics 6, no. 3-4 (November 28, 2012): 338–53. http://dx.doi.org/10.5964/bioling.8925.

Abstract:
One of the major problems concerning the evolution of human language is to understand how sounds became associated to meaningful gestures. It has been proposed that the circuit controlling gestures and speech evolved from a circuit involved in the control of arm and mouth movements related to ingestion. This circuit contributed to the evolution of spoken language, moving from a system of communication based on arm gestures. The discovery of the mirror neurons has provided strong support for the gestural theory of speech origin because they offer a natural substrate for the embodiment of language and create a direct link between sender and receiver of a message. Behavioural studies indicate that manual gestures are linked to mouth movements used for syllable emission. Grasping with the hand selectively affected movement of inner or outer parts of the mouth according to syllable pronunciation and hand postures, in addition to hand actions, influenced the control of mouth grasp and vocalization. Gestures and words are also related to each other. It was found that when producing communicative gestures (emblems) the intention to interact directly with a conspecific was transferred from gestures to words, inducing modification in voice parameters. Transfer effects of the meaning of representational gestures were found on both vocalizations and meaningful words. It has been concluded that the results of our studies suggest the existence of a system relating gesture to vocalization which was precursor of a more general system reciprocally relating gesture to word.
25

Barrett, Anna M., Anne L. Foundas, and Kenneth M. Heilman. "Speech and gesture are mediated by independent systems." Behavioral and Brain Sciences 28, no. 2 (April 2005): 125–26. http://dx.doi.org/10.1017/s0140525x05220034.

Abstract:
Arbib suggests that language emerged in direct relation to manual gestural communication, that Broca's area participates in producing and imitating gestures, and that emotional facial expressions contributed to gesture-language coevolution. We discuss functional and structural evidence supporting localization of the neuronal modules controlling limb praxis, speech and language, and emotional communication. Current evidence supports completely independent limb praxis and speech/language systems.
26

Rana, Harsh Ganpatbhai, and Aayushi Sanjay Soni. "An IoT based Game Controlling Device (JoyGlove)." International Journal on Recent and Innovation Trends in Computing and Communication 8, no. 4 (April 30, 2020): 09–11. http://dx.doi.org/10.17762/ijritcc.v8i4.5366.

Abstract:
Nowadays, there is a huge inclination of youngsters and adults towards video games, and of course everyone has experienced them in childhood. However, we control games using typical input devices such as the mouse, keyboard, and joystick; how about controlling the game using our hand gestures? These days there are lots of game controllers that enhance the gaming experience; however, they are quite expensive too. Through this project we have designed our own game-controlling glove using Arduino. In addition, we have also developed a car game using UNITY 3D. The areas of IoT and hand gesture recognition are explored by this project.
27

Shukla, Anita, Ankit Jain, Prajjwal Mishra, and Rahul Kushwaha. "Human Gesture Controlled Car Robot." SAMRIDDHI : A Journal of Physical Sciences, Engineering and Technology 11, no. 02 (December 25, 2019): 115–22. http://dx.doi.org/10.18090/samriddhi.v11i02.5.

Abstract:
In the present era of development and growth, technology has made it possible for people to operate electronic devices more conveniently. We are now able to operate machines without touching them, with the help of a technology called hand gesture recognition. Here we have devised a gesture-controlled car robot which uses a PIC16F738 microcontroller and an accelerometer to achieve human-computer interaction. In this paper we deal with the development and implementation of a wireless, accelerometer-based, hand-gesture-controlled car robot using an RF transmitter and receiver. The transmitter detects the movement of the hand and sends the command to the receiver by RF; the receiver receives the command and moves the robot accordingly. Apart from the conventional approach of controlling car robots via buttons and the like, here we have developed a definite and effective algorithm for identification of the gestures.
28

Yang, Geng, Honghao Lv, Feiyu Chen, Zhibo Pang, Jin Wang, Huayong Yang, and Junhui Zhang. "A Novel Gesture Recognition System for Intelligent Interaction with a Nursing-Care Assistant Robot." Applied Sciences 8, no. 12 (November 22, 2018): 2349. http://dx.doi.org/10.3390/app8122349.

Abstract:
The expansion of nursing-care assistant robots in smart infrastructure has provided more applications for homecare services, which has raised new demands for smart and natural interaction between humans and robots. This article proposed an innovative hand motion trajectory (HMT) gesture recognition system based on background velocity features. Here, a new wearable wrist-worn camera prototype for gesture’s video collection was designed, and a new method for the segmentation of continuous gestures was shown. Meanwhile, a nursing-care assistant robot prototype was designed for assisting the elderly, which is capable of carrying the elderly with omnidirectional motion and grabbing the specified object at home. In order to evaluate the performance of the gesture recognition system, 10 special gestures were defined as the move commands for interaction with the robot, and 1000 HMT gesture samples were obtained from five subjects for leave-one-subject-out (LOSO) cross-validation classification with an average recognition accuracy of up to 97.34%. Moreover, the performance and practicability of the proposed system were further demonstrated by controlling the omnidirectional movement of the nursing-care assistant robot using the predefined gesture commands.
29

Chu, T. S., A. Y. Chua, and Emanuele Lindo Secco. "A Wearable MYO Gesture Armband Controlling Sphero BB-8 Robot." HighTech and Innovation Journal 1, no. 4 (December 1, 2020): 179–86. http://dx.doi.org/10.28991/hij-2020-01-04-05.

Abstract:
In this paper we present the development and preliminary validation of a wearable system which is combined with an algorithm interfacing the MYO gesture armband with a Sphero BB-8 robotic device. The MYO armband is a wearable device which measures real-time EMG signals of the end user's forearm muscles as the user is executing a set of upper limb gestures. These gestures are interpreted and transmitted to a computing hardware via a Bluetooth Low Energy IEEE 802.15.1 wireless protocol. The algorithm analyzes and sorts the data and sends a set of commands to the Sphero robotic device while performing navigation movements. After designing and integrating the software and hardware architecture, we have validated the system with two sets of trials involving a series of commands performed in multiple iterations. The consequent reactions of the robots, due to these commands, were recorded and the performance of the system was analyzed in a confusion matrix to obtain an average accuracy of the system outcome vs. the expected and desired actions. Results show that our integrated system can satisfactorily interface with the system in an intuitive way with an accuracy rating of 85.7 % and 92.9 % for the two tests, respectively.
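The quoted accuracies come from confusion matrices of expected versus performed commands; with scikit-learn and hypothetical command labels, that computation reduces to a few lines:

```python
# Accuracy from a confusion matrix: correct commands / all commands.
from sklearn.metrics import confusion_matrix

expected  = ["fwd", "fwd", "left", "right", "stop", "left", "fwd"]
performed = ["fwd", "left", "left", "right", "stop", "left", "fwd"]

cm = confusion_matrix(expected, performed)
accuracy = cm.trace() / cm.sum()       # diagonal = correctly executed commands
print(cm, f"accuracy = {accuracy:.1%}")
```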
30

ZAMMIT, MARIA, and GRAHAM SCHAFER. "Maternal label and gesture use affects acquisition of specific object names." Journal of Child Language 38, no. 1 (March 10, 2010): 201–21. http://dx.doi.org/10.1017/s0305000909990328.

Abstract:
Ten mothers were observed prospectively, interacting with their infants aged 0;10 in two contexts (picture description and noun description). Maternal communicative behaviours were coded for volubility, gestural production and labelling style. Verbal labelling events were categorized into three exclusive categories: label only; label plus deictic gesture; label plus iconic gesture. We evaluated the predictive relations between maternal communicative style and children's subsequent acquisition of ten target nouns. Strong relations were observed between maternal communicative style and children's acquisition of the target nouns. Further, even controlling for maternal volubility and maternal labelling, maternal use of iconic gestures predicted the timing of acquisition of nouns in comprehension. These results support the proposition that maternal gestural input facilitates linguistic development, and suggest that such facilitation may be a function of gesture type.
31

Rani, Mrs E. Uma, P. Swarag Reddy, Sampath Valluri, Anil Chavalla, and Manda Anvesh. "Face Gesture Recognition for Amputees Like Eye Winking for Controlling Mouse Events." International Journal of Research Publication and Reviews 4, no. 6 (June 13, 2023): 3033–36. http://dx.doi.org/10.55248/gengpi.4.623.46777.

32

KAMADA, Shoichiro, Youngwoo KIM, Goro OBINATA, and Kazunori HASE. "1A1-D09 New gesture interface for controlling manipulators." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2008 (2008): _1A1—D09_1—_1A1—D09_4. http://dx.doi.org/10.1299/jsmermd.2008._1a1-d09_1.

33

Fernández-Flecha, María, María Blume, Andrea Junyent, and Talía Tijero Neyra. "Gesture development in Peruvian children and its relationship with vocalizations and vocabulary." Gesture 20, no. 1 (November 22, 2021): 1–29. http://dx.doi.org/10.1075/gest.18010.fer.

Abstract:
We examine gestural development, and correlations between gesture types, vocalizations and vocabulary at ages 8 to 15 months, employing data from the MacArthur-Bates Communicative Development Inventories for Peruvian Spanish, in the first such study with Peruvian children. Results show (1) significant change with age in the production of gesture types, with older children producing more; (2) important correlations between gesture types and both vocalization types and vocabulary after controlling for age effects; and (3) correlations between the trajectory of the pointing gesture in its two modalities (whole-hand and index-finger) and age, vocalizations, and vocabulary, an effect that persists with respect to vocalizations after controlling for age. Our findings, based on a sample from a non-WEIRD population, support a key role for gesture production in early communicative and linguistic development.
34

Vázquez, J. Emmanuel, Manuel Martin-Ortiz, Ivan Olmos-Pineda, and Arturo Olvera-Lopez. "Wheelchair Control Based on Facial Gesture Recognition." International Journal of Information Technologies and Systems Approach 12, no. 2 (July 2019): 104–22. http://dx.doi.org/10.4018/ijitsa.2019070106.

Abstract:
In this article, an approach for controlling a wheelchair using gestures from the user's face is presented; in particular, commands for the basic control operations required for driving a wheelchair are recognized. In order to recognize the face gestures, an Artificial Neural Network is trained, since it is one of the most successful classifiers in Pattern Recognition. The authors' proposed method is particularly useful for controlling a wheelchair when the user has restricted (or zero) mobility in some parts of the body, such as the legs, arms or hands. According to their experimental results, the proposed approach provides a successful tool for controlling a wheelchair through a Natural User Interface based on machine learning.
35

Swamy, J. Kumara, and Mrs Navya V. K. "AI Based Virtual Mouse with Hand Gesture and AI Voice Assistant Using Computer Vision and Neural Networks." International Journal for Research in Applied Science and Engineering Technology 11, no. 8 (August 31, 2023): 1651–61. http://dx.doi.org/10.22214/ijraset.2023.55412.

Abstract:
The use of hand gesture recognition in controlling virtual devices has become popular due to the advancement of artificial intelligence technology. A hand gesture-controlled virtual mouse system that utilizes AI algorithms to recognize hand gestures and translate them into mouse movements is proposed in this paper. The system is designed to provide an alternative interface for people who have difficulty using a traditional mouse or keyboard. The proposed system uses a camera to capture images of the user's hand, which are processed by an AI algorithm to recognize the gestures being made. The system is trained using a dataset of hand gestures to recognize different gestures. Once a gesture is recognized, it is translated into a corresponding mouse movement, which is then executed on the virtual screen. The system is designed to be scalable and adaptable to different types of environments and devices. All the input operations can be virtually controlled by using dynamic/static hand gestures along with a voice assistant. In our work we make use of ML and computer vision algorithms to recognize hand gestures and voice commands, which work without any additional hardware requirements. The model is implemented using a CNN and the MediaPipe framework. This system has potential applications such as enabling hands-free operation of devices in hazardous environments and providing an alternative interface to the hardware mouse. Overall, the hand gesture-controlled virtual mouse system offers a promising approach to enhance user experience and improve accessibility through human-computer interaction.
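The core virtual-mouse idea (map the MediaPipe index-fingertip landmark to screen coordinates and synthesize events with PyAutoGUI) can be sketched as follows; the smoothing-free mapping and the pinch-to-click rule are illustrative choices of mine, not the paper's CNN pipeline:

```python
# Move the cursor with the index fingertip; pinch thumb+index to click.
import cv2
import mediapipe as mp
import pyautogui

hands = mp.solutions.hands.Hands(max_num_hands=1)
screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    res = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if res.multi_hand_landmarks:
        lm = res.multi_hand_landmarks[0].landmark
        tip = lm[8]                                   # index fingertip (id 8)
        pyautogui.moveTo(tip.x * screen_w, tip.y * screen_h)
        thumb = lm[4]                                 # thumb tip (id 4)
        if abs(tip.x - thumb.x) + abs(tip.y - thumb.y) < 0.05:
            pyautogui.click()                         # pinch = left click
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```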
36

Wang, Xian, Paula Tarrío, Ana María Bernardos, Eduardo Metola, and José Ramón Casar. "User-independent accelerometer-based gesture recognition for mobile devices." ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal 1, no. 3 (July 1, 2013): 11–25. http://dx.doi.org/10.14201/adcaij20121311125.

Abstract:
Many mobile devices nowadays embed inertial sensors. This enables new forms of human-computer interaction through the use of gestures (movements performed with the mobile device) as a way of communication. This paper presents an accelerometer-based gesture recognition system for mobile devices which is able to recognize a collection of 10 different hand gestures. The system was conceived to be light and to operate in a user-independent manner in real time. The recognition system was implemented in a smartphone and evaluated through a collection of user tests, which showed a recognition accuracy similar to other state-of-the-art techniques and a lower computational complexity. The system was also used to build a human-robot interface that enables controlling a wheeled robot with the gestures made with the mobile phone.
37

Pallavi, Pallavi, Vinod Kumar, Praveen Kumar, and Anuragh Singh. "Gesture Controlled Mouse." International Journal of Online Engineering (iJOE) 13, no. 04 (April 28, 2017): 70. http://dx.doi.org/10.3991/ijoe.v13i04.6898.

Abstract:
The gesture-controlled mouse is a device with which we can move the mouse pointer and change its direction from our wrist. Using an accelerometer, we are able to move the mouse, change its direction, and also change its speed. This is a new technology for controlling a device through wrist movement; after upgrading it, many other useful devices of this kind can be made. It can be used in many places and can be part of the future. The gesture is the main part, as the movements of our body play the important role in the operations. A stationary point is taken as reference, with directions along the x-axis, y-axis, and z-axis, and an accelerometer is used as the direction indicator.
38

Park, Ki-Chang, Seong-Chae Seo, Seung-Moon Jeong, Im-Cheol Kang, and Byung-Gi Kim. "Design of Gesture based Interfaces for Controlling GUI Applications." Journal of the Korea Contents Association 13, no. 1 (January 28, 2013): 55–63. http://dx.doi.org/10.5392/jkca.2013.13.01.055.

39

Güttler, Jörg, Dany Bassily, Christos Georgoulas, Thomas Linner, and Thomas Bock. "Unobtrusive Tremor Detection While Gesture Controlling a Robotic Arm." Journal of Robotics and Mechatronics 27, no. 1 (February 20, 2015): 103–4. http://dx.doi.org/10.20965/jrm.2015.p0103.

Abstract:
[Figure: gesture-based validation] A lightweight robotic arm (Jaco) has been interfaced with a novel gesture detection sensor (Leap Motion Controller), substituting complicated conventional input devices, i.e., joysticks and pads. Due to the enhanced precision and high throughput capabilities of the Leap Motion Controller, an unobtrusive measurement of physiological tremor can be extracted. An algorithm was developed to constantly detect and indicate potential user hand tremor patterns in real time. Additionally, a calibration algorithm was developed to allow an optimum mapping between the user's hand movement, tracked by the Leap Motion Controller, and the Jaco arm, by filtering unwanted oscillations, allowing for a more natural human-computer interaction.
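As an illustration of the tremor-detection idea: physiological hand tremor concentrates roughly in the 6-12 Hz band, so the band's share of spectral power in a tracked position signal can flag it. The sample rate, band edges, and threshold below are assumptions, and the Leap Motion SDK calls are omitted:

```python
# Fraction of motion power in the tremor band of a tracked hand position.
import numpy as np

def tremor_score(positions, fs=100.0, band=(6.0, 12.0)):
    """positions: 1-D array of hand x (or y) samples captured at fs Hz."""
    x = positions - np.mean(positions)           # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2       # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / max(spectrum.sum(), 1e-12)

# flag tremor when, say, more than 40% of motion power sits in the band
# has_tremor = tremor_score(samples) > 0.4
```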
40

Hyusein, Gyulten, and Tilbe Göksun. "The creative interplay between hand gestures, convergent thinking, and mental imagery." PLOS ONE 18, no. 4 (April 6, 2023): e0283859. http://dx.doi.org/10.1371/journal.pone.0283859.

Abstract:
Using hand gestures benefits children's divergent thinking and enhances verbal improvisation in adults. In the present study, we asked whether gestures were also associated with convergent thinking by activating individuals' verbal lexicon and maintaining their visuospatial imagery. We tested young adults on verbal and visual convergent thinking, controlling for their mental imagery skills. Results showed that gestures and mental imagery skills play a role in verbal but not visual convergent thinking. Regardless of whether gestures were spontaneous or encouraged, we found a negative association between overall gesture frequency and verbal convergent thinking for individuals with low mental imagery, and a positive association for individuals with high mental imagery. Representational gestures benefited verbal convergent thinking for everyone except those who had low mental imagery and no experience with the task. Performing beat gestures hampered verbal convergent thinking in people with lower mental imagery capacity and helped those who had high mental imagery and previous experience with the task. We also found that gesturing can benefit people with lower verbal abilities on verbal convergent thinking; however, high spatial imagery abilities were required for gestures to boost verbal convergent thinking. The current study adds a new dimension to both the embodied creativity literature and the kaleidoscope of individual differences in gesture research.
41

Yeh, Shih-Ching, Eric Hsiao-Kuang Wu, Ying-Ru Lee, R. Vaitheeshwari, and Chen-Wei Chang. "User Experience of Virtual-Reality Interactive Interfaces: A Comparison between Hand Gesture Recognition and Joystick Control for XRSPACE MANOVA." Applied Sciences 12, no. 23 (November 29, 2022): 12230. http://dx.doi.org/10.3390/app122312230.

Abstract:
This research intends to understand whether users would adopt the interactive interface of hand gesture recognition for XRSPACE MANOVA in the virtual-reality environment. Unlike traditional joystick control and external sensors, XRSPACE MANOVA’s hand gesture recognition relies on cameras built into the head-mounted display to detect users’ hand gestures and interact with the system, providing a more life-like immersive experience. To better understand whether users would accept this hand gesture recognition, the experiment compares users’ experiences with hand gesture recognition and joystick control for XRSPACE MANOVA while controlling for the effects of gender, college major, and completion time. The results suggest that users of hand gesture recognition have better perceptions of enjoyment, satisfaction, and confirmation: they have a relatively fun and satisfying experience, and their expectations of the system/technology are confirmed by actual usage. Based on the parametric statistical analyses, user assessments of perceived usefulness, perceived ease of use, attitude, and perception of internal control suggest that, in terms of operating performance, users are more accepting of the traditional joystick control. When considering the length of usage time, the study finds that when hand gesture recognition is used for a relatively long time, users’ subjective evaluations of internal control and behavioral intention to use are reduced. This study has, therefore, identified potential issues with hand gesture recognition for XRSPACE MANOVA and discussed how to improve this interactive interface, in the hope that users of hand gesture recognition will obtain the same level of operating experience as with the traditional joystick control.
42

Mohammed Musharaf Z, Meril Akash. J, and M. Malleswari. "Dynamic virtual assistance of I/O functionalities." World Journal of Advanced Engineering Technology and Sciences 8, no. 2 (March 30, 2023): 023–33. http://dx.doi.org/10.30574/wjaets.2023.8.2.0061.

Abstract:
With significant advancements in the engineering industry every day, it has become increasingly important to find new ways of interacting with computer technology and automation as demand for them grows. Today, touch-screen technology is being adopted across devices, although it is not cost-effective in all applications. A specialized system, similar to a virtual device, that provides object tracking and gesture input could be an effective alternative to the standard touch screen and to solid physical gadgets. The goal is to create an object-tracking program that communicates with the computer system. The proposed model is a computer-vision-based control system: hand movements are captured by a digital camera and processed by a hand-detection technique implemented with OpenCV libraries. The project applies gesture recognition, a topic that spans two computer-science fields, augmented reality and human-computer interaction, and implements a virtual gesture system that interprets human gestures through mathematical algorithms. Users can control and interact with the system through simple finger or hand gestures without physically touching it, and voice assistance is included to start and stop the gesture-controlling system. Gesture recognition can be viewed as a way for computers to begin to recognize human body language and signs, filling the void between computing systems and humans better than the earliest text user interfaces or even graphical user interfaces, which still limit most input to a keyboard and mouse and may not always be efficient. The detection algorithm is based on deep learning. The proposed system also helps in pandemic situations such as COVID-19 by reducing human contact with the devices that control the system.
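As a rough illustration of the kind of camera-based loop this abstract describes, the sketch below detects a hand from a webcam and distinguishes an open palm from a fist by counting extended fingers. MediaPipe Hands is assumed as the deep-learning hand detector; the paper names only OpenCV and deep learning, so the specific model and the two-gesture set are placeholders.

```python
# A rough Python sketch of a camera-based gesture loop, assuming MediaPipe
# Hands as the deep-learning hand detector (the paper names only OpenCV and
# deep learning); the open-palm/fist heuristic is a placeholder gesture set.
import cv2
import mediapipe as mp

TIPS, PIPS = [8, 12, 16, 20], [6, 10, 14, 18]  # fingertip / middle-joint ids

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        # A fingertip above its middle joint (smaller y) counts as extended.
        extended = sum(lm[t].y < lm[p].y for t, p in zip(TIPS, PIPS))
        print("open palm" if extended >= 4 else "fist" if extended == 0 else "other")
    cv2.imshow("gesture", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```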
43

Fegade, Toshika, Yogesh Kurle, Sagar Nikale, and Praful Kalpund. "Wireless Gesture Controlled Semi-Humanoid Robot." IAES International Journal of Robotics and Automation (IJRA) 5, no. 4 (December 1, 2016): 237. http://dx.doi.org/10.11591/ijra.v5i4.pp237-243.

Abstract:
Robotics is a field concerned with the “intelligent connection of perception to action”. The most common manufacturing robot is the robotic arm with different degrees of freedom. Today, these humanoids perform many functions to assist humans in different undertakings, such as space missions and driving and monitoring high-speed vehicles. They are called semi-humanoids because they resemble the upper part of the human body. The idea of this paper is to change the perception of controlling a robotic arm. It provides a way to move beyond old-fashioned remote controls and gives an intuitive technique for implementing a semi-humanoid gesture-controlled robot. The system includes two robot arms that closely resemble the human arm (five fingers), increasing the sensitivity of the system, and uses motion sensors: flex sensors and an accelerometer (as used in mobile phones to sense tilt). The system design is divided into three parts: the robotic arm, real-time video, and the platform. The prime aim of the design is that the robot arm and platform start moving as soon as the operator makes a hand or leg gesture. The robotic arm is synchronized with the operator's hand postures, and the platform is controlled by the operator's leg gestures. The robot and the gesture device are connected wirelessly via RF, which enables the user to interact with the robot in an effortless way.
44

Ranjan, Shudhanshu, Sumit Anand, Vinayak Gontia, Aman Chandra, Vineet Sharan, and Spandana SG. "Virtual Mouse Control using Hand Gesture." International Journal of Engineering Research in Computer Science and Engineering 9, no. 11 (November 1, 2022): 20–23. http://dx.doi.org/10.36647/ijercse/09.11.art006.

Abstract:
This research proposes a way to control the cursor's position using only one's hands and no electronic equipment, while actions such as clicking and dragging are performed through various hand gestures. The suggested system requires just a webcam as the input device, along with OpenCV, Python, and a few additional tools.
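A minimal sketch of the webcam-only cursor idea follows, assuming MediaPipe Hands for landmark detection and PyAutoGUI for cursor control; the paper specifies only OpenCV and Python, so both extra libraries and the fingertip-to-screen mapping are assumptions.

```python
# A minimal sketch of the webcam-only virtual mouse, assuming MediaPipe Hands
# for landmarks and PyAutoGUI for the cursor (the paper specifies only OpenCV
# and Python, so both libraries and the fingertip mapping are assumptions).
import cv2
import mediapipe as mp
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror so motion feels natural
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        tip = result.multi_hand_landmarks[0].landmark[8]  # index fingertip
        # Landmarks are normalized to [0, 1]; scale them to screen pixels.
        pyautogui.moveTo(tip.x * SCREEN_W, tip.y * SCREEN_H)
    cv2.imshow("virtual mouse", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```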
45

T. Kalai Selvi, S. Sasirekha, M. Manikandan, M. Obath Solomon, and M. Vignesh. "Virtual Mouse using OpenCV and VNC." June 2023 5, no. 2 (June 2023): 169–79. http://dx.doi.org/10.36548/jitdw.2023.2.007.

Abstract:
Virtual Network Computing (VNC) plays a significant role in advanced remote access by allowing users to remotely control another computer or virtual machine over a network connection. Virtual remote control is the ability to use software with a graphical user interface to remotely operate a computer or virtual machine; one of its key advantages is that it lets users interact with the remote system as if they were physically present at the remote location. This research proposes the design of a virtual mouse that relies on hand gesture recognition using OpenCV and VNC: it tracks hand movements and recognizes gestures to perform mouse tasks. Hand-gesture-recognition technology uses computer-vision algorithms and machine-learning techniques to analyse and interpret human hand movements.
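The abstract does not specify its gesture set, so as one hedged example, the sketch below maps a thumb-index "pinch" (measured on MediaPipe-style normalized landmarks, as in the previous sketch) to a mouse click, firing only on the open-to-pinch transition so a held pinch does not click repeatedly.

```python
# One hedged example of a gesture-to-mouse-event mapping (the paper does not
# specify its gesture set): a thumb-index "pinch" measured on MediaPipe-style
# normalized landmarks triggers a click on the open-to-pinch transition.
import math
import pyautogui

PINCH_THRESHOLD = 0.05  # normalized distance; tune for your camera setup
was_pinching = False

def is_pinching(lm) -> bool:
    """True when thumb tip (4) and index tip (8) nearly touch."""
    return math.hypot(lm[4].x - lm[8].x, lm[4].y - lm[8].y) < PINCH_THRESHOLD

def update(lm) -> None:
    """Call once per frame with the 21 detected hand landmarks."""
    global was_pinching
    pinching = is_pinching(lm)
    if pinching and not was_pinching:  # edge trigger avoids repeated clicks
        pyautogui.click()
    was_pinching = pinching
```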
46

Liu, Yi, Chenglei Dong, Min Zhang, Xinxin Chen, and Yanjun Li. "A Novel Music Player Controlling Design Based on Gesture Recognition." Research Journal of Applied Sciences, Engineering and Technology 7, no. 2 (January 10, 2014): 374–78. http://dx.doi.org/10.19026/rjaset.7.264.

47

Idiatullov, T. T., N. A. Romantsov, and E. B. Chabanenko. "Contactless copying H2C interface for controlling an anthropomorphic robot." Izvestiya MGTU MAMI 8, no. 2-2 (March 20, 2014): 37–43. http://dx.doi.org/10.17816/2074-0530-67647.

Abstract:
The article deals with the problems of creating and operating human-oriented control systems that use contactless gesture interfaces. Existing approaches and assumptions are analyzed, and the most effective approaches for building such systems are highlighted.
48

Vijay Joseph, Arockia, Akshat Mathur, Jatin Verma, and Ankita Singh. "Gesture based wireless control for a robotic manipulator." International Journal of Engineering & Technology 7, no. 2.31 (May 29, 2018): 231. http://dx.doi.org/10.14419/ijet.v7i2.31.13449.

Abstract:
This project plays an important role in complementing the industrial and automation fields. Nowadays, robots are used in several fields of engineering and manufacturing, and the systems for controlling and actuating them have advanced considerably. Using gestures to control the movement of robotic manipulators is the new trend. Existing methodologies include motion tracking, image processing, and Kinect sensors. All of these can be used as a teach pendant, where the operator records a movement as a preset and the manipulator carries out the same motion repetitively; with motion tracking and Kinect sensors, however, the user is bound to a confined area that the cameras can monitor. Here, we propose a wirelessly controlled robotic-arm system for tool handling (pick and place) and many other applications where human reach is impractical. The result is that the gestures of the human hand are in sync with the manipulator's movement. Further, this robotic arm has been mounted beneath a drone, giving it the ability to reach heights that are beyond human reach or would put a human's life in jeopardy. In this case, the user can maneuver the drone along with the manipulator wherever it is used.
49

Glory Veronica, Parsipogu, Ravi Kumar Mokkapati, Lakshmi Prasanna Jagupilla, and Chella Santhosh. "Static Hand Gesture Recognition Using Novel Convolutional Neural Network and Support Vector Machine." International Journal of Online and Biomedical Engineering (iJOE) 19, no. 09 (July 7, 2023): 131–41. http://dx.doi.org/10.3991/ijoe.v19i09.39927.

Abstract:
Hand tracking and identification through visual means pose a challenging problem. To simplify the identification of hand gestures, some systems have incorporated position markers or colored bands, which are not ideal for controlling robots due to their inconvenience. The motion-recognition problem can be solved by combining object identification, recognition, and tracking using image-processing techniques, and a wide variety of target-detection and recognition methods are available. This paper proposes novel CNN-based methods to create a user-free hand-gesture detection system, with synthetic techniques recommended to improve recognition accuracy. The proposed method offers several advantages over existing methods, including higher accuracy and real-time hand-gesture recognition suitable for sign-language recognition and human-computer interaction. The CNN automatically extracts high-level characteristics from the source image, and the SVM is used to classify these features. Unlike conventional feature extractors, this study employed a CNN to automatically extract traits from raw EMG images; the SVM classifier then determines which hand gesture is being made. Our tests demonstrate that the proposed strategy achieves superior accuracy compared to using only a CNN.
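The pipeline described, a CNN as feature extractor feeding an SVM classifier, can be sketched as follows. The architecture, 64x64 grayscale input, five-class label set, and random placeholder data are illustrative assumptions; in practice the CNN would be trained (or pre-trained) before its features are used.

```python
# A sketch of the CNN-feature-extractor + SVM-classifier pipeline the
# abstract describes; the architecture, input size, label count, and random
# placeholder data are all assumptions, not the paper's actual model.
import torch
import torch.nn as nn
from sklearn.svm import SVC

class FeatureCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),  # 32 channels * 16 * 16 = 8192 features at 64x64
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

cnn = FeatureCNN().eval()  # in practice, trained before feature extraction

def extract_features(images: torch.Tensor):
    """(N, 1, 64, 64) gesture images -> (N, 8192) feature matrix."""
    with torch.no_grad():
        return cnn(images).numpy()

# Fit the SVM on CNN features; random tensors stand in for a real dataset.
train_x = torch.rand(100, 1, 64, 64)
train_y = torch.randint(0, 5, (100,)).numpy()  # 5 assumed gesture classes
svm = SVC(kernel="rbf").fit(extract_features(train_x), train_y)
print(svm.predict(extract_features(torch.rand(1, 1, 64, 64))))
```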
50

Khaksar, Siavash, Luke Checker, Bita Borazjan, and Iain Murray. "Design and Evaluation of an Alternative Control for a Quad-Rotor Drone Using Hand-Gesture Recognition." Sensors 23, no. 12 (June 9, 2023): 5462. http://dx.doi.org/10.3390/s23125462.

Abstract:
Gesture recognition is a mechanism by which a system recognizes an expressive and purposeful action made by a user’s body. Hand-gesture recognition (HGR) is a staple of the gesture-recognition literature and has been keenly researched over the past 40 years. Over this time, HGR solutions have varied in medium, method, and application. Modern developments in machine perception have seen the rise of single-camera, skeletal-model hand-gesture identification algorithms such as MediaPipe Hands (MPH). This paper evaluates the applicability of these modern HGR algorithms within the context of alternative control. Specifically, this is achieved through the development of an HGR-based alternative-control system capable of controlling a quad-rotor drone. The technical importance of this paper stems from the results produced during the novel and clinically sound evaluation of MPH, alongside the investigatory framework used to develop the final HGR algorithm. The evaluation of MPH highlighted the Z-axis instability of its modelling system, which reduced the landmark accuracy of its output from 86.7% to 41.5%. The selection of an appropriate classifier complemented the computationally lightweight nature of MPH whilst compensating for its instability, achieving a classification accuracy of 96.25% for eight single-hand static gestures. The success of the developed HGR algorithm ensured that the proposed alternative-control system could facilitate intuitive, computationally inexpensive, and repeatable drone control without requiring specialised equipment.
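As a hedged sketch of the skeletal-model pipeline evaluated here, the snippet below flattens the 21 MediaPipe Hands landmarks into a 63-element feature vector and feeds a lightweight classifier; the k-NN choice and the placeholder training data are assumptions, not the paper's selected classifier or dataset.

```python
# A hedged sketch of a skeletal-model static-gesture classifier: flatten the
# 21 MediaPipe Hands landmarks into a 63-element vector and feed k-NN.
# The classifier choice and random placeholder data are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def landmarks_to_vector(hand_landmarks) -> np.ndarray:
    """21 MediaPipe landmarks -> 63-element (x, y, z) feature vector."""
    return np.array([[p.x, p.y, p.z] for p in hand_landmarks.landmark]).ravel()

# Placeholder training set standing in for recorded examples of the eight
# static gestures the paper classifies (e.g., hypothetical takeoff, land...).
X_train = np.random.rand(80, 63)
y_train = np.random.randint(0, 8, 80)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
# command = clf.predict([landmarks_to_vector(detected_hand)])[0]
```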
