Academic literature on the topic 'GESTURE CONTROLLING'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'GESTURE CONTROLLING.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "GESTURE CONTROLLING"

1

Vijaya, V. Krishna, Puvvala Harsha, Sricharan Murukutla, Kurra Eswar, and Nannapaneni Sravan Kuma. "Hand Gesture Controlling System." International Journal for Research in Applied Science and Engineering Technology 11, no. 1 (January 31, 2023): 669–74. http://dx.doi.org/10.22214/ijraset.2023.48653.

Abstract:
As a result of the Industry 4.0 revolution, hand gestures are becoming more and more significant in the disciplines of robotics and IoT. In the IoT field, hand gestures are often used in applications such as smart homes, wearable technology, vehicles, and virtual reality. This work combines Python and Arduino for laptop/computer gesture control. Two ultrasonic sensors mounted on top of the monitor are used to determine the position of the hand, with an Arduino gauging how far away the hand is; particular actions on a media player (VLC) are taken in response to this measurement. Computer operations are performed with the Python PyAutoGUI module, and the computer receives commands from the Arduino via the serial port. Currently, researchers are working to develop a hand-gesture computer that runs entirely on hand gestures and sensors instead of conventional input hardware, and a few have shown that the video player, web browser, and text documents can be controlled with hand movements using an Arduino and ultrasonic sensors.
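The serial-to-hotkey pipeline this abstract describes can be sketched as follows. The distance threshold, the gesture set, and the serial message format below are illustrative assumptions, not details from the paper; `pyautogui.press` does accept media-key names such as `"playpause"`.

```python
# Sketch of the distance-to-command mapping described in the abstract.
# The threshold, gesture set, and serial format are assumptions.

NEAR_CM = 15  # a hand closer than this to a sensor counts as a gesture

def action_for(left_cm, right_cm):
    """Map the two ultrasonic readings (cm) to a PyAutoGUI media key."""
    if left_cm < NEAR_CM and right_cm < NEAR_CM:
        return "playpause"    # both hands close: toggle play/pause
    if left_cm < NEAR_CM:
        return "volumedown"   # left hand close: lower volume
    if right_cm < NEAR_CM:
        return "volumeup"     # right hand close: raise volume
    return None               # no gesture detected

# On the host, readings arrive from the Arduino over the serial port and
# are forwarded to the OS as media-key presses, roughly:
#
#   import serial, pyautogui                     # third-party libraries
#   port = serial.Serial("/dev/ttyUSB0", 9600)   # hypothetical port name
#   left, right = map(float, port.readline().split(b","))
#   key = action_for(left, right)
#   if key:
#       pyautogui.press(key)
```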
2

Varshika, DSS. "Media-Player Controlling by Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 9, no. VI (June 20, 2021): 2022–30. http://dx.doi.org/10.22214/ijraset.2021.348515421.

Abstract:
In this project we control a media player using hand gestures with the help of OpenCV and Python. Computer applications require interaction between human and computer. This interaction needs to be unrestricted, which has made it challenging for traditional input devices such as the keyboard, mouse, and pen. The hand gesture is an important component of body language in linguistics, and human–computer interaction becomes easy when the hand is used as an input device. Using hand gestures to operate machines makes interaction interesting, and gesture recognition has therefore gained a lot of importance. Hand gestures are used to control various applications such as Windows Media Player, robot control, and gaming. Gestures make interaction easy and convenient and do not require any extra device. Vision and audio recognition can be used together, but audio commands may not work in noisy environments.
3

Alyamani, Hasan J. "Gesture Vocabularies for Hand Gestures for Controlling Air Conditioners in Home and Vehicle Environments." Electronics 12, no. 7 (March 23, 2023): 1513. http://dx.doi.org/10.3390/electronics12071513.

Abstract:
With the growing prevalence of modern technologies as part of everyday life, mid-air gestures have become a promising input method in the field of human–computer interaction. This paper analyses the gestures of actual users to define a preliminary gesture vocabulary for home air conditioning (AC) systems and suggests a gesture vocabulary for controlling the AC that applies to both home and vehicle environments. In this study, a user elicitation experiment was conducted. A total of 36 participants were filmed while employing their preferred hand gestures to manipulate a home air conditioning system. Comparisons were drawn between our proposed gesture vocabulary (HomeG) and a previously proposed gesture vocabulary which was designed to identify the preferred hand gestures for in-vehicle air conditioners. The findings indicate that HomeG successfully identifies and describes the employed gestures in detail. To gain a gesture taxonomy that is suitable for manipulating the AC at home and in a vehicle, some modifications were applied to HomeG based on suggestions from other studies. The modified gesture vocabulary (CrossG) can identify the gestures of our study, although CrossG has a less detailed gesture pattern. Our results will help designers to understand user preferences and behaviour prior to designing and implementing a gesture-based user interface.
4

SHAIK, Dr ABDUL NABI, E. SAI PRIYA, G. NIKITHA, K. PRACHEEN KUMAR, and N. SHRAVYA SHREE. "CONTROLLING VIDEOLAN CLIENT MEDIA USING LUCAS KANADE OPTIMAL FLOW ALGORITHM AND OPENCV." YMER Digital 21, no. 05 (May 8, 2022): 246–55. http://dx.doi.org/10.37896/ymer21.05/29.

Abstract:
In this project we discuss a system that uses a dynamic hand gesture recognition technique to manage media players such as the VLC media player. The project consists of modules that segment the foreground part of the frame using skin-colour detection and the approximate median technique. Recognition of the hand gesture is done by creating a decision tree that uses certain features extracted from the segmented part. This hand gesture recognition technique introduces a new, natural way to interact with computers and is beneficial to many people in day-to-day life. The system controls certain operations (pause, play, volume up, volume down, mute) on a video player by mere hand gestures, without the rigmarole of pressing buttons or tapping the screen, which relates directly to everyday situations such as presentations. Hand gestures and their directional motion define a gesture for the application; images are retrieved using a webcam. The most frequently used VLC functions are mapped to predefined gestures according to the VLC player control operations. We created a VLC media player controller using a hand gesture recognition system to make human life easier and better. The project is implemented in two steps: (1) creation of the hand gesture recognition system, done with image processing using the OpenCV library; and (2) controlling the VLC media player with hand gestures, where the player is driven by shell commands invoked from Python through the OS library.
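A toy version of the decision-tree classification step described above: classify the hand's dominant motion direction from the segmented region's centroid displacement between frames, then map the direction to a VLC operation. The feature choice, the dead zone, and the action table are illustrative assumptions, not taken from the paper.

```python
# Stand-in for the decision-tree gesture classifier: a few nested
# comparisons over the centroid displacement (dx, dy) in pixels.

def classify_motion(dx, dy, dead_zone=10):
    """Tiny decision tree over the hand centroid's frame-to-frame motion."""
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "none"                      # hand essentially still
    if abs(dx) >= abs(dy):                 # horizontal motion dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"      # vertical motion dominates

# Hypothetical mapping from motion direction to a VLC operation
VLC_ACTIONS = {
    "up": "volume up",
    "down": "volume down",
    "left": "pause",
    "right": "play",
    "none": None,
}

def vlc_action(dx, dy):
    """Return the VLC operation for this frame's motion, or None."""
    return VLC_ACTIONS[classify_motion(dx, dy)]
```

In the full system the chosen action would then be issued as a shell command from Python, as the abstract's second step describes.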
5

Chakradhar, K. S., Prasanna Rejinthala Lakshmi, salla chowdary Sree Rama Brunda, Bharathi Pola, and Bhargava Petlu. "Controlling Media Player Using Hand Gestures." Electrical and Automation Engineering 2, no. 1 (April 1, 2023): 45–54. http://dx.doi.org/10.46632/eae/2/1/7.

Abstract:
Computer usage is increasing rapidly day by day, but input devices are limited and we need to be near the screen to use them. To overcome this problem, we can control the screen with hand gestures, using a different gesture for each operation. We propose a Python program to control a media player through hand gestures. The method uses libraries such as OpenCV, MediaPipe, and PyAutoGUI to capture video, provide ready-to-use ML solutions, and programmatically control the keyboard and mouse. Hand gestures serve as the input, providing natural interaction while reducing dependence on external hardware. The process is divided into two steps. First, gestures are captured through the camera by OpenCV, and MediaPipe identifies each gesture by its position so the respective command can be executed. Second, PyAutoGUI automates the keyboard to control the media player.
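The two-step pipeline from this abstract (hand landmarks in, a keyboard hotkey out) can be sketched as below. The landmark indices follow MediaPipe's 21-point hand model; the finger-count-to-key table is an assumption for illustration, not the authors' mapping.

```python
# Step 1: recognize the gesture from MediaPipe-style (x, y) landmarks.
# Step 2: turn it into a key that PyAutoGUI would press.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the PIP joint below each fingertip

def count_extended(landmarks):
    """Count raised fingers. In image coordinates y grows downward, so a
    fingertip above its PIP joint (smaller y) counts as extended."""
    return sum(
        1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

# Hypothetical gesture table: number of raised fingers -> media key
GESTURE_KEYS = {1: "playpause", 2: "volumeup", 3: "volumedown", 4: "nexttrack"}

def key_for(landmarks):
    """Return the key for this frame's gesture, or None for no gesture."""
    return GESTURE_KEYS.get(count_extended(landmarks))

# In the real loop the landmarks come from MediaPipe Hands on each OpenCV
# frame, and the chosen key is sent with pyautogui.press(key_for(lm)).
```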
6

Chakradhar, K. S., Lakshmi Prasanna Rejinthala, chowdary Sree Rama Brunda salla, Bharathi Pola, and Bhargava Petlu. "Controlling Media Player Using Hand Gestures." Electrical and Automation Engineering 2, no. 1 (April 1, 2023): 45–54. http://dx.doi.org/10.46632/ese/2/1/7.

Abstract:
Computer usage is increasing rapidly day by day, but input devices are limited and we need to be near the screen to use them. To overcome this problem, we can control the screen with hand gestures, using a different gesture for each operation. We propose a Python program to control a media player through hand gestures. The method uses libraries such as OpenCV, MediaPipe, and PyAutoGUI to capture video, provide ready-to-use ML solutions, and programmatically control the keyboard and mouse. Hand gestures serve as the input, providing natural interaction while reducing dependence on external hardware. The process is divided into two steps. First, gestures are captured through the camera by OpenCV, and MediaPipe identifies each gesture by its position so the respective command can be executed. Second, PyAutoGUI automates the keyboard to control the media player.
7

Labhane, Nishant M., Prashant Harsh, and Meghan Kulkarni. "Multipoint Hand Gesture Recognition For Controlling Bot." International Journal of Scientific Research 1, no. 1 (June 1, 2012): 46–48. http://dx.doi.org/10.15373/22778179/jun2012/16.

8

Monisha Sampath, Priyadarshini Velraj, Vaishnavii Raghavendran, and M. Sumithra. "Controlling media player using hand gestures with VLC media player." World Journal of Advanced Research and Reviews 14, no. 3 (June 30, 2022): 466–72. http://dx.doi.org/10.30574/wjarr.2022.14.3.0565.

Abstract:
In today's world, everyone opts for fast interaction with complex systems that ensure a quick response. Thus, with increasing advances in technology, response time and ease of operation are the concerns, and this is where human–computer interaction comes into play. This interaction is unrestricted and challenges the traditional input devices such as the keyboard and mouse. Gesture recognition has been gaining much attention. Gestures are instinctive and are often used in everyday interactions; communicating with computers through gestures therefore creates a whole new trend of interaction. In this project, with the help of computer vision and deep learning techniques, user hand movements (gestures) are used in real time to control the media player; seven gestures are defined for media-player control. The proposed web application lets users identify their gesture with their local device camera and execute control over the media player and similar applications (with no extra hardware). It increases performance and makes interaction convenient by letting users control their PC or laptop from a distance.
9

Holder, Sherrie, and Leia Stirling. "Effect of Gesture Interface Mapping on Controlling a Multi-degree-of-freedom Robotic Arm in a Complex Environment." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 64, no. 1 (December 2020): 183–87. http://dx.doi.org/10.1177/1071181320641045.

Abstract:
There are many robotic scenarios that require real-time function in large or unconstrained environments, for example, the robotic arm on the International Space Station (ISS). Use of fully wearable gesture control systems is well-suited to human-robot interaction scenarios where users are mobile and must have hands free. A human study examined operation of a simulated ISS robotic arm using three different gesture input mappings compared to the traditional joystick interface. Two gesture mappings permitted multiple simultaneous inputs (multi-input), while the third was a single-input method. Experimental results support performance advantages of multi-input gesture methods over single input. Differences between the two multi-input methods in task completion and workload display an effect of user-directed attention on interface success. Mappings based on natural human arm movement are promising for gesture interfaces in mobile robotic applications. This study also highlights challenges in gesture mapping, including how users align gestures with their body and environment.
10

Lorang, Emily, Audra Sterling, and Bianca Schroeder. "Maternal Responsiveness to Gestures in Children With Down Syndrome." American Journal of Speech-Language Pathology 27, no. 3 (August 6, 2018): 1018–29. http://dx.doi.org/10.1044/2018_ajslp-17-0138.

Abstract:
Purpose: This study compared gesture use in young children with Down syndrome (DS) and typical development (TD) as well as how mothers respond to child gestures based on child age and diagnosis. Method: Twenty-two mother–child dyads with DS and 22 mother–child dyads with TD participated. The child participants were between 22 and 63 months and were matched on chronological age. We coded child gesture use and whether mothers recoded child gestures (i.e., provided a verbal translation) during naturalistic interactions. Results: The children with DS used more gestures than peers with TD. After controlling for expressive language ability, the two groups were not significantly different on child gesture use. Regardless of child diagnosis, mothers recoded approximately the same percentage of child gestures. There was a significant interaction between child diagnosis and child age when predicting the percentage of maternal gesture recodes; mothers of children with DS did not demonstrate differences in the percentage of maternal gesture recodes based on child age, but there was a negative relationship between the percentage of maternal gesture recodes and child age for the children with TD. Conclusions: Young children with DS gesture more than chronological age–matched children with TD, therefore providing numerous opportunities for caregivers to recode child gestures and support language development. Early intervention should focus on increasing parent responsiveness to child gestures earlier in life in order to provide additional word-learning opportunities for children with DS.

Dissertations / Theses on the topic "GESTURE CONTROLLING"

1

Mráz, Stanislav. "Rozpoznání gest ruky v obrazu." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2011. http://www.nusl.cz/ntk/nusl-219059.

Abstract:
This master's thesis deals with the recognition of simple static gestures for computer control. The first part of the work is devoted to a theoretical review of methods used for hand segmentation from an image, followed by a description of methods for hand gesture classification. The second part is devoted to the choice of a suitable method for hand segmentation based on skin colour and movement, and then to the classification methods used. The last part of the work describes the proposed system.
2

Miniotaité, Jura. "JoyTilt : Beyond GUI App Design for Embodied Experience of Controlling a Robot Vacuum Cleaner." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-294338.

Abstract:
Domestic IoT appliances like smart speakers, smart locks and robot vacuum cleaners are usually connected through smartphone apps to provide additional functionality and remote control. Although smartphones have many different sensors and actuators, these apps provide a primarily graphical user interface with these appliances. To explore a more somatically engaging experience the prototype JoyTilt was designed, developed and tested with users. JoyTilt enables its user to control a robot vacuum cleaner like a toy car by tilting their phone in the direction they want it to go. This study found that the participants had their gaze focused on the robotic vacuum cleaner while controlling it and that the participants had both positive and negative bodily experiences with JoyTilt. Interviews with the participants provide suggestions for balancing control of robot vacuum cleaners and keeping the robot’s autonomy using JoyTilt. In this study the aesthetics of the material, choice of materials and choice of interaction model come together in the design of JoyTilt to shape the human-robot relationship. Lastly, the thesis highlights the values of further considering the bodily experience when designing apps.
3

Weinsteinová, Adéla. "Negativní aspekty nasazování ICT." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-199727.

Abstract:
This diploma thesis examines the negative aspects of using information and communication technologies. The main attention is focused on virtual reality, especially 3D projection, in order to decide whether its use has a negative impact on the physical and psychological state of the user, what exactly these impacts are, and what probably causes them. The first part is dedicated to explaining the concept and history of ICT, the multidimensional principle, and the content of each dimension. This is followed by a determination of which ICT areas currently struggle with negative effects. The list of these areas is reduced to six specific technologies that a common user has the opportunity to experience. These technologies were examined through a questionnaire, which identified the most used one, i.e. virtual reality. An experiment was then conducted on the undesirable effects caused by using virtual reality, and the other five selected technologies are discussed in terms of the most important negatives they face today. The main finding of this study is the identification of the side effects of virtual reality across different types of technology, namely anaglyph, passive, and active 3D projection, and a comparison of whether the adverse effects depend on the type. It also includes determining the rate of adoption of virtual reality.
4

DISODIA, MANEESH. "GESTURE CONTROLLED TELEOPERATION." Thesis, 2016. http://dspace.dtu.ac.in:8080/jspui/handle/repository/15361.

Abstract:
This work presents the development of gesture-based control over large distances as an interface integrating various control signals, building a user-interface framework on top of current input devices. The purpose is to teleoperate a puppet (various servos) using human gestures as the controller. Advances in robotics, mechanics, and software engineering, along with human ergonomics, allow us to control a real-world entity in real-world coordinates. By recording a mundane human task as a stored procedure, we can execute that task to make things happen later. Today we control objects with a joystick UI or over the internet, but real-world controls such as haptic feedback place stronger constraints on control mechanics. Video game developers are using gesture tracking in new game designs, and gestures are embedded in movie animation; here we present gesture as a control sense. Open source software and hardware have acquired an essential place in today's technological world: development moves fast because blueprints are spread worldwide and further changes are easy, which has drawn experts and new projects toward open source and open hardware environments. The aim of this experiment is to control objects in real time, acquiring gestures from a user at one end of the world while the object sits in any other part, separated by distance but connected by the internet. Gestures from one device are sent to the other; the device sending the gestures as signals is the server, and the one receiving them is the client located at a remote place.
Gesture control has various future uses. It could become a low-bandwidth telephony system for deaf people, since communication by webcam needs a high data-transfer rate that gesture communication does not, and the saved bandwidth can increase transfer rates elsewhere. Another use is in medicine, where minute robot-assisted operations can be controlled at a very fine level, since the interaction is based on coordinate transfer, which in turn reduces errors of any kind. Devices can be operated from anywhere, saving time and reducing work pressure in the future. Giving input to devices through gestures and controlling their work is the coming future of the human race.
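The server/client gesture link described above can be sketched as a plain TCP channel carrying gesture coordinates as JSON lines. The message fields, host name, and port below are assumptions for illustration, not details from the thesis.

```python
# One gesture event per line: the capture side (server) serializes the
# gesture name and coordinates; the remote side (client) parses them
# back and drives its servos from the coordinates.

import json

def encode_gesture(name, x, y):
    """Serialize one gesture event for transmission over the socket."""
    return (json.dumps({"gesture": name, "x": x, "y": y}) + "\n").encode()

def decode_gesture(line):
    """Parse one received line back into an event dict."""
    return json.loads(line.decode())

# The capturing side would push events over a socket, roughly:
#   import socket
#   sock = socket.create_connection(("remote-puppet.example", 5000))
#   sock.sendall(encode_gesture("point", 0.42, 0.17))
# and the remote side would read lines and feed decode_gesture(...)'s
# coordinates to its servo controller.
```

Sending coordinates rather than video is what keeps the bandwidth requirement low, as the abstract argues.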
5

Lin, Chun-wei, and 林君瑋. "A Research on User's Mental Model in Gesture Controlling." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/80318078053562329279.

Abstract:
Master's thesis, Southern Taiwan University of Science and Technology, Department of Multimedia and Computer Entertainment Science (ROC academic year 98, i.e. 2009–2010).
In 1801, Joseph Marie Jacquard improved the design of the loom by taking a series of punched cards as an input tool for the machine. Ever since, through speech-recognition and touch-control systems, interface developers have constantly looked for more natural, more suitable interactions to shorten the distance between human and machine. In 1981, the graphical user interface (GUI) developed at Xerox's Palo Alto Research Center moved closer to user-friendliness, easy visibility, and transparency, and Apple's iPhone in 2007 shortened the distance between human and machine through direct-input devices. Facial expressions and hand gestures serve to communicate with people, and companies such as Hitachi and Toshiba have published concept products controlled by hand gesture, showing that hand gesture control will become the trend. This research probes the movements and causes of hand gesture control to find the mental model behind it. Its concrete purposes are: (1) recording the methods and causes of hand-gesture-control actions through exploratory experiments; (2) observing operation of the Windows 7 system, which has a market share of nearly 70%; (3) identifying the movements and causes favoured by the experimental subjects using an interval scale; and (4) finding the mental model of hand gesture control by sorting the experimental results. Fourteen experimental tasks were designed and tested with users unfamiliar with computers, and the results were analysed by observation. The most commonly used movements were "finger-touch" and "hand wave", with actions tending toward a small range of motion. Many gesture operations were one-handed, regardless of the dominant hand.
The factors behind the operations, based on the origin of the movement, were analysed and grouped into three types: "imitation and reappearance behaviour", "association and materialisation behaviour", and "trying and feedback behaviour". The most frequent was imitation and reappearance, followed by association and materialisation. Imitation and reappearance behaviour comprised three concrete courses: (1) imitating the operation of a mouse; (2) imitating the operation of a touch-control device; (3) imitating the way everyday objects are used. Association and materialisation behaviour comprised: (1) imagining the hand touching and controlling a virtual object; (2) drawing associations from the meanings of symbols; (3) hand performance similar to semantics. Trying and feedback behaviour comprised: (1) operating directly by instinct; (2) trying for more influential movements.
6

Fourney, Adam. "Design and Evaluation of a Presentation Maestro: Controlling Electronic Presentations Through Gesture." Thesis, 2009. http://hdl.handle.net/10012/4847.

Abstract:
Gesture-based interaction has long been seen as a natural means of input for electronic presentation systems; however, gesture-based presentation systems have not been evaluated in real-world contexts, and the implications of this interaction modality are not known. This thesis describes the design and evaluation of Maestro, a gesture-based presentation system which was developed to explore these issues. This work is presented in two parts. The first part describes Maestro's design, which was informed by a small observational study of people giving talks; and Maestro's evaluation, which involved a two week field study where Maestro was used for lecturing to a class of approximately 100 students. The observational study revealed that presenters regularly gesture towards the content of their slides. As such, Maestro supports several gestures which operate directly on slide content (e.g., pointing to a bullet causes it to be highlighted). The field study confirmed that audience members value these content-centric gestures. Conversely, the use of gestures for navigating slides is perceived to be less efficient than the use of a remote. Additionally, gestural input was found to result in a number of unexpected side effects which may hamper the presenter's ability to fully engage the audience. The second part of the thesis presents a gesture recognizer based on discrete hidden Markov models (DHMMs). Here, the contributions lie in presenting a feature set and a factorization of the standard DHMM observation distribution, which allows modeling of a wide range of gestures (e.g., both one-handed and bimanual gestures), but which uses few modeling parameters. To establish the overall robustness and accuracy of the recognition system, five new users and one expert were asked to perform ten instances of each gesture. The system accurately recognized 85% of gestures for new users, increasing to 96% for the expert user. 
In both cases, false positives accounted for fewer than 4% of all detections. These error rates compare favourably to those of similar systems.
7

Tseng, Yu-Lin, and 曾鈺琳. "A 2D Controlling Interface Based on the Double-Handed Computer Vision Gesture Recognition." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/qrt5xr.

Abstract:
Master's thesis, National Taipei University of Technology, Graduate Institute of Innovative Design (ROC academic year 100, i.e. 2011–2012).
The movie "Minority Report" brought out a whole new concept of human–computer technology, which has inspired numerous related studies ever since. The evolution of input devices began with keyboards, followed by the invention of the mouse cursor; in recent decades, multi-touch devices have been all the rage, and more and more HCI technologies have been developed by researchers. The related studies show that technical and conceptual obstacles still exist in the recognition process. Hence, we re-evaluate the related technical methods and build a new model of a real-time, two-handed, vision-based gesture recognition system based on the existing methods. In addition, the system provides an automatic skin-tone learning function and allows users to rotate and zoom 2D objects on screen.
8

LU, YU-MING, and 盧佑銘. "Research on the building and controlling adjustment of the quadcopter combined with gesture recognition." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/y58hq6.

Abstract:
Master's thesis, National Kaohsiung Normal University, Department of Industrial Technology Education (ROC academic year 107, i.e. 2018–2019).
The quadcopter has been popular in recent years because it is stable and easy to control, and so it is applied in many fields, such as filming, competitive flight, and rescue. In this work, an Arduino NANO and an MPU6050 module were used to build the quadcopter. Understanding the four-axis flight principles and theories, as well as tuning the PID controller, helped achieve stable flight of the four-axis aircraft. With the rise of wireless sensing devices, gesture recognition could be realized: through the Leap Motion device, we could work out the logic of gestures and test the success rate of each gesture. Finally, in combining the Parrot AR.Drone 2.0 with gesture-recognition control, we found a connection problem, changed the gesture-control program code to solve it, and completed the experiment. The results showed high gesture-recognition performance and success in controlling the quadcopter.
9

Chao, Chun-Yi, and 趙峻毅. "Impact on Elementary School Students’ Science and Technology Learning Motivation and Performance in a Game-based Brainwave and Gesture Controlling Learning Environment." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/h6ynu9.

Abstract:
Master's thesis, National Kaohsiung University of Applied Sciences, Graduate Institute of Information Management (ROC academic year 103, i.e. 2014–2015).
In the nowadays society, the learning method of school children becomes more complex than before. With diversity of courses, the traditional way of teaching methods is out of vogue. Improving children's learning outcome with the new information technology is necessary and important now. From the early distance learning and digital game-based learning are all of the kinds of technology-enhanced learning. The major characteristic of digital game-based learning is to attract school children's interests in learning and to improve the efficiency in students learning achievements and learning motivation. The purpose of this study is to develop a digital game-based learning system with motion and brainwave controlling technologies. With the gesture controlling technology, school children can use hand gesture to interact with the system in a game-based learning environment. By the brainwave controlling technology, the system can also collect the brainwave pattern from the learners and provide feedback about learning attention for learners. Combined with gesture and brainwave controlling technologies, the system was designed and implemented for improve the school children's learning achievement and promote their learning motivation with a fun and challenging approach to learning. In addition to the combination with the above technologies in the system, the learning materials in the system also follows the standard of Ministry of Education in Natural Science subjects and are designed for fifth grade teaching school children to learn. The author employed a quasi-experimental method. The subjects were 40 elementary school students, All subjects divided into two group: Experiment group and Control group. Each group has 20 students. The author adopts pre-test and post-test in learning motivation of Nature Science and Nature Science achievement test to collect information. Everyone learn three weeks, a total of 120 minutes. 
After the experiment, an analysis of the results indicated that the learning achievement gains of both groups were significant (p = .000 for the experimental group, p = .009 for the control group). The pre-to-post differences in learning motivation were not significant for either group (p = .093 for the experimental group, p = .44 for the control group). In conclusion, the experimental group outperformed the control group in learning achievement, while neither group showed a significant change in learning motivation. We surmise that the two groups' manipulation methods were similar enough that neither showed a significant difference in learning motivation. Subjects in both groups found operating the system by gesture control fascinating and interesting.
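The pre-test/post-test comparison the thesis abstract describes can be sketched as a paired t-test. This is a minimal stdlib-only sketch; the scores below are hypothetical illustration data, not the thesis data.

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post Natural Science achievement scores for one group
pre = [62, 58, 70, 65, 55, 60, 68, 63, 59, 66]
post = [70, 66, 78, 72, 64, 69, 75, 71, 66, 74]

# Paired t-test statistic: t = mean(d) / (sd(d) / sqrt(n)),
# where d are the per-subject post-minus-pre differences
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(n))
print(f"t({n - 1}) = {t_stat:.2f}")
# |t| well above ~2.26 (the two-tailed .05 critical value for df = 9)
# indicates a significant gain, as the thesis reports for both groups.
```

A statistics package would also return the exact p-value; the hand-computed statistic above shows what that p-value is based on.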
APA, Harvard, Vancouver, ISO, and other styles
10

Pais, João Pedro Santos. "Análise do comportamento dos gestores de projeto em contexto de derrapagem." Master's thesis, 2018. http://hdl.handle.net/10071/18718.

Full text
Abstract:
Project management is increasingly present in companies that aim to stay at the forefront of the market. Given the intense competition between companies across sectors of activity, particularly in information technology, the project manager's role is increasingly seen as fundamental to organizations. This study examines how project managers behave in a context of project slippage. One conclusion of this research is that the project context, internal or external, influences how the project manager acts during slippage: in internal projects, scope is the most prioritized dimension of action, chosen as a rule first, whereas in external projects the prioritized dimension is cost. It was also found that there is a relationship between the identified key factors and the dimensions of action: faced with a given key factor, project managers behave similarly, which shows that, regardless of the manager's or the project's profile, managers act with some consistency on certain key factors. Finally, another conclusion drawn from this research was the relationship between the dimensions of action and the project manager's profile: in the analysis, more experienced project managers were observed to privilege the cost and time dimensions, while less experienced project managers privilege the cost dimension.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "GESTURE CONTROLLING"

1

Santi, Francesco. Amministrazione e controlli: Società di persone, imprese gestite da enti collettivi, consorzi, gruppi europei di interesse economico... [Padova]: CEDAM, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "GESTURE CONTROLLING"

1

Chandankhede, Pragati, and Sana Haji. "Gesture-Based Media Controlling Using Haar Cascade." In Advances in Intelligent Systems and Computing, 541–51. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-3071-2_44.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Dehankar, A. V., Sanjeev Jain, and V. M. Thakare. "Hand Gesture-Based User Interface for Controlling Mobile Applications." In Lecture Notes in Electrical Engineering, 577–86. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-1906-8_59.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kim-Tien, Nguyen, Nguyen Truong-Thinh, and Trinh Duc Cuong. "A Method for Controlling Wheelchair Using Hand Gesture Recognition." In Advances in Intelligent Systems and Computing, 961–70. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-37374-9_93.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Morisaki, Tao, Masahiro Fujiwara, Yasutoshi Makino, and Hiroyuki Shinoda. "Controlling Robot Vehicle Using Hand-Gesture with Mid-Air Haptic Feedback." In Lecture Notes in Electrical Engineering, 268–71. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-3194-7_60.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dalka, Piotr, and Andrzej Czyżewski. "Controlling Computer by Lip Gestures Employing Neural Networks." In Rough Sets and Current Trends in Computing, 80–89. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13529-3_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Santos, Jedid, Ivo Martins, and João M. F. Rodrigues. "Framework for Controlling KNX Devices Based on Gestures." In Lecture Notes in Computer Science, 507–18. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78095-1_37.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Tartari, G., D. Stødle, J. M. Bjørndalen, P. H. Ha, and O. J. Anshus. "Controlling and Coordinating Computers in a Room with In-Room Gestures." In Advances in Intelligent Systems and Computing, 103–16. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08491-6_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Schäfer, Alexander, Gerd Reis, and Didier Stricker. "Controlling Continuous Locomotion in Virtual Reality with Bare Hands Using Hand Gestures." In Virtual Reality and Mixed Reality, 191–205. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16234-3_11.

Full text
Abstract:
Moving around in a virtual world is one of the essential interactions in Virtual Reality (VR) applications. The current standard for moving in VR is a controller. Recently, VR head-mounted displays have integrated new input modalities such as hand tracking, which allows the investigation of different techniques for moving in VR. This work explores techniques for bare-handed locomotion, since it could offer a promising alternative to existing freehand techniques. The presented techniques enable continuous movement through an immersive virtual environment and are compared to each other in terms of efficiency, usability, perceived workload, and user preference.
APA, Harvard, Vancouver, ISO, and other styles
9

Singh, Mangal, Ishan Vijay Tewari, and Labdhi Sheth. "Skin-Colour-Based Hand Segmentation Techniques." In Challenges and Applications for Hand Gesture Recognition, 1–26. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-7998-9434-6.ch001.

Full text
Abstract:
People have long been mesmerized by the sci-fi notion of controlling things with hand gestures, yet few ponder how such gesture-based systems work. This chapter explains how these systems generally rely on skin-colour-based hand segmentation. It describes various colour models and the methods for using them to perform skin-colour-based hand segmentation, presents real-life applications of skin-colour segmentation with multiple examples from hand gesture recognition, and covers other applications where skin-colour-based segmentation is used.
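The segmentation technique this chapter surveys can be sketched in a few lines: convert each pixel to a colour model that separates luminance from chrominance (YCbCr here) and threshold the chrominance. This is a minimal pure-Python sketch; the Cr/Cb threshold box below is a commonly cited illustrative range, not taken from the chapter.

```python
# A minimal sketch of skin-colour segmentation in the YCbCr colour model.

def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 conversion from 8-bit RGB to Y, Cb, Cr (full range)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cr_range=(133, 173), cb_range=(77, 127)):
    """Classify a pixel as skin if its Cr/Cb fall inside the threshold box.
    Luminance Y is deliberately ignored, which gives some robustness to
    lighting changes -- the usual motivation for chrominance-based models."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cr_range[0] <= cr <= cr_range[1] and cb_range[0] <= cb <= cb_range[1]

def segment(image):
    """Return a binary mask (1 = skin) for an image given as rows of RGB tuples."""
    return [[1 if is_skin(*px) else 0 for px in row] for row in image]
```

In practice this per-pixel check is done with vectorized operations (e.g. a colour-space conversion plus a range mask in an image library), followed by morphological clean-up of the binary mask.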
APA, Harvard, Vancouver, ISO, and other styles
10

Baltazar, André, and Luís Gustavo Martins. "ZatLab." In Innovative Teaching Strategies and New Learning Paradigms in Computer Programming, 224–54. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-7304-5.ch011.

Full text
Abstract:
Computer programming is not an easy task, and as with all difficult tasks, it can be faced as tedious, impossible to do, or as a challenge. Therefore, learning to program with a purpose enables that “challenge mindset” and encourages the student to apply himself in overcoming his handicaps and exploring different theories and methods to achieve his goal. This chapter describes the process of programming a framework with the purpose of achieving real time human gesture recognition. Just this is already a good challenge, but the ultimate goal is to enable new ways of Human-Computer Interaction through expressive gestures and to allow a performer the possibility of controlling (with his gestures), in real time, creative artistic events. The chapter starts with a review on human gesture recognition. Then it presents the framework architecture, its main modules, and algorithms. It closes with the description of two artistic applications using the ZatLab framework.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "GESTURE CONTROLLING"

1

Görmez, Sinem, and Carsten Röcker. "Exploring the Potential of Gestures for Controlling Doors and Windows in Smart Homes." In 14th International Conference on Applied Human Factors and Ergonomics (AHFE 2023). AHFE International, 2023. http://dx.doi.org/10.54941/ahfe1003674.

Full text
Abstract:
This paper explores the potential of gesture interaction as an alternative control concept for doors, windows and sliding systems in smart homes. In a first step, a technical prototype was built that allows door and window elements to be opened and closed with a hand-swiping gesture. In a second step, a user study with N = 95 participants was conducted to explore the perceived usefulness of the developed solution using a 24-item questionnaire. The results showed that 78 percent of the participants liked the concept of contactless gesture control of doors, windows and sliding systems. The reluctance of the remaining group could be traced back to a lack of experience with smart control concepts (e.g., voice assistants) (Spearman’s rs = .27, p = .044) and the belief that gestures are hard to remember (chi-square test: α < .01, p = .007). The study also confirmed that the implemented control concept and gestures were perceived as natural and intuitively understandable.
APA, Harvard, Vancouver, ISO, and other styles
2

He, Yanming, Shumeng Hou, and Peiyao Cheng. "Generating a Gesture Set Using the User-defined Method in Smart Home Contexts." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002181.

Full text
Abstract:
Gesture interaction is a natural interaction method and has been widely applied in various smart contexts. The smart home system is a promising area for integrating gesture interaction. Against this background, it is necessary to generate a set of gestures that supports users’ intuitive interaction with smart home devices. The gesture elicitation study (GES) is an effective method for generating gestures. In this study, following the GES method, we develop a gesture set for controlling a smart TV via a smart speaker, a common configuration in smart home contexts. Two studies were conducted. In study 1, we carried out a diary study to generate target tasks, resulting in the fifteen most frequent tasks in domestic contexts. In study 2, a GES with twelve participants was conducted to generate gestures for each command. The generated gestures were analyzed by combining frequency, match, ease of use, learnability, memorability and preference, resulting in a set of gestures for smart home contexts.
Keywords: Gesture Interaction, Smart Home System, Gesture Elicitation Study
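Gesture elicitation studies like the one above typically rank the proposals collected for each command by an agreement measure. The abstract does not state which measure was used; as an illustration, this sketch implements Vatavu and Wobbrock's agreement rate, a standard choice in GES work, with hypothetical proposal data.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR for one referent (Vatavu & Wobbrock, 2015):
    AR = sum(|Pi| * (|Pi| - 1)) / (|P| * (|P| - 1)),
    where the Pi are groups of identical gesture proposals in P."""
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Twelve hypothetical proposals for a "volume up" referent
props = ["swipe_up"] * 7 + ["rotate_cw"] * 3 + ["pinch_out"] * 2
print(round(agreement_rate(props), 3))  # AR of 1.0 would mean total consensus
```

Commands whose best proposal has a high AR get that gesture assigned directly; low-AR commands need tie-breaking by the secondary criteria the abstract lists (ease of use, learnability, memorability, preference).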
APA, Harvard, Vancouver, ISO, and other styles
3

Siean, Alexandru-Ionut. "A Set of Smart Ring Gestures for Drone Control." In 12th International Conference on Electronics, Communications and Computing. Technical University of Moldova, 2022. http://dx.doi.org/10.52326/ic-ecco.2022/cs.12.

Full text
Abstract:
We present in this paper the results of a frequency analysis of gesture commands commonly employed for human-drone interaction in the scientific literature, and we propose a set of gestures for controlling drones that can be performed with smart rings. Our method consists of a close examination of thirty-seven articles, from which we extracted commands for human-drone interaction, including voice, gesture, and multimodal input. Based on this meta-analysis, we present six groups of commands for human-drone interaction together with a set of smart ring gestures for interacting with and controlling drones. Our results can inform the design of new interactive applications for controlling drones with smart rings.
APA, Harvard, Vancouver, ISO, and other styles
4

Aditya, Wisnu, Noorkholis Luthfil Hakim, Timothy K. Shih, Avirmed Enkhbat, and Tipajin Thaipisutikul. "IC4Windows–Hand Gesture for Controlling MS Windows." In 2020 5th International Conference on Information Technology (InCIT). IEEE, 2020. http://dx.doi.org/10.1109/incit50588.2020.9310967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Huang, En-Wei, and Li-Chen Fu. "Segmented gesture recognition for controlling character animation." In the 2008 ACM symposium. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1450579.1450623.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Akagi, John, Brady Moon, Xingguang Chen, and Cameron K. Peterson. "Gesture Commands for Controlling High-Level UAV Behavior." In 2019 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE, 2019. http://dx.doi.org/10.1109/icuas.2019.8797743.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Sideridis, Vasileios, Andrew Zacharakis, George Tzagkarakis, and Maria Papadopouli. "GestureKeeper: Gesture Recognition for Controlling Devices in IoT Environments." In 2019 27th European Signal Processing Conference (EUSIPCO). IEEE, 2019. http://dx.doi.org/10.23919/eusipco.2019.8903044.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Sihombing, Poltak, Rifky B. Muhammad, Herriyance Herriyance, and Elviwani Elviwani. "Robotic Arm Controlling Based on Fingers and Hand Gesture." In 2020 3rd International Conference on Mechanical, Electronics, Computer, and Industrial Technology (MECnIT). IEEE, 2020. http://dx.doi.org/10.1109/mecnit48290.2020.9166592.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Desai, Jeet K., and Lifford Mclauchlan. "Controlling a wheelchair by gesture movements and wearable technology." In 2017 IEEE International Conference on Consumer Electronics (ICCE). IEEE, 2017. http://dx.doi.org/10.1109/icce.2017.7889371.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ge, Zhaojie, Weiming Liu, Zhile Wu, Mei Cai, and Ping Zhao. "Gesture Recognition and Master-Slave Control of a Manipulator Based on sEMG and CNN-GRU." In ASME 2022 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/imece2022-94788.

Full text
Abstract:
Surface electromyography (sEMG) is the bioelectric signal that accompanies muscle contraction. In gesture recognition, sEMG offers a non-invasive, efficient and fast recognition method. For patients with hand amputation, upper-limb EMG signals can be collected that correspond to the patient's intended hand movement; by wearing a prosthetic hand integrated with an EMG recognition module, such patients can perform gestures that meet the needs of daily life. In this paper, gesture recognition is carried out based on sEMG and deep learning, and master-slave control of a manipulator is realized. Gesture recognition can also be applied to remote control: commanding the end of the manipulator from a distance with specific gestures can accomplish tasks in complex, high-risk environments with higher efficiency. Based on the Convolutional Neural Network (CNN) and the Gate Recurrent Unit (GRU), this paper constructs three neural networks with different structures (a single CNN, a single GRU, and a CNN-GRU) and trains them on the collected gesture data set. The input type with the highest classification accuracy on the test set is then selected; among the three networks, CNN-GRU reaches the highest test-set accuracy at 92% and is therefore used as the gesture recognition network. Finally, combined with the integrated manipulator, the EMG signals collected in real time by the Myo armband are classified by the upper computer, and the control signal corresponding to the recognized gesture is sent to the manipulator's Arduino control module, realizing master-slave control of the manipulator using EMG signals.
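The last step the abstract describes, the upper computer mapping a recognized gesture to a command for the Arduino control module, can be sketched as a small dispatch table. The gesture names and command bytes below are hypothetical illustrations, not taken from the paper.

```python
# Minimal sketch of the upper-computer side of the master-slave link:
# classifier label in, command byte out over the serial connection.
# Labels and byte values are invented for illustration.

GESTURE_TO_COMMAND = {
    "fist":        b"\x01",  # close gripper
    "open_hand":   b"\x02",  # open gripper
    "wrist_left":  b"\x03",  # rotate wrist left
    "wrist_right": b"\x04",  # rotate wrist right
    "rest":        b"\x00",  # hold current position
}

def dispatch(gesture: str, send) -> bytes:
    """Map a classifier label to a command byte and hand it to the serial writer.
    Unknown or low-confidence labels fall back to the safe 'rest' command."""
    cmd = GESTURE_TO_COMMAND.get(gesture, GESTURE_TO_COMMAND["rest"])
    send(cmd)
    return cmd
```

In the real system `send` would be something like a serial port's `write` method; taking it as a parameter keeps the mapping easy to unit-test without hardware.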
APA, Harvard, Vancouver, ISO, and other styles