Theses / dissertations on the topic "Touch-Based interaction"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
See the 17 best theses / dissertations for research on the topic "Touch-Based interaction".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, when these are present in the metadata.
Browse theses / dissertations from a wide range of scientific fields and compile an accurate bibliography.
Gauci, Francesca. "Game Accessibility for Children with Cognitive Disabilities : Comparing Gesture-based and Touch Interaction". Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-107102.
Fard, Hossein Ghodosi, and Bie Chuangjun. "Braille-based Text Input for Multi-touch Screen Mobile Phones". Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5555.
Hardy, Robert. "Exploring touch-based phone interaction with tagged physical objects : exemplified using near-field communication". Thesis, Lancaster University, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.661129.
Yan, Zhixin. "A Unified Multi-touch Gesture based Approach for Efficient Short-, Medium-, and Long-Distance Travel in VR". Digital WPI, 2016. https://digitalcommons.wpi.edu/etd-theses/392.
Cavez, Vincent. "Designing Pen-based Interactions for Productivity and Creativity". Electronic Thesis or Diss., université Paris-Saclay, 2025. http://www.theses.fr/2025UPASG013.
Designed with the mouse and keyboard in mind, productivity tools and creativity support tools are powerful on desktop computers, but their structure becomes an obstacle when brought to interactive surfaces supporting pen and touch input. Indeed, the opportunities the pen provides for precision and expressivity have been demonstrated in the HCI literature, but productivity and creativity tools require a careful redesign that leverages these unique affordances, so as to benefit from the intuitiveness they offer while keeping the advantages of structure. This delicate articulation between pen and structure has been overlooked in the literature. My thesis work focuses on this articulation, through two use cases, to answer the broad research question: "How to design pen-based interactions for productivity and creativity on interactive surfaces?" I argue that productivity depends on efficiency while creativity depends on both efficiency and flexibility, and I explore interactions that promote these two dimensions.
My first project, TableInk, explores a set of pen-based interaction techniques designed for spreadsheet programs and contributes guidelines to promote efficiency on interactive surfaces. I first conduct an analysis of commercial spreadsheet programs and an elicitation study to understand what users can do and what they would like to do with spreadsheets on interactive surfaces. Informed by these, I design interaction techniques that leverage the opportunities of the pen to mitigate friction and enable more operations by direct manipulation on and through the grid. I prototype these interaction techniques and conduct a qualitative study with information workers who performed a variety of spreadsheet operations on their own data. The observations show that using the pen to bypass the structure is a promising means of promoting efficiency with a productivity tool.
My second project, EuterPen, explores a set of pen-based interaction techniques designed for music notation programs and contributes guidelines to promote both efficiency and flexibility on interactive surfaces. I first conduct a series of nine interviews with professional composers in order to take a step back and understand both their thought process and their work process with their current desktop tools. Building on this dual analysis, I derive guidelines for the design of features with the potential to promote both efficiency with frequent or complex operations and flexibility in the exploration of ideas. I then act on these guidelines through an iterative design process for interaction techniques that leverage the opportunities of the pen: two prototyping phases, a participatory design workshop, and a final series of interviews with eight professional composers. The observations show that, on top of using the pen to leverage the structure for efficiency, using its properties to temporarily break the structure is a promising means of promoting flexibility with a creativity support tool.
I conclude this manuscript by discussing several ways to interact with structure, presenting a set of guidelines to support the design of pen-based interactions for productivity and creativity tools, and elaborating on the future applications this thesis opens.
Tikkanen, Marjo. "What makes the flow - Understanding the immateriality of screen-based interactions". Thesis, Malmö högskola, Fakulteten för kultur och samhälle (KS), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-21577.
Kuhlman, Lane M. "Gesture Mapping for Interaction Design: An Investigative Process for Developing Interactive Gesture Libraries". The Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=osu1244003264.
Vargas, Gonzalez Andres. "SketChart: A Pen-Based Tool for Chart Generation and Interaction". Master's thesis, University of Central Florida, 2014. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/6375.
M.S., Electrical Engineering and Computer Science, Engineering and Computer Science, Computer Science
Junior, José Augusto Costa Martins. "Interação usuário-TV digital interativa: contribuições via controle remoto". Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-28062011-115007/.
The traditional Brazilian TV system is being replaced by an interactive digital platform. The Ginga middleware, responsible for the presentation of interactive programs, supports user interaction with TV applications by means of key presses on a remote control. Since traditional remotes have usability limitations, this work investigated the application of ubiquitous computing concepts, such as natural and multimodal interfaces, to provide alternatives for the interaction between users and TV applications. Considering the availability of mobile devices such as smartphones, prototype interfaces based on touch screens, as well as gesture-based, accelerometer-based, and voice-based interfaces, were designed and implemented to provide the interaction usually offered by remote controls. The implementation of these interfaces was supported by the prior development of components providing multimodal interaction and peer-to-peer communication in the context of the Brazilian interactive digital TV middleware.
Weigel, Martin [Verfasser], and Jürgen [Akademischer Betreuer] Steimle. "Interactive on-skin devices for expressive touch-based interactions / Martin Weigel ; Betreuer: Jürgen Steimle". Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2017. http://d-nb.info/1136607838/34.
Jung, Annkatrin. "The Breathing Garment : Exploring Breathing-Based Interactions through Deep Touch Pressure". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284203.
Deep Touch Pressure (DTP) is used to treat people who have difficulty processing sensory experiences, by applying firm pressure to the body to activate the nervous system and relieve anxiety. I carried out a long-term, first-person exploration of DTP using shape-changing pneumatic actuators, breathing sensors, and ECG electrodes. Its purpose was to investigate whether DTP can guide users to engage in semi-autonomous interactions with their breathing and foster greater introspection and body awareness. Based on an initial collaborative exploration of materials, I designed "the breathing garment", a wearable vest that guides the user through deep-breathing techniques. The breathing garment demonstrates a novel use of DTP as a modality for haptic breathing feedback, enabling support for interoceptive awareness and relaxation. It allowed me to take part in a dialogue with my own body, acting as a constant reminder to turn inward and attend to my somatic experiences. By pressing my chest forward, the actuators could engage my whole body as they responded to my breathing, creating a feeling of intimacy, safety, and being cared for. This thesis draws attention to an area previously unexplored in HCI: the subjective experience of the emotional and cognitive impact of deep touch pressure combined with biosensing technology. The long-term engagement with actively experiencing different breathing techniques opened up a large design space around pressure-based actuation in the context of breathing.
It reveals a number of experiential qualities and affordances of the shape-changing pneumatic actuators, such as: applying a gradually increasing, distinct pressure to draw attention to specific body parts, but also to break habitual breathing patterns through asynchronous and asymmetric pattern influence; taking a leading or following role in the interaction, sometimes both at once; and acting as a comforting companion, or as a communication channel between two people, as well as between a person and her soma.
Al-Showarah, Suleyman. "Effects of age on smartphone and tablet usability, based on eye-movement tracking and touch-gesture interactions". Thesis, University of Buckingham, 2015. http://bear.buckingham.ac.uk/29/.
Tong, Xin. "Interactive Visual Clutter Management in Scientific Visualization". The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1471612150.
Texto completo da fonteWang, Huang-Jen, e 王皇仁. "The Performance and User Experience of Touch-based Interaction for Older Adults in Editng Tasks". Thesis, 2014. http://ndltd.ncl.edu.tw/handle/26822110490750390857.
Texto completo da fonte義守大學
工業管理學系
102
Advances in technology, improved hygiene, and high-quality care have gradually extended the human life span, turning traditional societies into aging societies. Meanwhile, touch panels are replacing the conventional keyboard and mouse as input devices, driven by the worldwide adoption of smartphones and multi-touch products. In this high-technology age, older adults need basic computer skills to adapt to an ever-changing world, yet learning to use computers and peripherals well is difficult for them because mental and physical condition, and learning ability, decline with age. This study therefore compares touch-based and traditional input in editing operations to determine which is more suitable for older adults. The study analyzes the performance of 15 older subjects using different interfaces. Each subject completed nine combined trials in a 3x3 factorial design with two within-subject factors: input interface (mouse, touchpad, touchscreen) and editing operation (global selection, specific selection, rotation). The results show that the touchscreen is easier to use than the touchpad: subjects spent less time operating it and made fewer errors. The mouse, however, depends on the editing operation: in global selection it did not differ significantly from the touchpad but did shorten operation time, so global selection is the only operation in which the mouse could not be replaced. For rotation, the analysis shows a significant advantage of the touchscreen over the mouse, reducing both operation time and errors; the touchscreen is thus the most suitable interface for rotation.
In addition, the subjective questionnaire factors (fatigue, relaxation, confidence, and distraction) indicate that older adults prefer the touchscreen interface to the other interfaces.
Chen, Hsu-Huan, and 陳緒桓. "Deep-learning SSD-based Smart Touch Interface and Interactive Robot Design". Thesis, 2019. http://ndltd.ncl.edu.tw/handle/cbxmkw.
National Taiwan Ocean University, Department of Electrical Engineering, ROC academic year 107
The main purpose of this thesis is to apply deep-learning object recognition to the control of several devices, including an omnidirectional mobile robot, a delta robot arm, and household appliances. An SSD (Single Shot multibox Detector) neural network is trained to identify the different objects to be controlled so that the corresponding manipulations can be performed. The system architecture consists mainly of a main control console and several controlled devices. The main control module uses a tablet computer combined with the neural network model to process images captured through the front camera. Multiple target objects appearing in the input image are identified by category and position, and the user can connect to and control any identified item of interest on the screen over a Bluetooth interface. The three devices on the current list of controlled devices all use an Arduino Pro Mini as the control core and an HC-05 Bluetooth module to communicate with the console. The omnidirectional mobile robot and the delta robot arm each use three servo motors: the former to drive the vehicle, the latter to operate the arm. When the tablet is connected to the omnidirectional mobile robot, the user can drag it with one finger to move in any direction, or rotate its body posture with two fingers on the screen. If the system is connected to the delta robot, the user can tap pictures of cats or dogs placed under the robot arm, and the arm will classify and collect them. Finally, if connected to the desk lamp, the user can tap the lamp on the screen to switch it on or off.
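The abstract above describes tapping an identified object on the tablet screen to select the device to control. The core of such a selection step is a hit-test from the tap position to an SSD detection box. A minimal sketch follows; the function name, tuple layout, and example labels are illustrative assumptions, not taken from the thesis:

```python
# Hypothetical sketch: map a screen tap to one SSD detection.
# Each detection is a (label, x_min, y_min, x_max, y_max) tuple in screen pixels.

def hit_test(detections, tap_x, tap_y):
    """Return the label of the smallest detection box containing the tap,
    or None if the tap misses every box (smallest box wins on overlap)."""
    hits = []
    for label, xmin, ymin, xmax, ymax in detections:
        if xmin <= tap_x <= xmax and ymin <= tap_y <= ymax:
            hits.append(((xmax - xmin) * (ymax - ymin), label))
    return min(hits)[1] if hits else None

# Example: two detected devices on screen
detections = [
    ("lamp", 100, 100, 200, 200),
    ("robot", 150, 150, 400, 400),
]
print(hit_test(detections, 120, 130))  # tap inside the lamp box only -> lamp
```

Preferring the smallest containing box is one reasonable way to disambiguate a tap that lands inside two overlapping detections, since the smaller box is more likely the intended target.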
Franco, Jéssica Spínola. "Augmented reality selection through smart glasses". Master's thesis, 2019. http://hdl.handle.net/10400.13/2663.
The smart-glasses market continues to grow, raising the possibility that smart glasses will someday be as present in people's daily lives as smartphones are today. Several interaction methods for smart glasses have been studied, but it is not clear which is best for interacting with virtual objects. This research reviews studies of the different interaction methods for augmented reality applications, highlighting the methods available on smart glasses and the advantages and disadvantages of each. In this work, an indoor Augmented Reality prototype was developed implementing three different interaction methods. Users' preferences and their willingness to perform each interaction method in public were studied, and the reaction time, that is, the time between the detection of a marker and the user's interaction with it, was measured. An outdoor Augmented Reality application was also developed to understand the different challenges of indoor versus outdoor Augmented Reality applications. The discussion shows that users feel more comfortable using an interaction method similar to what they already use. However, the solution combining two interaction methods, the smart glasses' tap function and head movement, achieves results close to those of the controller. It is important to note that this was always the users' first session, with no prior training. This suggests that the future of smart-glasses interaction may be a merge of different interaction methods.
Lu, Ming-Kun, and 呂鳴崑. "Multi-Camera Vision-based Finger Detection, Tracking, and Event Identification Techniques for Multi-Touch Sensing and Human Computer Interactive Systems". Thesis, 2012. http://ndltd.ncl.edu.tw/handle/vgss3p.
National Taipei University of Technology, Graduate Institute of Computer Science and Information Engineering, ROC academic year 100
Nowadays, multi-touch technology has become popular. Multi-touch has been implemented in several ways, including resistive and capacitive sensing; because of their limitations, however, these implementations cannot support large screens. This thesis therefore proposes multi-camera vision-based finger detection, tracking, and event identification techniques for multi-touch sensing, together with an implementation. The proposed system detects multiple fingers pressing on an acrylic board by capturing the infrared light through four infrared cameras. The captured infrared points, which correspond to the finger touch points, can serve as an input device and provide a convenient human-computer interface. Compared with conventional touch technology, multi-touch allows users to input complex commands. The proposed multi-touch point detection algorithm identifies the touched points using bright-object segmentation; the extracted bright objects are then tracked and their trajectories recorded. The system analyzes these trajectories and identifies the corresponding events pre-defined in the system. As an application, this thesis aims to provide a simple human-computer interface that users can operate by touching and moving their fingers. The proposed system is implemented with a table-sized screen, which supports multi-user interaction.
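The bright-object segmentation step named in this abstract can be sketched as follows: threshold the IR frame, group bright pixels into connected blobs, and report each blob's centroid as a candidate touch point. This is a minimal, dependency-free illustration of the general technique, not the thesis's actual code; the function name, threshold, and blob-size values are made-up assumptions:

```python
def detect_touch_points(frame, threshold=200, min_area=4):
    """Segment bright pixels in a grayscale frame (a list of pixel rows) and
    return blob centroids as (x, y) tuples -- one candidate touch per blob."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    points = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected bright blob (4-connectivity).
                stack, blob = [(x, y)], []
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    blob.append((cx, cy))
                    for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                        if 0 <= nx < w and 0 <= ny < h and not seen[ny][nx] \
                                and frame[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                if len(blob) >= min_area:  # reject small specular noise
                    points.append((sum(p[0] for p in blob) / len(blob),
                                   sum(p[1] for p in blob) / len(blob)))
    return points

# Example: a synthetic 100x100 frame with two bright 3x3 "finger" blobs.
frame = [[0] * 100 for _ in range(100)]
for y in range(10, 13):
    for x in range(20, 23):
        frame[y][x] = 255
for y in range(60, 63):
    for x in range(70, 73):
        frame[y][x] = 255
print(detect_touch_points(frame))  # -> [(21.0, 11.0), (71.0, 61.0)]
```

In a real pipeline, these per-frame centroids would feed the tracking stage: matching each point to the nearest point in the previous frame to build the trajectories the event identifier consumes.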