Academic literature on the topic 'Mixed reality interfaces'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Mixed reality interfaces.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Mixed reality interfaces"

1

Yoo, Yong-Ho, and Wilhelm Bruns. "Energy Interfaces for Mixed Reality." IFAC Proceedings Volumes 39, no. 3 (2006): 249–54. http://dx.doi.org/10.3182/20060517-3-fr-2903.00140.

2

Lindlbauer, David. "The future of mixed reality is adaptive." XRDS: Crossroads, The ACM Magazine for Students 29, no. 1 (September 2022): 26–31. http://dx.doi.org/10.1145/3558191.

Abstract:
In a future where we replace our smartphones and notebooks with mixed reality headsets, the way we create user interfaces will change drastically. Future interfaces will need to adapt automatically to users' context, guided by optimization-based methods and machine learning, to become beneficial for end-users.
3

White, Martin, Panagiotis Petridis, Fotis Liarokapis, and Daniel Pletinckx. "Multimodal Mixed Reality Interfaces for Visualizing Digital Heritage." International Journal of Architectural Computing 5, no. 2 (June 2007): 321–37. http://dx.doi.org/10.1260/1478-0771.5.2.322.

4

Mountain, David, and Fotis Liarokapis. "Mixed reality (MR) interfaces for mobile information systems." Aslib Proceedings 59, no. 4/5 (July 12, 2007): 422–36. http://dx.doi.org/10.1108/00012530710817618.

5

Poustinchi, Ebrahim. "Mixed Robotic Interface Г : Searching for a hybrid cyber-physical design/experience interface using virtual/actual robots." SHS Web of Conferences 64 (2019): 01008. http://dx.doi.org/10.1051/shsconf/20196401008.

Abstract:
Mixed Robotic Interface is a project-based design-research investigation, studying new ways of creating hybridized cyber-physical design and experience interfaces at the intersection of robotics—as its core component—and augmented reality, game design, projection mapping, and digital fabrication. Mixed Robotic Interface Г, as part of the Mixed Robotic Interface series of research projects, focuses on using “actual” and “virtual” robot arms as a possible creative medium and extension of the design/gaming environment, creating immersive atmospheres for “experiencing” design. This research questions the possibilities of creating an architectural/spatial atmosphere through digitally enhanced experiences. Different from some of the current experiments with augmented reality (AR), virtual reality (VR), and projection mapping in architecture, Mixed Robotic Interface Г is not looking into “immersive” experience as a way to “blur” the boundaries of digital and physical—similar to virtual reality experience with headsets. Instead, Mixed Robotic Interface Г creates a recognizable gap between real and virtual to open up a creative space for the user/audience to be involved between these two mediums. Mixed Robotic Interface Г uses para-fictional storytelling as a way to engage the audience with the experience and to create continuous atmospheric qualities.
6

Osipov, Ilya V. "Cubios Transreality Puzzle as a Mixed Reality Object." International Journal of Virtual and Augmented Reality 1, no. 2 (July 2017): 1–17. http://dx.doi.org/10.4018/ijvar.2017070101.

Abstract:
The author proposes to establish a separate class of electronic mechanical puzzles being the object of “mixed reality”, presents a self-engineered example of such a device, and reviews similar devices produced by other developers. Close relationships of such devices with tangible user interfaces are described. A Cubios device is presented as an illustration of a mixed reality puzzle along with its variants developed by the author. The purpose of this paper is to present a new mixed reality device, review similar devices, propose the classification of such devices, identify their relationships with tangible user interfaces, and discuss the prospects of their development.
7

Andolina, Salvatore, Yi-Ta Hsieh, Denis Kalkofen, Antti Nurminen, Diogo Cabral, Anna Spagnolli, Luciano Gamberini, Ann Morrison, Dieter Schmalstieg, and Giulio Jacucci. "Designing for Mixed Reality Urban Exploration." Interaction Design and Architecture(s), no. 48 (June 10, 2021): 33–49. http://dx.doi.org/10.55612/s-5002-048-002.

Abstract:
This paper introduces a design framework for mixed reality urban exploration (MRUE), based on a concrete implementation in a historical city. The framework integrates different modalities, such as virtual reality (VR), augmented reality (AR), and haptics-audio interfaces, as well as advanced features such as personalized recommendations, social exploration, and itinerary management. It makes it possible to address a number of concerns regarding information overload, safety, and quality of experience that are not sufficiently tackled in traditional non-integrated approaches. This study presents an integrated mobile platform built on top of this framework and reflects on the lessons learned.
8

Nakayama, Angelica, Daniel Ruelas, Jesus Savage, and Ernesto Bribiesca. "Teleoperated Service Robot with an Immersive Mixed Reality Interface." Informatics and Automation 20, no. 6 (September 10, 2021): 1187–223. http://dx.doi.org/10.15622/ia.20.6.1.

Abstract:
Teleoperated service robots can perform more complex and precise tasks as they combine robot skills and human expertise. Communication between the operator and the robot is essential for remote operation and strongly affects system efficiency. Immersive interfaces are being used to enhance the teleoperation experience. However, latency, or time delay, can impair the performance of the robot operation. Since remote visualization involves transmitting a large amount of video data, the challenge is to decrease communication instability. Thus, an efficient teleoperation system must have a suitable operation interface capable of visualizing the remote environment, controlling the robot, and providing a fast response time. This work presents the development of a service robot teleoperation system with an immersive mixed reality operation interface where the operator can visualize the real remote environment or a virtual 3D environment representing it. The virtual environment aims to reduce communication latency by reducing the amount of information sent over the network and to improve the user experience. The robot can perform navigation and simple tasks autonomously or change to the teleoperated mode for more complex tasks. The system was developed using ROS, Unity 3D, and sockets so that it can be exported with ease to different platforms. The experiments suggest that having an immersive operation interface provides improved usability for the operator. The latency appears to improve when using the virtual environment. The user experience seems to benefit from the use of mixed reality techniques; this may lead to the broader use of teleoperated service robot systems.
9

Mourtzis, Dimitris, John Angelopoulos, and Nikos Panopoulos. "Closed-Loop Robotic Arm Manipulation Based on Mixed Reality." Applied Sciences 12, no. 6 (March 14, 2022): 2972. http://dx.doi.org/10.3390/app12062972.

Abstract:
Robotic manipulators have become part of manufacturing systems in recent decades. However, in the realm of Industry 4.0, a new type of manufacturing cell has been introduced—the so-called collaborative manufacturing cell. In such collaborative environments, communication between a human operator and robotic manipulators must be flawless, so that smooth collaboration, i.e., human safety, is ensured constantly. Therefore, engineers have focused on the development of suitable human–robot interfaces (HRI) in order to tackle this issue. This research work proposes a closed-loop framework for the human–robot interface based on the utilization of digital technologies, such as Mixed Reality (MR). Concretely, the framework can be realized as a methodology for the remote and safe manipulation of the robotic arm in near real-time, while, simultaneously, safety zones are displayed in the field of view of the shop-floor technician. The method is based on the creation of a Digital Twin of the robotic arm and the setup of a suitable communication framework for continuous and seamless communication between the user interface, the physical robot, and the Digital Twin. The development of the method is based on the utilization of a ROS (Robot Operating System) for the modelling of the Digital Twin, a Cloud database for data handling, and Mixed Reality (MR) for the Human–Machine Interface (HMI). The developed MR application is tested in a laboratory-based machine shop, incorporating collaborative cells.
10

Pfeiffer, Thies, and Nadine Pfeiffer-Leßmann. "Virtual Prototyping of Mixed Reality Interfaces with Internet of Things (IoT) Connectivity." i-com 17, no. 2 (August 28, 2018): 179–86. http://dx.doi.org/10.1515/icom-2018-0025.

Abstract:
One key aspect of the Internet of Things (IoT) is that human-machine interfaces are disentangled from the physicality of the devices. This provides designers with more freedom, but may also lead to more abstract interfaces, as they lack the natural context created by the presence of the machine. Mixed Reality (MR), on the other hand, is a key technology that enables designers to create user interfaces anywhere, either linked to a physical context (augmented reality, AR) or embedded in a virtual context (virtual reality, VR). Especially today, designing MR interfaces is a challenge, as there is not yet a common design language nor a set of standard functionalities or patterns. In addition, neither customers nor future users have substantial experience in using MR interfaces. Prototypes can help to overcome this gap, as they continuously provide user experiences of increasing realism along the design process. We present ExProtoVAR, a tool that supports quick and lightweight prototyping of MR interfaces for IoT using VR technology.

Dissertations / Theses on the topic "Mixed reality interfaces"

1

Lages, Wallace Santos. "Walk-Centric User Interfaces for Mixed Reality." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/84460.

Abstract:
Walking is a natural part of our lives and is also becoming increasingly common in mixed reality. Wireless headsets and improved tracking systems allow us to easily navigate real and virtual environments by walking. In spite of the benefits, walking brings challenges to the design of new systems. In particular, designers must be aware of cognitive and motor requirements so that walking does not negatively impact the main task. Unfortunately, those demands are not yet fully understood. In this dissertation, we present new scientific evidence, interaction designs, and analysis of the role of walking in different mixed reality applications. We evaluated the difference in performance of users walking vs. manipulating a dataset during visual analysis. This is an important task, since virtual reality is increasingly being used as a way to make sense of progressively complex datasets. Our findings indicate that neither option is absolutely better: the optimal design choice should consider both user's experience with controllers and user's inherent spatial ability. Participants with reasonable game experience and low spatial ability performed better using the manipulation technique. However, we found that walking can still enable higher performance for participants with low spatial ability and without significant game experience. In augmented reality, specifying points in space is an essential step to create content that is registered with the world. However, this task can be challenging when information about the depth or geometry of the target is not available. We evaluated different augmented reality techniques for point marking that do not rely on any model of the environment. We found that triangulation by physically walking between points provides higher accuracy than purely perceptual methods. However, precision may be affected by head pointing tremors. 
To increase the precision, we designed a new technique that uses multiple samples to obtain a better estimate of the target position. This technique can also be used to mark points while walking. The effectiveness of this approach was demonstrated with a controlled augmented reality simulation and actual outdoor tests. Moving into the future, augmented reality will eventually replace our mobile devices as the main method of accessing information. Nonetheless, to achieve its full potential, augmented reality interfaces must support the fluid way we move in the world. We investigated the potential of adaptation in achieving this goal. We conceived and implemented an adaptive workspace system, based on a study of the design space and on contextual user studies. Our final design consists of a minimal set of techniques to support mobility and integration with the real world. We also identified a set of key interaction patterns and desirable properties of adaptation-based techniques, which can be used to guide the design of the next generation of walking-centered workspaces.
2

Lacoche, Jérémy. "Plasticity for user interfaces in mixed reality." Thesis, Rennes 1, 2016. http://www.theses.fr/2016REN1S034/document.

Abstract:
This PhD thesis focuses on plasticity for Mixed Reality (MR) user interfaces, which include Virtual Reality (VR), Augmented Reality (AR), and Augmented Virtuality (AV) applications. Today, there is growing interest in this kind of application thanks to the generalization of devices such as head-mounted displays, depth sensors, and tracking systems. Mixed Reality applications can be used in a wide variety of domains such as entertainment, data visualization, education and training, and engineering. Plasticity refers to the capacity of an interactive system to withstand variations of both the system's physical characteristics and the environment while preserving its usability. Usability continuity of a plastic interface is ensured whatever the context of use. We therefore propose a set of software models, integrated in a software solution named 3DPlasticToolkit, which allow any developer to create plastic MR user interfaces. First, we propose three models for describing adaptation sources: a model for display devices and interaction devices, a model for users and their preferences, and a model for data structure and semantics. These adaptation sources are taken into account by an adaptation process that deploys application components adapted to the context of use thanks to a scoring system. The deployment of these components lets the system adapt both the application's interaction techniques and its content presentation. We also propose a redistribution process that allows the end user to change the distribution of application components across three dimensions: display, user, and platform. It thus allows the end user to switch platforms dynamically or to combine multiple platforms. The implementation of these models in 3DPlasticToolkit provides developers with a ready-to-use solution that already integrates current Mixed Reality devices and includes multiple interaction techniques, visual effects, and data visualization metaphors.
3

Marchesi, Marco. "Advanced Technologies for Human-Computer Interfaces in Mixed Reality." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amsdottorato.unibo.it/7522/.

Abstract:
As human beings, we trust our five senses, which allow us to experience the world and communicate. Since our birth, the amount of data we can acquire every day is impressive, and such richness reflects the complexity of humankind in arts, technology, etc. The advent of computers and the consequent progress in Data Science and Artificial Intelligence showed how large amounts of data can contain some sort of “intelligence” themselves. Machines learn and create a superimposed layer of reality. How are data generated by humans and machines related today? To give an answer, we present three projects in the context of “Mixed Reality”, the ideal place where Reality, Virtual Reality, and Augmented Reality are increasingly connected as data enhance the digital experiences, making them more “real”. We start with BRAVO, a tool that exploits brain activity to improve the user's learning process in real time by means of a Brain-Computer Interface that acquires EEG data. Then we present AUGMENTED GRAPHICS, a framework for detecting objects in the real world so that they can easily be captured and inserted into any digital scenario. Based on the theory of moment invariants, it is particularly suited to mobile devices, as it assumes a lightweight concept of object detection and works without any training set. The third project is GLOVR, a wearable hand controller that uses inertial sensors to offer directional controls and to recognize gestures, particularly suitable for Virtual Reality applications. It features a microphone to record voice sequences that are then translated into tasks by means of a natural language web service. For each project we summarize the main results and trace some future directions of research and development.
4

Yoo, Yong Ho. "Mixed Reality Design Using Unified Energy Interfaces." Aachen: Shaker, 2007. http://d-nb.info/1166511200/34.

5

Englmeier, David. "Spherical tangible user interfaces in mixed reality." Doctoral thesis, supervised by Andreas Butz. München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2021. http://d-nb.info/1238017150/34.

6

Panahi, Aliakbar. "Big Data Visualization Platform for Mixed Reality." VCU Scholars Compass, 2017. https://scholarscompass.vcu.edu/etd/5198.

Abstract:
The visualization of data helps to provide faster and deeper insight into the data. In this work, a system for visualizing and analyzing big data in an interactive mixed reality environment is proposed. Such a system can be used for representing different types of data, such as temporal, geospatial, network graph, and high-dimensional data. Implementations of this system were created for four data types (network, volumetric, high-dimensional, and spectral data) on mixed reality devices such as Microsoft HoloLens, Oculus Rift, Samsung Gear VR, and Android ARCore. It was shown that such a system could store and use billions of samples and represent millions of them at once.
7

Yannier, Nesra. "Bridging Physical and Virtual Learning: A Mixed-Reality System for Early Science." Research Showcase @ CMU, 2016. http://repository.cmu.edu/dissertations/752.

Abstract:
Tangible interfaces and mixed-reality environments have potential to bring together the advantages of physical and virtual environments to improve children’s learning and enjoyment. However, there are too few controlled experiments that investigate whether interacting with physical objects in the real world accompanied by interactive feedback may actually improve student learning compared to flat-screen interaction. Furthermore, we do not have a sufficient empirical basis for understanding how a mixed-reality environment should be designed to maximize learning and enjoyment for children. I created EarthShake, a mixed-reality game bridging physical and virtual worlds via a Kinect depth-camera and a specialized computer vision algorithm to help children learn physics. I have conducted three controlled experiments with EarthShake that have identified features that are more and less important to student learning and enjoyment. The first experiment examined the effect of observing physical phenomena and collaboration (pairs versus solo), while the second experiment replicated the effect of observing physical phenomena while also testing whether adding simple physical control, such as shaking a tablet, improves learning and enjoyment. The experiments revealed that observing physical phenomena in the context of a mixed-reality game leads to significantly more learning (5 times more) and enjoyment compared to equivalent screen-only versions, while adding simple physical control or changing group size (solo or pairs) do not have significant effects. Furthermore, gesture analysis provides insight as to why experiencing physical phenomena may enhance learning. My thesis work further investigates what features of a mixed-reality system yield better learning and enjoyment, especially in the context of limited experimental results from other mixed-reality learning research. 
Most mixed-reality environments, including tangible interfaces (where users manipulate physical objects to create an interactive output), currently emphasize open-ended exploration and problem solving, and are claimed to be most effective when used in a discovery-learning mode with minimal guidance. I investigated how critical interactive guidance and feedback (e.g., a predict/observe/explain prompting structure with interactive feedback) are to learning and enjoyment, in the context of EarthShake. In a third experiment, I compared the learning and enjoyment outcomes of children interacting with a version of EarthShake that supports guided discovery, another version that supports exploration in discovery-learning mode, and a version that combines guided discovery and exploration. The results of the experiment reveal that the guided-discovery and combined conditions, where children engage in guided-discovery activities with the predict-observe-explain cycle and interactive feedback, yield better explanation and reasoning. Thus, having guided discovery in a mixed-reality environment helps children formulate explanatory theories. However, the results also suggest that children are able to activate explanatory theory in action better when the guided-discovery activities are combined with exploratory activities in the mixed-reality system. Adding exploration to guided-discovery activities not only fosters better learning of the balance/physics principles but also better application of those principles in a hands-on, constructive problem-solving task. My dissertation contributes to the literature on the effects of physical observation and mixed-reality interaction on students' science learning outcomes in learning technologies.
Specifically, I have shown that a mixed-reality system (i.e., combining physical and virtual environments) can lead to superior learning and enjoyment outcomes than screen-only alternatives, based on different measures. My work also contributes to the literature of exploration and guided-discovery learning, by demonstrating that having guided-discovery activities in a mixed-reality setting can improve children’s fundamental principle learning by helping them formulate explanations. It also shows that combining an engineering approach with scientific thinking practice (by combining exploration and guided-discovery activities) can lead to better engineering outcomes such as transferring to constructive hands-on activities in the real world. Lastly, my work aims to make a contribution from the design perspective by creating a new mixed-reality educational system that bridges physical and virtual environments to improve children’s learning and enjoyment in a collaborative way, fostering productive dialogue and scientific curiosity in museum and school settings, through an iterative design methodology to ensure effective learning and enjoyment outcomes in these settings.
8

Dahl, Tyler. "Real-Time Object Removal in Augmented Reality." DigitalCommons@CalPoly, 2018. https://digitalcommons.calpoly.edu/theses/1905.

Abstract:
Diminished reality, as a sub-topic of augmented reality where digital information is overlaid on an environment, is the perceived removal of an object from an environment. Previous approaches to diminished reality used digital replacement techniques, inpainting, and multi-view homographies. However, few used a virtual representation of the real environment, limiting their domains to planar environments. This thesis provides a framework to achieve real-time diminished reality on an augmented reality headset. Using state-of-the-art hardware, we combine a virtual representation of the real environment with inpainting to remove existing objects from complex environments. Our work is found to be competitive with previous results, with a similar qualitative outcome under the limitations of available technology. Additionally, by implementing new texturing algorithms, a more detailed representation of the real environment is achieved.
9

Ens, Barrett. "Spatial Analytic Interfaces." ACM, 2014. http://hdl.handle.net/1993/31595.

Abstract:
We propose the concept of spatial analytic interfaces (SAIs) as a tool for performing in-situ, everyday analytic tasks. Mobile computing is now ubiquitous and provides access to information at nearly any time or place. However, current mobile interfaces do not easily enable the type of sophisticated analytic tasks that are now well supported by desktop computers. Conversely, desktop computers, with large available screen space to view multiple data visualizations, are not always available at the ideal time and place for a particular task. Spatial user interfaces, leveraging state-of-the-art miniature and wearable technologies, can potentially provide intuitive computer interfaces that deal with the complexity needed to support everyday analytic tasks. These interfaces can be implemented with versatile form factors that provide mobility for doing such taskwork in situ, that is, at the ideal time and place. We explore the design of spatial analytic interfaces for in-situ analytic tasks that leverage the benefits of an upcoming generation of lightweight, see-through, head-worn displays. We propose how such a platform can meet the five primary design requirements for personal visual analytics: mobility, integration, interpretation, multiple views, and interactivity. We begin with a design framework for spatial analytic interfaces based on a survey of existing designs of spatial user interfaces. We then explore how to best meet these requirements through a series of design concepts, user studies, and prototype implementations. Our result is a holistic exploration of the spatial analytic concept on a head-worn display platform.
10

Pederson, Thomas. "From Conceptual Links to Causal Relations — Physical-Virtual Artefacts in Mixed-Reality Space." Doctoral thesis, Umeå University, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-137.


Books on the topic "Mixed reality interfaces"

1

Shumaker, Randall. Virtual Augmented and Mixed Reality. Designing and Developing Augmented and Virtual Environments: 5th International Conference, VAMR 2013, Held as Part of HCI International 2013, Las Vegas, NV, USA, July 21-26, 2013, Proceedings, Part I. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

2

Virtual, Augmented and Mixed Reality: Applications of Virtual and Augmented Reality. Springer, 2014.

3

Shumaker, Randall. Virtual, Augmented and Mixed Reality: Systems and Applications. Springer, 2013.

4

Shumaker, Randall, and Stephanie Lackey. Virtual, Augmented and Mixed Reality: 7th International Conference, VAMR 2015, Held as Part of HCI International 2015, Los Angeles, CA, USA, August ... Springer, 2015.

5

Shumaker, Randall, and Stephanie Lackey. Virtual, Augmented and Mixed Reality: Designing and Developing Augmented and Virtual Environments: 6th International Conference, VAMR 2014, Held as ... I. Springer, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Shumaker, Randall, and Stephanie Lackey. Virtual, Augmented and Mixed Reality: 7th International Conference, VAMR 2015, Held As Part of HCI International 2015, Los Angeles, CA, USA, August 2-7, 2015, Proceedings. Springer London, Limited, 2015.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Shumaker, Randall, and Stephanie Lackey. Virtual, Augmented and Mixed Reality : Designing and Developing Augmented and Virtual Environments: 6th International Conference, VAMR 2014, Held As Part of HCI International 2014, Heraklion, Crete, Greece, June 22-27, 2014, Proceedings, Part I. Springer London, Limited, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Mixed reality interfaces"

1

Ishii, Hiroshi. "Tangible Bits: Coupling Physicality and Virtuality Through Tangible User Interfaces." In Mixed Reality, 229–47. Berlin, Heidelberg: Springer Berlin Heidelberg, 1999. http://dx.doi.org/10.1007/978-3-642-87512-0_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Alce, Günter, Philip Alm, Rikard Tyllström, Anthony Smoker, and Diederick C. Niehorster. "Design and Evaluation of Three User Interfaces for Detecting Unmanned Aerial Vehicles Using Virtual Reality." In Virtual Reality and Mixed Reality, 36–49. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16234-3_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Williamson, John, and Roderick Murray-Smith. "Multimodal Excitatory Interfaces with Automatic Content Classification." In The Engineering of Mixed Reality Systems, 233–50. London: Springer London, 2009. http://dx.doi.org/10.1007/978-1-84882-733-2_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Campos, Pedro, and Sofia Pessanha. "Designing Augmented Reality Tangible Interfaces for Kindergarten Children." In Virtual and Mixed Reality - New Trends, 12–19. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22021-0_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Salles Dias, José Miguel, Pedro Santos, and Rafael Bastos. "Gesturing with Tangible Interfaces for Mixed Reality." In Gesture-Based Communication in Human-Computer Interaction, 399–408. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-24598-8_37.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Dubois, Emmanuel, Guillaume Gauffre, Cédric Bach, and Pascal Salembier. "Participatory Design Meets Mixed Reality Design Models." In Computer-Aided Design of User Interfaces V, 71–84. Dordrecht: Springer Netherlands, 2007. http://dx.doi.org/10.1007/978-1-4020-5820-2_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Touyama, Hideaki. "Towards Noninvasive Brain-Computer Interfaces during Standing for VR Interactions." In Virtual and Mixed Reality - New Trends, 290–94. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22021-0_32.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Dennison, Mark, Mark Mittrick, John Richardson, Theron Trout, Adrienne Raglin, Eric Heilman, and Timothy Hanratty. "Evaluation of Immersive Interfaces for Tactical Decision Support." In Virtual, Augmented and Mixed Reality. Multimodal Interaction, 428–40. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-21607-8_33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sylaiou, Stella, Vassilis Killintzis, Ioannis Paliokas, Katerina Mania, and Petros Patias. "Usability Evaluation of Virtual Museums’ Interfaces Visualization Technologies." In Virtual, Augmented and Mixed Reality. Applications of Virtual and Augmented Reality, 124–33. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07464-1_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Teixeira, Luís, Emília Duarte, Júlia Teles, and Francisco Rebelo. "Evaluation of Human Performance Using Two Types of Navigation Interfaces in Virtual Reality." In Virtual and Mixed Reality - New Trends, 380–86. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22021-0_42.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Mixed reality interfaces"

1

Chrysanthou, Antrea, Styliani Kleanthous, and Elena Matsi. "Interacting in mixed reality." In IUI '20: 25th International Conference on Intelligent User Interfaces. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3377325.3377532.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Norman, Mitchell, Gun Lee, Ross T. Smith, and Mark Billinghurst. "A Mixed Presence Collaborative Mixed Reality System." In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2019. http://dx.doi.org/10.1109/vr.2019.8797966.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Oduola, Cassandra. "Assessing Empathy through Mixed Reality." In IUI'16: 21st International Conference on Intelligent User Interfaces. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2876456.2876466.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Plecher, David A., Maximilian Wandinger, and Gudrun Klinker. "Mixed Reality for Cultural Heritage." In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2019. http://dx.doi.org/10.1109/vr.2019.8797846.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Back, Regina, David A. Plecher, Rainer Wenrich, Birgit Dorner, and Gudrun Klinker. "Mixed Reality in Art Education." In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2019. http://dx.doi.org/10.1109/vr.2019.8798101.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Liarokapis, Fotis, and Robert M. Newman. "Design experiences of multimodal mixed reality interfaces." In the 25th annual ACM international conference. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1297144.1297152.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Rebol, Manuel, Claudia Ranniger, Colton Hood, Erin Horan, Adam Rutenberg, Neal Sikka, Yasser Ajabnoor, Safinaz Alshikah, and Krzysztof Pietroszek. "Mixed Reality Communication System for Procedural Tasks." In AVI 2022: International Conference on Advanced Visual Interfaces. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3531073.3534497.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Bitzas, Dimitrios, Sokratis Zouras, Agapi Chrysanthakopoulou, Dimitrios Laskos, Konstantinos Kalatzis, Michail Pavlou, Ioanna Balasi, and Konstantinos Moustakas. "VitaZ: Gamified Mixed Reality Multisensorial Interactions." In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2019. http://dx.doi.org/10.1109/vr.2019.8798133.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Laviola, Enricoandrea, and Antonio E. Uva. "From Lab to Reality: Optimization of Industrial Augmented Reality Interfaces." In 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE, 2022. http://dx.doi.org/10.1109/ismar-adjunct57072.2022.00208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Lindlbauer, David, Anna Maria Feit, and Otmar Hilliges. "Context-Aware Online Adaptation of Mixed Reality Interfaces." In UIST '19: The 32nd Annual ACM Symposium on User Interface Software and Technology. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3332165.3347945.

Full text
APA, Harvard, Vancouver, ISO, and other styles