Journal articles on the topic '3D IMMERSIVE TOOL'

Consult the top 50 journal articles for your research on the topic '3D IMMERSIVE TOOL.'

1

Medeiros, Daniel, Felipe Carvalho, Lucas Teixeira, Priscilla Braz, Alberto Raposo, and Ismael Santos. "Proposal and evaluation of a tablet-based tool for 3D virtual environments." Journal on Interactive Systems 4, no. 2 (January 29, 2014): 1. http://dx.doi.org/10.5753/jis.2013.633.

Abstract:
The introduction of embedded sensors in smartphones and tablets has allowed these devices to be used to interact with virtual environments. These devices can also display additional information and perform naturally non-immersive tasks. This work presents a tablet-based 3D interaction tool that aggregates all major 3D interaction tasks, such as navigation, selection, manipulation, system control and symbolic input. The tool targets general-purpose systems as well as engineering applications. Such applications generally use specific interaction devices with four or more degrees of freedom, plus a common keyboard and mouse for tasks that are naturally non-immersive, such as symbolic input (e.g., text or number entry). This article proposes a new tablet-based device that can perform all these major tasks in an immersive environment. It also presents a case study of the device and some user tests.
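The abstract does not say how its selection task is implemented; one common way such selection is done from a touch screen is to cast a ray from the touch point into the scene and pick the nearest hit object. The sketch below illustrates that generic approach only, with an invented toy scene, and is not the authors' implementation.

```python
import numpy as np

def touch_to_ray(touch_uv, cam_pos, cam_forward, cam_right, cam_up, fov_deg, aspect):
    """Convert a normalized tablet touch point (u, v in [0, 1]) into a world-space ray."""
    u, v = touch_uv
    half_h = np.tan(np.radians(fov_deg) / 2.0)
    half_w = half_h * aspect
    direction = (cam_forward
                 + (2.0 * u - 1.0) * half_w * cam_right
                 + (1.0 - 2.0 * v) * half_h * cam_up)
    return cam_pos, direction / np.linalg.norm(direction)

def pick_object(origin, direction, objects):
    """Return the closest object whose bounding sphere the ray hits, or None."""
    best, best_t = None, np.inf
    for name, center, radius in objects:
        oc = center - origin
        t = np.dot(oc, direction)        # distance along the ray to the closest point
        d2 = np.dot(oc, oc) - t * t      # squared distance from sphere centre to ray
        if t > 0 and d2 <= radius ** 2 and t < best_t:
            best, best_t = name, t
    return best

# Hypothetical scene: (name, bounding-sphere centre, radius)
scene = [("valve", np.array([0.0, 0.0, 5.0]), 0.5),
         ("pipe",  np.array([1.5, 0.2, 7.0]), 0.8)]
origin, direction = touch_to_ray((0.5, 0.5), np.zeros(3),
                                 np.array([0.0, 0.0, 1.0]),
                                 np.array([1.0, 0.0, 0.0]),
                                 np.array([0.0, 1.0, 0.0]), 60.0, 16 / 9)
print(pick_object(origin, direction, scene))   # -> "valve"
```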
2

Avola, Danilo, Luigi Cinque, and Daniele Pannone. "Design of a 3D Platform for Immersive Neurocognitive Rehabilitation." Information 11, no. 3 (February 28, 2020): 134. http://dx.doi.org/10.3390/info11030134.

Abstract:
In recent years, advancements in human–computer interaction (HCI) have enabled the development of versatile immersive devices, including Head-Mounted Displays (HMDs). These devices are usually used for entertainment activities such as video gaming or augmented/virtual reality applications for tourism or learning purposes. However, HMDs, together with the design of ad-hoc exercises, can also be used to support rehabilitation tasks, including neurocognitive rehabilitation after strokes, traumatic brain injuries, or brain surgeries. In this paper, a tool for immersive neurocognitive rehabilitation is presented. The tool allows therapists to create and set up 3D rooms that simulate home environments in which patients can perform tasks from their everyday life (e.g., find a key, set a table, do numerical exercises). The tool lets therapists implement the different exercises on the basis of a random mechanism by which different parameters (e.g., object positions, task complexity) can change over time, thus stimulating the problem-solving skills of patients. The latter aspect plays a key role in neurocognitive rehabilitation. Experiments conducted with 35 real patients, and comparative evaluations by five therapists of the proposed tool against traditional neurocognitive rehabilitation methods, highlight remarkable results in terms of motivation, acceptance, and usability as well as recovery of lost skills.
3

Sadeghi, Amir H., Wouter Bakhuis, Frank Van Schaagen, Frans B. S. Oei, Jos A. Bekkers, Alexander P. W. M. Maat, Edris A. F. Mahtab, Ad J. J. C. Bogers, and Yannick J. H. J. Taverne. "Immersive 3D virtual reality imaging in planning minimally invasive and complex adult cardiac surgery." European Heart Journal - Digital Health 1, no. 1 (November 1, 2020): 62–70. http://dx.doi.org/10.1093/ehjdh/ztaa011.

Abstract:
Aims: Increased complexity in cardiac surgery over the last decades necessitates more precise preoperative planning to minimize operating time, to limit the risk of complications during surgery and to aim for the best possible patient outcome. Novel, more realistic, and more immersive techniques, such as three-dimensional (3D) virtual reality (VR), could potentially contribute to the preoperative planning phase. This study presents our initial experience with the implementation of immersive VR technology as a complementary, research-based imaging tool for preoperative planning in cardiothoracic surgery. In addition, the essentials for setting up and implementing a VR platform are described.
Methods: Six patients who underwent cardiac surgery at the Erasmus Medical Center, Rotterdam, The Netherlands, between March 2020 and August 2020 were included, based on request by the surgeon and availability of computed tomography images. After 3D VR rendering and 3D segmentation of specific structures, the reconstruction was analysed via a head-mounted display. All participating surgeons (n = 5) filled out a questionnaire to evaluate the use of VR as a preoperative planning tool for surgery.
Conclusion: Our study demonstrates that immersive 3D VR visualization of anatomy might be beneficial as a supplementary preoperative planning tool for cardiothoracic surgery, and further research on this topic may be considered to implement this innovative tool in daily clinical practice.
Lay summary: Over the past decades, surgery on the heart and vessels has become more and more complex, necessitating more precise and accurate preoperative planning. Nowadays, operative planning is performed on flat, two-dimensional computer screens, which requires a lot of spatial and three-dimensional (3D) thinking from the surgeon. Since immersive 3D virtual reality (VR) is an emerging imaging technique with promising results in other fields of surgery, this study aimed to explore the additional value of the technique in heart surgery. Our surgeons planned six different heart operations by visualizing computed tomography scans with a dedicated VR headset, enabling them to view the patient's anatomy in an immersive 3D environment. The outcomes of this preliminary study are positive, offering the surgeon a much more lifelike simulation. As such, VR could potentially be beneficial as a preoperative planning tool for complex heart surgery.
4

Papadopoulou, A., D. Kontos, and A. Georgopoulos. "DEVELOPING A VR TOOL FOR 3D ARCHITECTURAL MEASUREMENTS." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVI-2/W1-2022 (February 25, 2022): 421–27. http://dx.doi.org/10.5194/isprs-archives-xlvi-2-w1-2022-421-2022.

Abstract:
Virtual Reality technology has already matured and is capable of offering impressive immersive experiences. At the same time, head-mounted displays (HMDs), together with game engine environments, offer many possibilities. So far, as far as the domain of Cultural Heritage is concerned, these impressive technologies have mainly been applied to increase the popularity of online visits and to develop serious games. In this paper we present the development of a set of VR tools that enable the user to perform accurate measurements within the immersive environment. We believe these tools will be very helpful and will appeal to experts who need such measurements, as they can perform them in the laboratory instead of visiting the object itself. This toolbox includes measuring the coordinates of single points in 3D space, measuring three-dimensional distances and producing horizontal or vertical cross sections. The first two have already been presented previously (Kontos & Georgopoulos 2020), and this paper focuses on evaluating the performance of the toolbox in determining cross sections. The development of the tool is explained in detail, and the resulting cross sections of the 3D model of the Holy Aedicule are compared to real measurements performed geodetically. The promising results are discussed and evaluated.
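For readers unfamiliar with the measurement tasks named above, the sketch below shows the underlying geometry in Python: the 3D distance between two picked points and the intersection of mesh edges with a horizontal section plane. It is a minimal illustration of the mathematics only, not the authors' game-engine implementation, and the sample coordinates are invented.

```python
import numpy as np

def distance_3d(p, q):
    """Euclidean distance between two measured 3D points."""
    return float(np.linalg.norm(np.asarray(p, float) - np.asarray(q, float)))

def horizontal_section(edges, z_cut):
    """Intersect mesh edges with the horizontal plane z = z_cut.

    edges: iterable of (p, q) vertex pairs; returns the intersection points,
    which chained together form the cross-section polyline.
    """
    points = []
    for p, q in edges:
        p, q = np.asarray(p, float), np.asarray(q, float)
        if (p[2] - z_cut) * (q[2] - z_cut) <= 0 and p[2] != q[2]:
            t = (z_cut - p[2]) / (q[2] - p[2])      # linear interpolation parameter
            points.append(p + t * (q - p))
    return points

print(distance_3d((0, 0, 0), (3, 4, 0)))                   # 5.0
print(horizontal_section([((0, 0, 0), (0, 0, 2))], 1.0))   # [array([0., 0., 1.])]
```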
5

Byrd, B., M. Warren, J. Fenwick, and P. Bridge. "Development of a novel 3D immersive visualisation tool for manual image matching." Journal of Radiotherapy in Practice 18, no. 4 (May 2, 2019): 318–22. http://dx.doi.org/10.1017/s1460396919000219.

Abstract:
Aim: The novel Volumetric Image Matching Environment for Radiotherapy (VIMER) was developed to allow users to view both computed tomography (CT) and cone-beam CT (CBCT) datasets within the same 3D model in virtual reality (VR) space. Stereoscopic visualisation of both datasets, combined with custom slicing tools and complete freedom in motion, enables alternative inspection and matching of the datasets for image-guided radiotherapy (IGRT).
Material and methods: A qualitative study was conducted to explore the challenges and benefits of VIMER with respect to image registration. Following training and use of the software, an interview session was conducted with a sample group of six university staff members with clinical experience in image matching.
Results: User discomfort and frustration stemmed from unfamiliarity with the drastically different input tools and matching interface. As the primary advantage, the users reported match inspection efficiency when presented with the 3D volumetric renderings of the planning and secondary CBCT datasets.
Findings: This study provided initial evidence for the achievable benefits and limitations to consider when implementing a 3D voxel-based dataset comparison VR tool, including a need for extensive training and the minimal interruption to IGRT workflow. Key advantages include efficient 3D anatomical interpretation and the capability for volumetric matching.
6

Astaneh Asl, Bita, and Carrie Sturts Dossick. "Immersive VR versus BIM for AEC Team Collaboration in Remote 3D Coordination Processes." Buildings 12, no. 10 (September 27, 2022): 1548. http://dx.doi.org/10.3390/buildings12101548.

Abstract:
Building Information Modeling (BIM) and Virtual Reality (VR) are both tools for collaboration and communication, yet questions still exist as to how and in what ways these tools support technical communication and team decision-making. This paper presents the results of an experimental research study that examined multidisciplinary Architecture, Engineering, and Construction (AEC) team collaboration efficiency in remote asynchronous and synchronous communication methods for 3D coordination processes by comparing BIM and immersive VR both with markup tools. Team collaboration efficiency was measured by Shared Understanding, a psychological method based on Mental Models. The findings revealed that the immersive experience in VR and its markup tool capabilities, which enabled users to draw in a 360-degree environment, supported team communication more than the BIM markup tool features, which allowed only one user to draw on a shared 2D screenshot of the model. However, efficient team collaboration in VR required the members to properly guide each other in the 360-degree environment; otherwise, some members were not able to follow the conversations.
7

Li, Yuxuan, Hua Luo, and Yiren Zhou. "Design and Implementation of Virtual Campus Roaming System Based on Unity3d." Journal of Physics: Conference Series 2173, no. 1 (January 1, 2022): 012038. http://dx.doi.org/10.1088/1742-6596/2173/1/012038.

Abstract:
With the rapid development of Internet technology and the maturing of 5G technology, virtual reality is gradually entering the public view, and the range of fields it touches is expanding. Using virtual reality technology and a head-mounted display, users' immersion and sense of realism can be improved to the greatest extent. In this paper, an immersive virtual campus roaming system is implemented, taking Nanchang Institute of Technology as an example: 3ds Max is used to create the models, Unity 3D to build the scene, C# to write the human-computer interaction scripts, and an Action One headset as the display device.
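The roaming interaction described above is implemented by the authors as C# scripts in Unity; the core of such a script is a per-frame first-person camera update. The Python sketch below shows only that generic update logic, with invented key names and parameters, and is not the authors' code.

```python
import math

def update_camera(pos, yaw_deg, keys, mouse_dx, speed=3.0, turn=0.1, dt=0.016):
    """One frame of first-person roaming: turn with the mouse, move with W/A/S/D."""
    yaw_deg += mouse_dx * turn                      # horizontal look
    yaw = math.radians(yaw_deg)
    forward = (math.sin(yaw), 0.0, math.cos(yaw))
    right = (math.cos(yaw), 0.0, -math.sin(yaw))
    move_f = (1 if "W" in keys else 0) - (1 if "S" in keys else 0)
    move_r = (1 if "D" in keys else 0) - (1 if "A" in keys else 0)
    x = pos[0] + (forward[0] * move_f + right[0] * move_r) * speed * dt
    z = pos[2] + (forward[2] * move_f + right[2] * move_r) * speed * dt
    return (x, pos[1], z), yaw_deg                  # keep eye height constant

pos, yaw = (0.0, 1.7, 0.0), 0.0
pos, yaw = update_camera(pos, yaw, keys={"W"}, mouse_dx=5.0)
print(pos, yaw)
```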
8

Guindy, Mary, Attila Barsi, Peter A. Kara, Vamsi K. Adhikarla, Tibor Balogh, and Aniko Simon. "Camera Animation for Immersive Light Field Imaging." Electronics 11, no. 17 (August 27, 2022): 2689. http://dx.doi.org/10.3390/electronics11172689.

Abstract:
Among novel capture and visualization technologies, light field has made significant progress in the current decade, bringing closer its emergence in everyday use cases. Unlike many other forms of 3D displays and devices, light field visualization does not depend on any viewing equipment. Regarding its potential use cases, light field is applicable to both cinematic and interactive contents. Such contents often rely on camera animation, which is a frequent tool for the creation and presentation of 2D contents. However, while common 3D camera animation is often rather straightforward, light field visualization has certain constraints that must be considered before implementing any variation of such techniques. In this paper, we introduce our work on camera animation for light field visualization. Different types of conventional camera animation were applied to light field contents, which produced an interactive simulation. The simulation was visualized and assessed on a real light field display, the results of which are presented and discussed in this paper. Additionally, we tested different forms of realistic physical camera motion in our study, and based on our findings, we propose multiple metrics for the quality evaluation of light field visualization in the investigated context and for the assessment of plausibility.
9

Lütjens, Mona, Thomas Kersten, Boris Dorschel, and Felix Tschirschwitz. "Virtual Reality in Cartography: Immersive 3D Visualization of the Arctic Clyde Inlet (Canada) Using Digital Elevation Models and Bathymetric Data." Multimodal Technologies and Interaction 3, no. 1 (February 20, 2019): 9. http://dx.doi.org/10.3390/mti3010009.

Abstract:
Due to rapid technological development, virtual reality (VR) is becoming an accessible and important tool for many applications in science, industry, and the economy. Being immersed in a 3D environment offers numerous advantages, especially for the presentation of geographical data that is usually depicted in 2D maps or pseudo-3D models on a monitor screen. This study investigated advantages, limitations, and possible applications for immersive and intuitive 3D terrain visualizations in VR. Additionally, in view of ever-increasing data volumes, this study developed a workflow to present large-scale terrain datasets in VR on current mid-end computers. The developed immersive VR application depicts the Arctic fjord Clyde Inlet in its 160 km × 80 km dimensions at 5 m spatial resolution. Techniques such as level-of-detail algorithms, tiling, and level streaming were applied to run the dataset, more than one gigabyte in size, at an acceptable frame rate. The immersive VR application offered the possibility to explore the terrain with or without the water surface using various modes of locomotion. Terrain textures could also be altered and measurements conducted to obtain the information needed for further terrain analysis. The potential of VR was assessed in a user survey of persons from six different professions.
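The level-of-detail and tiling techniques mentioned above follow a simple rule: terrain tiles near the viewer get high-resolution meshes and distant tiles get coarser ones. Below is a minimal, engine-agnostic sketch of that selection step; the tile size and distance thresholds are invented for illustration and are not taken from the paper.

```python
import math

TILE_SIZE = 1000.0                      # metres per terrain tile (assumed)
LOD_THRESHOLDS = [2000.0, 6000.0]       # distance cut-offs for LOD 0/1/2 (assumed)

def tile_lod(camera_xy, tile_index):
    """Choose a level of detail for one tile from its distance to the camera."""
    cx = (tile_index[0] + 0.5) * TILE_SIZE
    cy = (tile_index[1] + 0.5) * TILE_SIZE
    dist = math.hypot(cx - camera_xy[0], cy - camera_xy[1])
    for lod, limit in enumerate(LOD_THRESHOLDS):
        if dist < limit:
            return lod                  # 0 = full resolution, larger = coarser
    return len(LOD_THRESHOLDS)

# Example: which LOD would tile (5, 2) get for a camera at (1200, 800)?
print(tile_lod((1200.0, 800.0), (5, 2)))
```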
10

Shojaei, Alireza, Saeed Rokooei, Amirsaman Mahdavian, Lee Carson, and George Ford. "Using immersive video technology for construction management content delivery: a pilot study." Journal of Information Technology in Construction 26 (November 4, 2021): 886–901. http://dx.doi.org/10.36680/j.itcon.2021.047.

Abstract:
Construction management is considered a hands-on field of study that requires good spatial and visual cognitive ability. Virtual reality and other innovative immersive technologies have been used to facilitate experiential learning and to improve students' spatial cognitive abilities. Virtual environments have been criticized for their gamified look. Static panorama pictures have previously been used in construction education to bring a better sense of reality and immersion at the same time. However, they cannot provide a continuous experience, and the sense of presence (immersion) is not ideal either. Immersive videos such as 360-degree videos can address this shortfall by providing a continuous experience and a better sense of presence. The use of this technology in the construction education field is very limited. As a result, this study investigated a pilot experiment in which a combination of 360-degree, 180-degree 3D, and flat videos was incorporated as an educational instrument for delivering construction management content. The content was recorded using different configurations and body postures to further investigate the optimal way of utilizing this technology for content delivery. The videos focused on construction means and methods. Students reviewed the content using head-mounted display devices and laptop screens and answered a survey designed to capture their perception and experience of using this technology as an educational tool in the construction management field. The results show a positive perception toward using immersive videos in construction education. Furthermore, the students preferred the head-mounted display as their favorite delivery method. As a result, the prospect of incorporating immersive videos to enhance construction management education is promising.
11

Orghidan, Radu, Mihaela Gordan, Marius Danciu, and Aurel Vlaicu. "A Prototype for the Creation and Interactive Visualization of 3D Human Face Models." Advanced Engineering Forum 8-9 (June 2013): 45–54. http://dx.doi.org/10.4028/www.scientific.net/aef.8-9.45.

Abstract:
This paper presents a complete solution for the creation and manipulation of the 3D representation of a human face model using off the shelf components. First, the 3D model is obtained by means of a structured light system that is calibrated using only the vanishing points extracted from a simple planar surface. Then, an immersive interaction technique is used to manipulate the 3D model. The interaction tool is located in 3D space using a fuzzy technique with the advantages of a low memory usage, real-time operation and low positioning errors as compared to classical solutions. Experimental results including accuracy evaluation of the reconstruction and of the interaction tool are presented. The resulting system can find applications in expression based human-computer interaction, virtual assisted cosmetic surgery, as well as in teleconferencing, being in line with the current trends of intelligent user interfaces.
12

Ruiz, Maria, Idoia Mujika, Arantza Arregi, Pablo Aguirrezabal, David Custodio, Mikel Pajares, and Judit Gómez. "EDUKA: Design and development of an intelligent tutor and author tool for the personalised generation of itineraries and training activities in immersive 3D and 360° educational environments." International Journal of Production Management and Engineering 11, no. 1 (January 31, 2023): 31–42. http://dx.doi.org/10.4995/ijpme.2023.18013.

Abstract:
Nowadays, Virtual and Augmented Reality have begun to be integrated in the educational field for the creation of immersive learning environments. This research presents the results of a project called “EDUKA: Intelligent tutor and author tool for the personalised generation of itineraries and training activities in immersive 3D and 360º educational environments”, funded by the Basque Government (BG) (Economic Development, Sustainability and Environment Department). The project started in April 2018 and was completed in December 2020. Nowadays, an improved version is being developed in a project called IKASNEED, also supported by the BG. The aim of the study was to develop research around a set of latest generation technologies that offer interdependence to educational centres for the adoption, development and integration of Virtual Reality (VR), Augmented Reality (AR) and immersive content technologies in their study plans.
13

Pérez, Gabriel, Xavier F. Rodríguez, Josep María Burgués, Marc Solé, and Julià Coma. "3D immersive learning in architecture and construction areas = Aprendizaje inmersivo 3D en el campo de la arquitectura y construcción." Advances in Building Education 4, no. 2 (November 25, 2020): 9. http://dx.doi.org/10.20868/abe.2020.2.4460.

Abstract:
Beyond the consolidation of BIM technology, the incorporation of virtual reality in the field of architecture and construction represents a new breakthrough with several advantages at both the professional and academic level. With the aim of measuring the potential of this technology, a set of immersive activities was carried out at the University of Lleida in the degree in Technical Architecture and Building. The results showed the great ease with which students learned to use virtual reality tools, and the great potential of these solutions not only for learning architecture and construction but also as a professional working tool, especially in the design phases.
14

Walmsley, A., and T. P. Kersten. "LOW-COST DEVELOPMENT OF AN INTERACTIVE, IMMERSIVE VIRTUAL REALITY EXPERIENCE OF THE HISTORIC CITY MODEL STADE 1620." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W17 (November 29, 2019): 405–11. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w17-405-2019.

Abstract:
As virtual reality and 3D documentation and modelling technologies become increasingly powerful and affordable tools for architecture, planning, and cultural heritage preservation and communication, it has become increasingly important to develop low-cost methodologies for the creation of 3D immersive virtual environments and interactive experiences. Doing so makes this technology more viable for institutions such as museums and other cultural institutions, which often work within strict budgets. In this paper, we describe a workflow used to build an interactive, immersive virtual reality experience around a virtual city model of the town of Stade (Germany) in the year 1620. This virtual city model is based on a physical 3D model of the town, exhibited in the Stade town hall. The workflow begins with the digitization of this model using digital photogrammetry, followed by low- and high-polygon modelling of the individual architectural assets in Autodesk Maya, texture mapping in Substance Painter and finally visualisation within Unreal Engine 4. The result of this workflow is a detailed 3D historical environment with a high degree of realism and to which interactivity can easily be added. In addition, the workflow takes a highly iterative approach that allows the performance of the virtual environments in the game engine to be monitored at each stage of the process, and that allows adjustments to be made quickly. To increase the potential of the virtual environment as a tool for education and communication, interactive elements and simple game mechanics are currently being integrated.
15

Bairamian, David, Shinuo Liu, and Behzad Eftekhar. "Virtual Reality Angiogram vs 3-Dimensional Printed Angiogram as an Educational tool—A Comparative Study." Neurosurgery 85, no. 2 (February 2, 2019): E343–E349. http://dx.doi.org/10.1093/neuros/nyz003.

Abstract:
Background: Three-dimensional (3D) visualization of the neurovascular structures has helped preoperative surgical planning. 3D printed models and virtual reality (VR) devices are 2 options to improve 3D stereovision and stereoscopic depth perception of cerebrovascular anatomy for aneurysm surgery.
Objective: To investigate and compare the practicality and potential of 3D printed and VR models in a neurosurgical education context.
Methods: The VR angiogram was introduced through the development and testing of a VR smartphone app. Ten neurosurgical trainees from Australia and New Zealand participated in a 2-part interactive exercise using three 3D printed and VR angiogram models, followed by a questionnaire about their experience. In a separate exercise to investigate the learning curve effect on VR angiogram application, a qualified neurosurgeon completed 15 exercises involving manipulating VR angiogram models.
Results: The VR angiogram outperformed the 3D printed model in terms of resolution. It had a statistically significant advantage in ability to zoom, resolution, ease of manipulation, model durability, and educational potential. The VR angiogram had a higher questionnaire total score than the 3D models. The 3D printed models had a statistically significant advantage in depth perception and ease of manipulation. The results were independent of trainee year level, sequence of the tests, or anatomy.
Conclusion: In selected cases with challenging cerebrovascular anatomy where stereoscopic depth perception is helpful, the VR angiogram should be considered as a viable alternative to 3D printed models for neurosurgical training and preoperative planning. An immersive virtual environment offers excellent resolution and ability to zoom, potentiating it as an untapped educational tool.
16

Soto-Martin, Ovidia, Alba Fuentes-Porto, and Jorge Martin-Gutierrez. "A Digital Reconstruction of a Historical Building and Virtual Reintegration of Mural Paintings to Create an Interactive and Immersive Experience in Virtual Reality." Applied Sciences 10, no. 2 (January 14, 2020): 597. http://dx.doi.org/10.3390/app10020597.

Abstract:
Nowadays, virtual reality technologies and immersive virtual reality (VR) apps allow people to view, explore, engage with and learn about historic monuments and buildings, historic sites, and even historic scenes. To preserve our cultural heritage for future generations, it is essential that damaged and dilapidated historic artefacts are accurately documented, and that steps are taken to improve user experiences in the areas of virtual visits, science and education. This paper describes an approach to reconstructing and restoring historic buildings and mural paintings. The work process uses digital models that are then inserted into an interactive and immersive VR environment, visualized using Windows Mixed Reality. The work method was applied at a United Nations Educational, Scientific and Cultural Organisation (UNESCO) World Heritage Site in Tenerife (Canary Islands, Spain), thereby creating a virtual three-dimensional (3D) rendering of the architectural structures of the St Augustine Church in La Laguna and its murals. A combination of topography and terrestrial photogrammetry was used to reconstruct its architectural features, and the digital imaging tool DStretch® to recover its murals. The resulting 3D model was then inserted into an immersive and interactive VR environment created using the cross-platform game engine Unity. One of the greatest challenges of this project revolved around recovering the dilapidated and virtually nonexistent mural paintings using DStretch®. The final result is an immersive and interactive VR environment containing architectural and artistic information created within the video game engine Unity, which allows the user to explore, observe and interact with a cultural heritage site in real time.
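DStretch, mentioned above for recovering faded murals, is an implementation of decorrelation stretch, a colour-space transform that exaggerates subtle colour differences. The sketch below shows the general technique in Python with a toy patch; it is not DStretch's actual code, and the target spread value is an arbitrary choice.

```python
import numpy as np

def decorrelation_stretch(image, target_sigma=50.0):
    """Decorrelation stretch of an RGB image (float array).

    Decorrelates the colour channels, equalises their variances and maps the
    result back, which exaggerates subtle colour differences such as faded pigments.
    """
    pixels = image.reshape(-1, 3).astype(float)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Whiten along the principal axes, then give every axis the same spread.
    stretch = eigvecs @ np.diag(target_sigma / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
    out = (pixels - mean) @ stretch.T + mean
    return out.reshape(image.shape)

# Toy 2x2 RGB patch with subtle colour variation
patch = np.array([[[120, 100, 90], [122, 101, 92]],
                  [[119, 99, 91], [121, 102, 90]]], dtype=float)
print(decorrelation_stretch(patch).round(1))
```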
17

Sun, Shaoming, Tianwei Xu, and Juxiang Zhou. "The Design and Implementation of Computer Hardware Assembling Virtual Laboratory in the VR Environment." MATEC Web of Conferences 232 (2018): 01051. http://dx.doi.org/10.1051/matecconf/201823201051.

Abstract:
To address the problems of slowly updated laboratory equipment, heavy hardware wear and potential danger in traditional computer hardware assembly experiments, this article proposes the design and concrete implementation of a virtual laboratory for computer hardware assembly based on an immersive VR environment. Using 3ds Max and Unity3D to create the 3D models and build the scene, and the Mojing SDK for the VR display and the interactive interface, the virtual laboratory application was designed and implemented for mobile devices. Current virtual laboratories based on Virtools can only run in a Windows environment, and this design overcomes that limitation. In addition, with rich scenes and tutorials, the design combines a head-mounted display and a somatosensory controller to greatly promote immersion and interactivity, thus enhancing students' interest. This is a new attempt to improve the effect of traditional experimental teaching.
18

Jorgensen, Craig, Jeff Ogden, E. Kerry Willis, Mary Blessing, Kathryn Ann Caudell, Graham Patrick, and Thomas P. Caudell. "Locomotion in a Virtual Environment: Performance Measures and Physiological Responses." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 41, no. 2 (October 1997): 1148–51. http://dx.doi.org/10.1177/107118139704100294.

Abstract:
We are developing and studying a human-computer interface that allows a scientist to interact with complex software systems through immersive virtual reality technology. Virtual tools are being developed to empower the user to view, manipulate, model, diagnose, analyze, navigate through the software simulations and the multidimensional data it generates. For these tools to be truly effective, they must be evaluated in the context of human performance studies. This paper addresses one such category of tool: an efficient and natural means of locomotion in 3D virtual spaces. The current study is investigating three different methods of virtual body locomotion in the following context. Subjects were required to fly through a series of 3D tunnels while performance times and number of wall collisions were used as response measures. In addition, during each trial subjects were continuously monitored for physiological responses and psychological assessments were performed before and after the series of flights. This paper reports on the preliminary findings of the current study and the lessons learned in performing empirical studies on locomotion in virtual environments.
19

Gumonan, Kenn Migan Vincent, and Aleta Fabregas. "ASIAVR: Asian Studies Virtual Reality Game a Learning Tool." International Journal of Computing Sciences Research 5, no. 1 (January 1, 2021): 475–88. http://dx.doi.org/10.25147/ijcsr.2017.001.1.53.

Abstract:
Purpose: The study aims to develop an application that will serve as an alternative learning tool for learning Asian Studies. The delivery of lessons into a virtual reality game depends on the pace of students. The developed application comprises several more features that enable users to get valuable information from an immersive environment.
Method: The researchers used Rapid Application Development (RAD) in developing the application. It follows phases such as requirement planning, user design, construction, and cutover. Two sets of questionnaires were developed, one for the teachers and another for the students. Then, testing and evaluation were conducted through purposive sampling to select the respondents.
Results: The application was overall rated as 3.56, which is verbally interpreted as very good. The result was based on the system evaluation using ISO 9126 in terms of functionality, usability, content, reliability, and performance.
Conclusion: The developed application meets the objectives to provide an alternative learning tool for learning Asian Studies. The application is well commended and accepted by the end-users to provide an interactive and immersive environment for students to learn at their own pace.
Recommendations: Further enhancement of the audio, gameplay, and graphics of the tool. Schools should take into consideration the adoption of the Asian Studies Virtual Reality as a good alternative tool for their teachers and students to teach and learn Asian Studies. The use of more 3D objects relevant to the given information to enhance game experience may be considered. A databank for the quiz questions that will be loaded into the game should also be considered.
Research Implications: The integration of modern technology in education has been a vital part of the learning process, especially when technological resources are available. Development and adaptation of this application will promote an alternative way of independent learning among students and will give them a better understanding of Asian Studies at their own pace.
20

Virtanen, Juho-Pekka, Kaisa Jaalama, Tuulia Puustinen, Arttu Julin, Juha Hyyppä, and Hannu Hyyppä. "Near Real-Time Semantic View Analysis of 3D City Models in Web Browser." ISPRS International Journal of Geo-Information 10, no. 3 (March 4, 2021): 138. http://dx.doi.org/10.3390/ijgi10030138.

Abstract:
3D city models and their browser-based applications have become an increasingly applied tool in cities. One of their applications is the analysis of views and visibility, applicable to property valuation and the evaluation of urban green infrastructure. We present a near real-time semantic view analysis relying on a 3D city model, implemented in a web browser. The analysis is tested in two alternative use cases: property valuation and evaluation of the urban green infrastructure. The results describe the elements visible from a given location and can also be applied to object-type-specific analysis, such as green view index estimation, with the main benefit being the freedom of choosing the point of view obtained with the 3D model. Several promising development directions can be identified based on the current implementation and experiment results, including the integration of the semantic view analysis with virtual reality immersive visualization or with 3D city model application development platforms.
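The green view index mentioned above is typically estimated as the share of a rendered view covered by vegetation. Assuming the semantic view analysis yields a per-pixel label image, a minimal sketch of that estimate follows; the label codes and toy view are invented for illustration.

```python
import numpy as np

VEGETATION_LABELS = {3, 4}        # e.g. 3 = tree, 4 = grass (assumed label codes)

def green_view_index(label_image):
    """Fraction of visible pixels labelled as vegetation in a rendered semantic view."""
    labels = np.asarray(label_image)
    green = np.isin(labels, list(VEGETATION_LABELS)).sum()
    return green / labels.size

# Toy 2x4 semantic rendering: 1 = building, 2 = road, 3 = tree, 4 = grass
view = [[1, 3, 3, 2],
        [4, 1, 2, 3]]
print(green_view_index(view))     # 4 vegetation pixels out of 8 -> 0.5
```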
21

Sancho, Jaime, Pallab Sutradhar, Gonzalo Rosa, Miguel Chavarrías, Angel Perez-Nuñez, Rubén Salvador, Alfonso Lagares, Eduardo Juárez, and César Sanz. "GoRG: Towards a GPU-Accelerated Multiview Hyperspectral Depth Estimation Tool for Medical Applications." Sensors 21, no. 12 (June 14, 2021): 4091. http://dx.doi.org/10.3390/s21124091.

Abstract:
HyperSpectral (HS) images have been successfully used for brain tumor boundary detection during resection operations. Nowadays, these classification maps coexist with other technologies such as MRI or IOUS that improve a neurosurgeon's action, and their incorporation is a neurosurgeon's task. The project in which this work is framed generates a unified and more accurate 3D immersive model using HS, MRI, and IOUS information. To do so, the HS images need to include 3D information, and it needs to be generated under real-time operating-room conditions, i.e., within a few seconds. This work presents Graph cuts Reference depth estimation in GPU (GoRG), a GPU-accelerated multiview depth estimation tool for HS images that is also able to process YUV images in less than 5.5 s on average. Compared to a high-quality state-of-the-art algorithm, MPEG DERS, GoRG YUV obtains quality losses of −0.93 dB, −0.6 dB, and −1.96% for WS-PSNR, IV-PSNR, and VMAF, respectively, using a video synthesis processing chain. For HS test images, GoRG obtains an average RMSE of 7.5 cm, with most of its errors in the background, and needs around 850 ms to process one frame and view. These results demonstrate the feasibility of using GoRG during a tumor resection operation.
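The RMSE figure quoted above (7.5 cm) is a standard error measure between an estimated depth map and a reference one. A minimal sketch of how such a value is computed, with invented toy depth values and an optional valid-pixel mask:

```python
import numpy as np

def depth_rmse(estimated, reference, valid_mask=None):
    """Root-mean-square error between two depth maps, optionally over valid pixels only."""
    est = np.asarray(estimated, float)
    ref = np.asarray(reference, float)
    if valid_mask is None:
        valid_mask = np.ones_like(ref, dtype=bool)
    diff = est[valid_mask] - ref[valid_mask]
    return float(np.sqrt(np.mean(diff ** 2)))

est = [[1.075, 2.075], [3.075, 4.075]]   # estimated depths in metres (toy values)
ref = [[1.000, 2.000], [3.000, 4.000]]   # reference depths in metres (toy values)
print(depth_rmse(est, ref))              # 0.075 m, i.e. the 7.5 cm scale reported above
```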
22

Grznár, Patrik, Štefan Mozol, Gabriela Gabajová, and Lucia Mozolová. "APPLICATION OF VIRTUAL REALITY IN THE DESIGN OF PRODUCTION SYSTEMS AND TEACHING." Acta Tecnología 7, no. 2 (June 30, 2021): 67–70. http://dx.doi.org/10.22306/atec.v7i2.110.

Abstract:
This contribution describes the use of virtual reality in the design of production systems and in teaching. The aim of the article is to describe a methodology for the 3D virtual design of production systems, as well as the creation of interconnections between the simulation tools Siemens Tecnomatix, Simio, AutoCAD and Dassault Systèmes software using the moreViz tool. The benefit in the design of production systems lies precisely in the ability to view the projected space at a scale of 1:1, which allows maximum use of the space and easier orientation within it. The benefit in teaching lies in a better understanding of the elements that can be displayed in fully immersive virtual reality, which helps students learn faster and engages their spatial imagination.
23

Ayerbe, Victor Manuel Caicedo, Martha Lucía Velasco Morales, Carlos Javier Latorre Rojas, and María Lucía Arango Cortés. "Visualization of 3D Models Through Virtual Reality in the Planning of Congenital Cardiothoracic Anomalies Correction: An Initial Experience." World Journal for Pediatric and Congenital Heart Surgery 11, no. 5 (August 27, 2020): 627–29. http://dx.doi.org/10.1177/2150135120923618.

Abstract:
We present the case of a nine-year-old girl with double outlet right ventricle with noncommitted ventricular septal defect and malposition of the great arteries who had undergone repair at the age of seven months. Six years later, the patient presented with right ventricular failure, conduit calcification with obstruction, and obstruction of the left ventricular outflow tract. Three-dimensional models reconstructed from Digital Imaging and Communications in Medicine (DICOM) images of the patient were visualized in a virtual reality system to help plan the surgical correction of the intracardiac congenital anomalies. This tool allowed us to inspect the intracardiac anatomy in an immersive environment with a clearer sense of perspective.
24

Sines, P., and B. Das. "VRSEMLAB: A Low Cost Virtual Reality System to Illustrate Complex Concepts Involving Spatial Relationships." International Journal of Virtual Reality 5, no. 1 (January 1, 2001): 157–66. http://dx.doi.org/10.20870/ijvr.2001.5.1.2677.

Abstract:
Virtual Reality (VR) is a powerful education/training tool, and its interactive immersive environment is particularly suitable for the illustration of complex concepts involving spatial relationships. Although many virtual reality applications have been developed for education and training, the widespread use of this technology has been limited primarily by the high cost of VR systems. We have developed VRSEMLAB, a low-cost VR application to teach the complex subject of semiconductor device physics to undergraduate students. VRSEMLAB demonstrates an effective method for using virtual reality to model complex 3D systems in an experimental environment free of concerns about physical safety, equipment limitations, and size limitations.
25

Yang, Yu-Sheng, Alicia M. Koontz, Yu-Hsuan Hsiao, Cheng-Tang Pan, and Jyh-Jong Chang. "Assessment of Wheelchair Propulsion Performance in an Immersive Virtual Reality Simulator." International Journal of Environmental Research and Public Health 18, no. 15 (July 29, 2021): 8016. http://dx.doi.org/10.3390/ijerph18158016.

Abstract:
Maneuvering a wheelchair is an important necessity for the everyday life and social activities of people with a range of physical disabilities. However, in real life, wheelchair users face several common challenges: articulate steering, spatial relationships, and negotiating obstacles. Therefore, our research group has developed a head-mounted display (HMD)-based intuitive virtual reality (VR) simulator for wheelchair propulsion. The aim of this study was to investigate the feasibility and efficacy of this VR simulator for wheelchair propulsion performance. Twenty manual wheelchair users (16 men and 4 women) with spinal cord injuries ranging from T8 to L2 participated in this study. The differences in wheelchair propulsion kinematics between immersive and non-immersive VR environments were assessed using a 3D motion analysis system. Subjective data on the HMD-based intuitive VR simulator were collected with a Presence Questionnaire and individual semi-structured interviews at the end of the trial. Results indicated that propulsion performance was very similar in terms of start angle (p = 0.34), end angle (p = 0.46), stroke angle (p = 0.76), and shoulder movement (p = 0.66) between immersive and non-immersive VR environments. In the VR episode featuring an uphill journey, an increase in propulsion speed (p < 0.01) and cadence (p < 0.01) was found, as well as a greater trunk forward inclination (p = 0.01). Qualitative interviews showed that this VR simulator made an attractive, novel impression and therefore demonstrated potential as a tool for stimulating training motivation. This HMD-based intuitive VR simulator can be an effective resource to enhance wheelchair maneuverability experiences.
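For context, the propulsion measures compared above are derived from the hand-rim contact geometry: the stroke angle is typically the arc between the start (contact) and end (release) angles, and cadence is the number of pushes per minute. A small sketch of those derivations, with invented sample values:

```python
def stroke_angle(start_deg, end_deg):
    """Arc of hand-rim contact covered by one push, in degrees."""
    return end_deg - start_deg

def cadence(push_start_times_s):
    """Pushes per minute, from the timestamps (s) at which each push starts."""
    if len(push_start_times_s) < 2:
        return 0.0
    duration = push_start_times_s[-1] - push_start_times_s[0]
    return (len(push_start_times_s) - 1) * 60.0 / duration

# Toy push: contact at -15 deg (top dead centre = 0), release at 70 deg
print(stroke_angle(-15.0, 70.0))            # 85.0 degrees
# Toy cadence: pushes starting at 0, 1.2, 2.4, 3.6 s -> 50 pushes per minute
print(cadence([0.0, 1.2, 2.4, 3.6]))
```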
26

Uhrík, Martin, Alexander Kupko, Michaela Krpalová, and Roman Hajtmanek. "Augmented reality and tangible user interfaces as an extension of computational design tools." Architecture Papers of the Faculty of Architecture and Design STU 27, no. 4 (December 1, 2022): 18–27. http://dx.doi.org/10.2478/alfa-2022-0021.

Abstract:
The paper envisions the use of Augmented Reality (AR) as an interactive communication tool in architectural design research, education, and practice. It summarises the current knowledge and various applications of this immersive technology in both the theoretical and practical fields, and focuses on a particular type of AR implementation, tangible user interfaces (TUI), in a computational design context. The outcome of the research is an adaptation of the originally GRASS-GIS-powered Tangible Landscape tool into the Grasshopper 3D environment, which is more accurate and better suited to the architectural design workflow with respect to 3D computation, algorithmic modelling and the management of different scales. The newly prototyped tool reacts to modifications of the physical model and projects the computed additional information onto it in real time; it can thus communicate with the designer or observer, resulting in a more interactive, haptic man-machine interface. The data projected and visualised on the physical model are the outcome of a computing algorithm designed in Grasshopper that allows for a wide range of applications, including the visualisation of shadows and solar potential analysis, and thus depicts the physical model in multiple dimensions. Furthermore, the article discusses the potential and further development of this tool as well as the possibilities of layering different AR technologies in subsequent research.
27

Alaniz Uribe, Francisco, and Bram Van der Heijden. "Virtual Density." Canadian Planning and Policy / Aménagement et politique au Canada 2022 (August 31, 2022): 49–70. http://dx.doi.org/10.24908/cppapc.v2022i1.15440.

Abstract:
The process of densification in existing communities is complex and often encounters resistance. Public engagement is a crucial component of this process and requires appropriate visual and spatial communication tools. This pilot project explored the use of Virtual Reality (VR) as a spatial communication tool that combines CAD graphics with new visualization technology to provide the public with an immersive experience in a virtual environment. Using a pair of VR goggles and a digital 3D model, different scenarios were presented to members of the public to test their perception of various density models. Three density scenarios were presented to the public, both in the form of traditional posters and through a headset and a VR computer model. We found that the public were able to better understand the scenarios and were more accepting of densification while visualizing the proposed density scenarios via the VR interface.
28

Belaroussi, Rachid, Huiying Dai, Elena Díaz González, and Jorge Martín Gutiérrez. "Designing a Large-Scale Immersive Visit in Architecture, Engineering, and Construction." Applied Sciences 13, no. 5 (February 27, 2023): 3044. http://dx.doi.org/10.3390/app13053044.

Abstract:
Throughout history, tools for engineering in the building industry have evolved. With the arrival of Industry 4.0, Computer-Aided Design (CAD) and Building Information Modeling (BIM) software have replaced pens, pencils, and paper in the design process. This paper describes the work required to design a large-scale immersive visit of a district under construction in a suburban area of Greater Paris, France. As part of this real estate project, called LaVallée, we had access to its city information model: all the BIMs of the works to be carried out, including roads, terrain, street furniture, fountains, and landscaping. The paper describes all the technical operations necessary for the design of an immersive 3D model of the neighbourhood and its surroundings with a high level of detail. The objective of this technical report was to provide practitioners with feedback on such an achievement based on industrial-level data. The development of the city model begins with the registration of all the BIMs from different firms in a common Geographic Information System, which gives the opportunity to confront the operational requirements of a construction phase with the current practice of architecture firms. A first prototype was developed using the archviz tool Twinmotion. To increase the realism of the model, we describe the creation of a pipeline in Unreal Engine with automated tasks for material and mesh replacement and for lighting and landscape configuration. The main contribution of this work is to share relevant experience of building such a large-scale model, with Python scripts where possible as well as the necessary manual steps. It is a valuable contribution to the making of large-scale immersive visits with a high level of detail and to understanding their requirements.
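The automated material replacement described above is essentially a lookup-table pass over the imported assets: each placeholder material name from the BIM export is mapped to a higher-quality engine asset. The sketch below shows only that mapping step in plain Python with invented names and paths; the actual pipeline drives the Unreal Engine editor, whose API is not reproduced here.

```python
# Placeholder-to-engine-asset substitution table (names and paths are invented).
MATERIAL_MAP = {
    "BIM_Concrete_Generic": "/Game/Materials/M_Concrete_Worn",
    "BIM_Glass_Generic":    "/Game/Materials/M_Glass_Clear",
}

def plan_replacements(scene_objects, material_map):
    """List which objects would get which replacement material (dry run)."""
    plan, missing = [], []
    for obj_name, material_name in scene_objects:
        if material_name in material_map:
            plan.append((obj_name, material_map[material_name]))
        else:
            missing.append((obj_name, material_name))   # left for a manual step
    return plan, missing

objects = [("Facade_01", "BIM_Concrete_Generic"),
           ("Window_12", "BIM_Glass_Generic"),
           ("Fountain_03", "BIM_Marble_Generic")]
plan, missing = plan_replacements(objects, MATERIAL_MAP)
print(plan)     # two automatic replacements
print(missing)  # one asset still needing manual treatment
```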
29

Wang, Mingming, Anjali Jogeshwar, Gabriel J. Diaz, Jeff B. Pelz, and Susan Farnand. "Demonstration of a Virtual Reality Driving Simulation Platform." Electronic Imaging 2020, no. 9 (January 26, 2020): 39–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.9.iqsp-039.

Abstract:
A virtual reality (VR) driving simulation platform has been built for use in addressing multiple research interests. The platform is based on a 3D VR engine (Unity) and provides an immersive driving experience viewed in an HTC Vive head-mounted display (HMD). To test this platform, we designed a virtual driving scenario based on a real tunnel used by Törnros to perform on-road tests [1]. Data from the platform, including driving speed and lateral lane position, were compared with the published on-road tests. The correspondence between the driving simulation and the on-road tests is assessed to demonstrate the ability of our platform as a research tool. In addition, the drivers' eye movement data, such as the 3D gaze point of regard (POR), will be collected during the test with a Tobii eye tracker integrated in the HMD. The data set will be analyzed offline and examined for correlations with driving behaviors in a future study.
30

Martín-Sacristán, David, Carlos Herranz, Jose F. Monserrat, Andrzej Szczygiel, Nandish P. Kuruvatti, David Garcia-Roger, Danaisy Prado, Mauro Boldi, Jakob Belschner, and Hans D. Schotten. "5G Visualization: The METIS-II Project Approach." Mobile Information Systems 2018 (September 24, 2018): 1–8. http://dx.doi.org/10.1155/2018/2084950.

Abstract:
One of the main objectives of the METIS-II project was to enable 5G concepts to reach and convince a wide audience from technology experts to decision makers from non-ICT industries. To achieve this objective, it was necessary to provide easy-to-understand and insightful visualization of 5G. This paper presents the visualization platform developed in the METIS-II project as a joint work of researchers and artists, which is a 3D visualization tool that allows viewers to interact with 5G-enabled scenarios, while permitting simulation driven data to be intuitively evaluated. The platform is a game-based customizable tool that allows a rapid integration of new concepts, allows real-time interaction with remote 5G simulators, and provides a virtual reality-based immersive user experience. As a result, the METIS-II visualization platform has successfully contributed to the dissemination of 5G in different fora and its use will be continued after METIS-II.
31

Ciasullo, Alessandro. "Universal Design for Learning: the relationship between subjective simulation, virtual environments, and inclusive education." Research on Education and Media 10, no. 1 (June 1, 2018): 42–48. http://dx.doi.org/10.1515/rem-2018-0006.

Abstract:
The universality of educational activities must accommodate the universality of the learners themselves and their physical, mental, belief, racial and religious differences. The possibilities offered by immersive education, built on stimuli aimed at transformation through virtual environments and 3D 360° tools, constitute instruments that better fit the empathic and neurocognitive characteristics of learners, and therefore the substratum on which the cultural dimension is built; this is where it intersects with Universal Design for Learning. Moving towards the organization of modular three-dimensional virtual environments responds to the needs connected to the learner's formation and constitutes a remarkable integrative and inclusive tool for explaining the implicit processes of knowledge. We intend to start an experimental phase of study to verify the possibilities offered by this integration, using the Federico 3DSU virtual platform to create virtual environments that incorporate the nine guidelines published by CAST in 2008, and to examine their pedagogical-didactic possibilities.
32

Evangelidis, Konstantinos, Stella Sylaiou, and Theofilos Papadopoulos. "Mergin’ Mode: Mixed Reality and Geoinformatics for Monument Demonstration." Applied Sciences 10, no. 11 (May 31, 2020): 3826. http://dx.doi.org/10.3390/app10113826.

Abstract:
Since smart devices are becoming the primary technological means for daily human activities related to user-location, location-based services constitute a crucial component of the related smart applications. Meanwhile, traditional geospatial tools such as geographic information systems (GIS) in conjunction with photogrammetric techniques and 3D visualization frameworks can achieve immersive virtual reality over custom virtual geospatial worlds. In such environments, 3D scenes with virtual beings and monuments with the assistance of storytelling techniques may reconstruct historical sites and “revive” historical events. Boosting of Internet and wireless network speeds and mixed reality (MR) capabilities generate great opportunities for the development of location-based smart applications with cultural heritage content. This paper presents the MR authoring tool of “Mergin’ Mode” project, aimed at monument demonstration through the merging of the real with the virtual, assisted by geoinformatics technologies. The project does not aim at simply producing an MR solution, but more importantly, an open source platform that relies on location-based data and services, exploiting geospatial functionalities. In the long term, it aspires to contribute to the development of open cultural data repositories and the incorporation of cultural data in location-based services and smart guides, to enable the web of open cultural data, thereby adding extra value to the existing cultural-tourism ecosystem.
33

Agudelo-Vélez, Laura, Iván Sarmiento-Ordosgoitia, and Jorge Córdoba-Maquilón. "Virtual reality as a new tool for transport data collection." Archives of Transport 60, no. 4 (December 31, 2021): 23–38. http://dx.doi.org/10.5604/01.3001.0015.5392.

Abstract:
Transport studies that adopt complex analyses present methodological challenges that lead to the use of innovative techniques to address the limitations of traditional methods. In the Latin American context, people consider security as a relevant variable in their daily lives. Thus, when people travel around the city and choose a mode of transport, security becomes an important factor and should therefore be included in transport studies. However, the definition of security in terms of transport in the Colombian context remains unclear. Therefore, we examined the security perception effect on transport mode choice by addressing security as a latent variable consisting of three elements: environment, subject and transport mode. We proposed the use of virtual reality (VR) to recreate travel routes and offer participants a scenario of choice closer to the natural conditions of a trip. The participants were provided routes in the form of immersive 3D videos recreating natural trip conditions to identify their choices and travel behaviour. Recordings were made of daily scenarios and existing urban environments portraying real and active modes of transport, giving respondents an almost-natural experience. The use of 360-degree immersive videos offers a multisensory experience allowing both the capture of socioeconomic and travel information and the collection of journey perception. The experiment evaluated two environments in Medellín, Colombia (secure (E1) and insecure (E2)) and studied the effects of lighting conditions (day (D) and night (N)). A total of four videos (E1D, E1N, E2D and E2N) depicting six transport modes in tandem were assessed by 92 participants from Medellín and Bogotá, Colombia. We found that environment-associated security perception varies depending on the time of the journey (day/night) and one's familiarity with the environment. The research results position VR as a tool that offers high potential to support transport studies. We found that people's movements, gestures and expressions while participating in the VR experiments resembled what was expected from journeys in reality. VR constitutes a relevant tool for transport studies, as it allows for an assessment of active transport mode perceptions. It prevents participants from being exposed to the risk associated with travel to specific places and carries out several routes in different transport modes even when participants cannot or have never undertaken journeys in the modes under assessment.
APA, Harvard, Vancouver, ISO, and other styles
34

Riva, Giuseppe, Luca Melis, and Mirco Bolzoni. "Virtual Reality for Assessing Body Image: The Body Image Virtual Reality Scale (Bivrs)." International Journal of Virtual Reality 2, no. 4 (January 1, 1996): 1–11. http://dx.doi.org/10.20870/ijvr.1996.2.4.2613.

Full text
Abstract:
BIVRS, the Body Image Virtual Reality Scale, is a prototype of a diagnostic freeware tool designed to assess the cognitive and affective components of body image. It consists of a non-immersive 3D graphical interface through which the patient chooses among 7 figures that vary in size from underweight to overweight. The software was developed in two architectures, the first (A) running on a single-user desktop computer equipped with standard virtual reality development software, and the second (B) split into a server (B1) accessible via the Internet and running the same virtual environment as in (A), and a VRML client (B2), so that anyone can access the application. The importance of a virtual-reality-based body image scale lies in the possibility of rapidly testing one's perceived body image in better and different ways. It also provides an opportunity to easily develop a trans-cultural database of body image data. Furthermore, the use of 3D can improve the effectiveness of the test because it is easier for the subject to perceive the differences between the various proposed silhouettes.
APA, Harvard, Vancouver, ISO, and other styles
35

Alrige, Mayda, Hind Bitar, Waad Al-Suraihi, Kholoud Bawazeer, and Ekram Al-Hazmi. "MicroWorld: An Augmented-Reality Arabian App to Learn Atomic Space." Technologies 9, no. 3 (July 24, 2021): 53. http://dx.doi.org/10.3390/technologies9030053.

Full text
Abstract:
The visualization of objects of an abstract nature has always been a challenge for chemistry learners. Thus, augmented reality (AR) and virtual reality (VR) have been heavily invested in as immersive learning methods for these concepts. This study targets the segment of the chemistry curriculum involving the chemical elements of the periodic table. For this purpose, we developed the AR educational tool called MicroWorld. This Arabic educational AR app was developed in Unity with the Vuforia SDK. Using MicroWorld, students can visualize chemical elements' microstructures in 3D, see 3D models of the elements in their substantial forms, and combine two chemical elements to see how certain chemical compounds can be formed. In this work, MicroWorld's usability was evaluated by junior high school students and chemistry teachers using the Arabic System Usability Scale (A-SUS). The A-SUS average score was 71.5 for junior high school students, while the score for teachers reached 76. This research aims to design, develop, and evaluate the AR app MicroWorld, which was built and evaluated through the lens of the design science research paradigm.
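For readers unfamiliar with how scores such as 71.5 and 76 arise, the sketch below shows the standard System Usability Scale arithmetic in Python, assuming A-SUS retains the usual SUS scoring; the example responses are hypothetical and not taken from the study.

def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert items mapped to a 0-100 scale.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    contributions = [
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(responses, start=1)
    ]
    return sum(contributions) * 2.5

# Hypothetical responses from a single participant
print(sus_score([4, 2, 4, 2, 4, 3, 4, 2, 3, 2]))  # -> 70.0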
APA, Harvard, Vancouver, ISO, and other styles
36

Baker, Steven J., Jenny Waycott, Jeni Warburton, and Frances Batchelor. "THE HIGHWAY OF LIFE: SOCIAL VIRTUAL REALITY AS A REMINISCENCE TOOL." Innovation in Aging 3, Supplement_1 (November 2019): S306. http://dx.doi.org/10.1093/geroni/igz038.1121.

Full text
Abstract:
A large body of research demonstrates the positive impact that reminiscence activities can have on older adult wellbeing. Within this space, researchers have begun to explore how virtual reality (VR) technology might be used as a reminiscence tool. The immersive characteristics of VR could aid reminiscence by giving the sense of being fully present in a virtual environment that evokes the time being explored in the reminiscence session. However, to date, research into the use of VR as a reminiscence tool has overwhelmingly focussed on static environments that can only be viewed by a single user. This paper reports on a first-of-its-kind research project that used social VR (multiple users co-present in a single virtual environment) and 3D representations of personal artifacts (such as photographs and recorded anecdotes) to allow a group of older adults to reminisce about their school experiences. Sixteen older adults aged 70-81 participated in a four-month user study, meeting in groups with a facilitator in a social virtual world called the Highway of Life. Results demonstrate how the social experience, tailored environment, and personal artifacts that were features of the social VR environment allowed the older adults to collaboratively reminisce about their school days. We conclude by considering the benefits and challenges associated with using social VR as a reminiscence tool with older adults.
APA, Harvard, Vancouver, ISO, and other styles
37

Bailenson, Jeremy N., Alexandra Davies, Jim Blascovich, Andrew C. Beall, Cade McCall, and Rosanna E. Guadagno. "The Effects of Witness Viewpoint Distance, Angle, and Choice on Eyewitness Accuracy in Police Lineups Conducted in Immersive Virtual Environments." Presence: Teleoperators and Virtual Environments 17, no. 3 (June 1, 2008): 242–55. http://dx.doi.org/10.1162/pres.17.3.242.

Full text
Abstract:
The current study investigated the value of using immersive virtual environment technology as a tool for assessing eyewitness identification. Participants witnessed a staged crime and then examined sequential lineups within immersive virtual environments that contained 3D virtual busts of the suspect and six distractors. Participants either had unlimited viewpoints of the busts in terms of angle and distance, or a unitary view at only a single angle and distance. Furthermore, participants either were allowed to choose the angle and distance of the viewpoints they received, or were given viewpoints without choice. Results demonstrated that unlimited viewpoints improved accuracy in suspect-present lineups but not in suspect-absent lineups. Furthermore, across conditions, post-hoc measurements demonstrated that when the chosen view of the suspect during the lineup was similar to the view during the staged crime in terms of distance, accuracy improved. Finally, participants were more accurate in suspect-absent lineups than in suspect-present lineups. Implications of the findings in terms of theories of eyewitness testimony are discussed, as well as the value of using virtual lineups that elicit high levels of presence in the field. We conclude that digital avatars of higher fidelity may be necessary before actually implementing virtual lineups.
APA, Harvard, Vancouver, ISO, and other styles
38

Bertoldi, Stefano. "C.A.P.I. Project in the Making: 3D Applications at Poggio Imperiale Between Materiality and Virtual Reality (Poggibonsi, IT)." Open Archaeology 7, no. 1 (January 1, 2021): 1444–57. http://dx.doi.org/10.1515/opar-2020-0201.

Full text
Abstract:
The archaeological project on the hill of Poggio Imperiale began in 1992. From the beginning this project was characterized by intense experimentation with a range of IT applications. During 2014, the University of Siena began a new project focused on the valorisation of archaeological data with the creation of an Open-Air Museum of the Carolingian village, one of the archaeological phases of the settlement. Over the last several years, the use of three-dimensional (3D) data in archaeology has increased exponentially due to the application of photogrammetry to record every stratigraphic unit. This ever-increasing amount of data fostered the development of the C.A.P.I. project (Collina Accessibile di Poggio Imperiale – Accessibility of the Hill of Poggio Imperiale), which involved the construction of a 3D model of the archaeological area of Poggio Imperiale. The project modeled the three main life stages of the hill using 3D computer graphics. Virtual tours can be experienced through PCs, tablets, smartphones, and even virtual reality headsets, offering users a fully immersive experience. However, virtual reality will not be a replacement for the materiality of the archaeological site. On the contrary, it will provide an additional tool to make the site accessible and inclusive to any potential visitor, regardless of physical distance, physical ability, or time zone.
APA, Harvard, Vancouver, ISO, and other styles
39

Jarsaillon, Pierre J., Naohisa Sakamoto, and Akira Kageyama. "Flexible visualization framework for head-mounted display with gesture interaction interface." International Journal of Modeling, Simulation, and Scientific Computing 09, no. 03 (May 24, 2018): 1840002. http://dx.doi.org/10.1142/s1793962318400020.

Full text
Abstract:
As new visualization systems within the field of simulation offer users more insight into their simulations, immersive systems are becoming part of the standard set of visualization techniques. With recent advancements in Head-Mounted Displays (HMDs) and the popularity of motion sensors, interaction between humans and computers has become richer. This study aims to evaluate the potential of such systems as a visualization tool through the development of a new flexible framework for visualization within a virtual reality (VR) environment, using an Oculus Rift and a Leap Motion. Two approaches are then compared: an approach based on high-quality 3D object rendering within the virtual scene, and a user-experience-oriented system with an intuitive interface. To assess the quality of the interface and its relevance for the user, different types of gestures are implemented and tested. Based on a user experiment evaluating the developed system as a visualization tool, HMDs paired with a motion sensor to form a gesture-controlled interface appear to be a promising medium, despite various development constraints arising from current technology limitations.
APA, Harvard, Vancouver, ISO, and other styles
40

Chan, Michael, Alvaro Uribe-Quevedo, Bill Kapralos, Michael Jenkin, Norman Jaimes, and Kamen Kanev. "Virtual and Augmented Reality Direct Ophthalmoscopy Tool: A Comparison between Interactions Methods." Multimodal Technologies and Interaction 5, no. 11 (October 22, 2021): 66. http://dx.doi.org/10.3390/mti5110066.

Full text
Abstract:
Direct ophthalmoscopy (DO) is a medical procedure whereby a health professional, using a direct ophthalmoscope, examines the eye fundus. DO skills are in decline due to the use of interactive diagnostic equipment and insufficient practice with the direct ophthalmoscope. To address the loss of DO skills, physical and computer-based simulators have been developed to offer additional training. Among the computer-based simulations, virtual and augmented reality (VR and AR, respectively) allow simulated immersive and interactive scenarios with eye fundus conditions that are difficult to replicate in the classroom. VR and AR require employing 3D user interfaces (3DUIs) to perform the virtual eye examination. Using a combination of a between-subjects and within-subjects paradigm with two groups of five participants, this paper builds upon a previous preliminary usability study that compared the HTC Vive controller, the Valve Index controller, and Microsoft HoloLens 1 hand gesticulation as interaction methods for performing a virtual direct ophthalmoscopy eye examination. The work described in this paper extends our prior work by considering interactions with the Oculus Quest controller and the Oculus Quest hand-tracking system to perform a virtual direct ophthalmoscopy eye examination, allowing us to compare these methods with our prior interaction techniques. Ultimately, this helps us develop a greater understanding of usability effects for virtual DO examinations and virtual reality in general. Although the number of participants was limited given the COVID-19 restrictions (n = 5 for Stage 1, which included the HTC Vive controller, the Valve Index controller, and Microsoft HoloLens hand gesticulations, and n = 13 for Stage 2, which included the Oculus Quest controller and Oculus Quest hand tracking), our initial results comparing VR and AR 3D user interactions for direct ophthalmoscopy are consistent with our previous preliminary study: the physical controllers resulted in higher usability scores, while the Oculus Quest's more accurate hand motion capture resulted in higher usability when compared to the Microsoft HoloLens hand gesticulation.
APA, Harvard, Vancouver, ISO, and other styles
41

Hernández-Chávez, Macaria, José M. Cortés-Caballero, Ángel A. Pérez-Martínez, Luis F. Hernández-Quintanar, Karen Roa-Tort, Josué D. Rivera-Fernández, and Diego A. Fabila-Bustos. "Development of Virtual Reality Automotive Lab for Training in Engineering Students." Sustainability 13, no. 17 (August 31, 2021): 9776. http://dx.doi.org/10.3390/su13179776.

Full text
Abstract:
A Virtual Reality application was developed as an immersive virtual learning strategy using Oculus Rift S Virtual Reality glasses and Leap Motion Controller™ infrared sensors, focused on students of the Automotive Systems Engineering academic program as a practical teaching-learning tool in the context of Education 4.0 and the COVID-19 pandemic, which has kept schools closed since March 2020. The technological pillars of Industry 4.0 were used to profile students so that they can meet the demands of their professional performance at the industrial level. Virtual Reality (VR) plays a very important role in the production-engineering sector, in areas such as design and autonomous cars, as well as in training and driving courses. The VR application provides the student with a more immersive and interactive experience, supported by 3D models of the main parts that make up the four-stroke combustion engine and of the mechanical workshop scenario. It allows the student to manipulate the main parts of the four-stroke combustion engine through the Oculus Rift S controls and the Leap Motion Controller™ infrared sensors and to relate them to the operation of the engine, through an animation of its operation and the additional information shown in the application for each part.
APA, Harvard, Vancouver, ISO, and other styles
42

Mather, Lisa Ward, and Pamela Robinson. "Civic Crafting in Urban Planning Public Consultation." International Journal of E-Planning Research 5, no. 3 (July 2016): 42–58. http://dx.doi.org/10.4018/ijepr.2016070104.

Full text
Abstract:
Minecraft is a popular video game that allows players to interact with a 3D environment. Users report that it is easy to learn and understand, is engaging and immersive, and is adaptable. Outside North America it has been piloted for urban planning public consultation processes. However, this game has not yet been studied to determine how and whether it could be used for this purpose. Using key informant interviews, this study asked practicing urban planners to assess Minecraft's potential. Key findings address Minecraft's usefulness as a visualization tool, its role in building public trust in local planning processes, the place of play in planning, and the challenges associated with its use in public consultation. The paper concludes with reflections as to how this game could effectively be used for public consultation, and offers key lessons for urban planners whose practice intersects with our digitally-enabled world.
APA, Harvard, Vancouver, ISO, and other styles
43

Yuen, Douglas, Markus Santoso, Stephen Cartwright, and Christian Jacob. "Eukaryo: An AR and VR Application for Cell Biology." International Journal of Virtual Reality 16, no. 1 (January 1, 2016): 7–14. http://dx.doi.org/10.20870/ijvr.2016.16.1.2877.

Full text
Abstract:
Eukaryo is a simulated bio-molecular world that allows users to explore the complex environment within a biological cell. Eukaryo was developed using Unity, leveraging the capabilities and high performance of a commercial game engine. Through the use of MiddleVR, our tool can support a wide variety of interaction platforms including 3D virtual reality (VR) environments, such as head-mounted displays, augmented reality (AR) headsets, and large scale immersive visualization facilities. Our interactive, 3-dimensional model demonstrates key functional elements of a generic eukaryotic cell. Users are able to use multiple modes to explore the cell, its structural elements, its organelles, and some key metabolic processes. In contrast to textbook diagrams and even videos, Eukaryo immerses users directly in the biological environment, giving a more effective demonstration of how cellular processes work, how compartmentalization affects cellular functions, and how the machineries of life operate.
APA, Harvard, Vancouver, ISO, and other styles
44

Métois, Marianne, Jean-Emmanuel Martelat, Jérémy Billant, Muriel Andreani, Javier Escartín, and Frédérique Leclerc. "Deep oceanic submarine fieldwork with undergraduate students: an immersive experience with the Minerve software." Solid Earth 12, no. 12 (December 20, 2021): 2789–802. http://dx.doi.org/10.5194/se-12-2789-2021.

Full text
Abstract:
We present the content and scripting of an active tectonic lab session conceived for third-year undergraduate students studying Earth sciences at Observatoire des Sciences de l'Univers in Lyon. This session is based on a research project conducted on the submarine Roseau active fault in the Lesser Antilles. The fault morphology is particularly interesting to map as this structure in the deep ocean is preserved from weathering. Thus, high-resolution models computed from remotely operated vehicle (ROV) videos provide exceptional educational material to link fault morphology and coseismic displacement. This class includes mapping exercises on geographical information systems and virtual fieldwork to provide basic understanding of active tectonics and active fault morphology in particular. The work has been conducted either in a full remote configuration via 3D online models or in virtual reality (VR) in a dedicated room using the Minerve software. During the VR sessions, students were either alone in the VR environment or participated as a group that included the instructor (physically in the classroom or remotely from another location), which is to our knowledge one of the first attempts of this kind in France. We discuss the efficiency of virtual fieldwork using VR based on feedback from teachers and students. We conclude that VR is a promising tool to learn observational skills in Earth sciences, subject to certain improvements that should be possible in the years to come.
APA, Harvard, Vancouver, ISO, and other styles
45

Bruno, F., A. Lagudi, L. Barbieri, M. Cozza, A. Cozza, R. Peluso, B. Davidde Petriaggi, R. Petriaggi, S. Rizvic, and D. Skarlatos. "VIRTUAL TOUR IN THE SUNKEN “VILLA CON INGRESSO A PROTIRO” WITHIN THE UNDERWATER ARCHAEOLOGICAL PARK OF BAIAE." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W10 (April 17, 2019): 45–51. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w10-45-2019.

Full text
Abstract:
The paper presents the application of some Virtual Reality technologies developed in the Horizon 2020 i-MARECulture project to the case study of the sunken “Villa con ingresso a protiro”, dated to around the 2nd century AD and located in the Marine Protected Area - Underwater Park of Baiae (Naples). The i-MARECulture project (www.imareculture.eu) aims to improve public awareness of underwater cultural heritage by developing new tools and techniques that take advantage of virtual reality technologies to allow the general public to explore the archaeological remains outside of the submerged environment. To this end, the paper details the techniques and methods adopted for the development of an immersive virtual tour that allows users to explore, through a storytelling experience, a virtual replica and a 3D hypothetical reconstruction of the complex of the “Villa con ingresso a protiro”.
APA, Harvard, Vancouver, ISO, and other styles
46

Vertemati, Maurizio, Simone Cassin, Francesco Rizzetto, Angelo Vanzulli, Marco Elli, Gianluca Sampogna, and Maurizio Gallieni. "A Virtual Reality Environment to Visualize Three-Dimensional Patient-Specific Models by a Mobile Head-Mounted Display." Surgical Innovation 26, no. 3 (January 11, 2019): 359–70. http://dx.doi.org/10.1177/1553350618822860.

Full text
Abstract:
Introduction. With the availability of low-cost head-mounted displays (HMDs), virtual reality environments (VREs) are increasingly being used in medicine for teaching and clinical purposes. Our aim was to develop an interactive, user-friendly VRE for three-dimensional (3D) visualization of patient-specific organs, establishing a workflow to transfer 3D models from imaging datasets to our immersive VRE. Materials and Methods. This original VRE model was built using open-source software and a mobile HMD, the Samsung Gear VR. For its validation, we enrolled 33 volunteers: morphologists (n = 11), trainee surgeons (n = 15), and expert surgeons (n = 7). They tried our VRE and then filled in an original six-item questionnaire on a 5-point Likert-type scale, covering the following parameters: ease of use, anatomy comprehension compared with 2D radiological imaging, explanation of anatomical variations, explanation of surgical procedures, preoperative planning, and experience of gastrointestinal/neurological disorders. Results in the 3 groups were statistically compared using analysis of variance. Results. Using cross-sectional medical imaging, the developed VRE allowed a 3D patient-specific abdominal scene to be visualized within 1 hour. Overall, the 6 items were evaluated positively by all groups; only anatomy comprehension differed significantly among the 3 groups. Conclusions. Our approach, based on open-source software and mobile hardware, proved to be a valid and well-appreciated system to visualize 3D patient-specific models, paving the way for a potential new tool for teaching and preoperative planning.
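As a minimal sketch of the between-group comparison described above (a one-way analysis of variance across the three groups on a Likert-type item), the following Python example uses scipy.stats.f_oneway; the ratings are invented for illustration and do not reproduce the study's data.

import numpy as np
from scipy.stats import f_oneway

# Hypothetical 5-point ratings for one questionnaire item (e.g., anatomy comprehension)
morphologists    = np.array([4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 4])               # n = 11
trainee_surgeons = np.array([5, 5, 4, 5, 5, 4, 5, 5, 4, 5, 5, 4, 5, 5, 4])   # n = 15
expert_surgeons  = np.array([3, 4, 3, 4, 3, 4, 3])                           # n = 7

# One-way ANOVA: does the mean rating differ across the three groups?
f_stat, p_value = f_oneway(morphologists, trainee_surgeons, expert_surgeons)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would suggest a group difference on this item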
APA, Harvard, Vancouver, ISO, and other styles
47

Madeira, Tiago, Bernardo Marques, Pedro Neves, Paulo Dias, and Beatriz Sousa Santos. "Comparing Desktop vs. Mobile Interaction for the Creation of Pervasive Augmented Reality Experiences." Journal of Imaging 8, no. 3 (March 18, 2022): 79. http://dx.doi.org/10.3390/jimaging8030079.

Full text
Abstract:
This paper presents an evaluation and comparison of interaction methods for the configuration and visualization of pervasive Augmented Reality (AR) experiences using two different platforms: desktop and mobile. AR experiences consist of the enhancement of real-world environments by superimposing additional layers of information, real-time interaction, and accurate 3D registration of virtual and real objects. Pervasive AR extends this concept through experiences that are continuous in space, being aware of and responsive to the user’s context and pose. Currently, the time and technical expertise required to create such applications are the main reasons preventing their widespread use. As such, authoring tools which facilitate the development and configuration of pervasive AR experiences have become progressively more relevant. Their operation often involves the navigation of the real-world scene and the use of the AR equipment itself to add the augmented information within the environment. The proposed experimental tool makes use of 3D scans from physical environments to provide a reconstructed digital replica of such spaces for a desktop-based method, and to enable positional tracking for a mobile-based one. While the desktop platform represents a non-immersive setting, the mobile one provides continuous AR in the physical environment. Both versions can be used to place virtual content and ultimately configure an AR experience. The authoring capabilities of the different platforms were compared by conducting a user study focused on evaluating their usability. Although the AR interface was generally considered more intuitive, the desktop platform shows promise in several aspects, such as remote configuration, lower required effort, and overall better scalability.
APA, Harvard, Vancouver, ISO, and other styles
48

François, Paul, Jeffrey Leichman, Florent Laroche, and Françoise Rubellin. "Virtual reality as a versatile tool for research, dissemination and mediation in the humanities." Virtual Archaeology Review 12, no. 25 (July 14, 2021): 1. http://dx.doi.org/10.4995/var.2021.14880.

Full text
Abstract:
The VESPACE project aims to revive an evening of theatre at the Foire Saint-Germain in Paris in the 18th century, by recreating spaces, atmospheres and theatrical entertainment in virtual reality. The venues of this fair have disappeared without leaving any archaeological traces, so their digital reconstruction requires the use of many different sources, including the expertise of historians, historians of theatre and literature. In this article, we present how we have used video game creation tools to enable the use of virtual reality in three key stages of research in the human sciences, particularly in history and archaeology: preliminary research, scientific dissemination, and mediation with the general public. In particular, we detail the methodology used to design a three-dimensional (3D) model that is suitable for both research and virtual reality visualization, meets the standards of scientific work regarding precision and accuracy, and satisfies the requirements of real-time display. This model becomes an environment in which experts can be immersed within their fields of research and expertise, and thus extract knowledge that reinforces the model created (through comments, serendipity and new perspectives) while enabling a multidisciplinary workflow. We also present our tool for annotating and consulting sources, relationships and hypotheses in immersion, called PROUVÉ. This tool is designed to make the virtual reality experience go beyond a simple image and to convey scientific information and theories in the same way an article or a monograph does. Finally, this article offers preliminary feedback on the use of our solutions with three target audiences: the researchers from our team, the broader theatre expert community, and the general public.
Highlights:
• Immersive Virtual Reality is used to enhance the digital reconstruction of an 18th-century theatre, by allowing experts to dive into their research topic.
• Virtual Reality (VR) can also be used to disseminate the digital model through the scientific community and beyond, while giving access to all kinds of sources that were used to build it.
• A quick survey shows that VR is a powerful tool to share theories and interpretations related to archaeological or historical three-dimensional data.
APA, Harvard, Vancouver, ISO, and other styles
49

Abjigitova, Djamila, Amir H. Sadeghi, Jette J. Peek, Jos A. Bekkers, Ad J. J. C. Bogers, and Edris A. F. Mahtab. "Virtual Reality in the Preoperative Planning of Adult Aortic Surgery: A Feasibility Study." Journal of Cardiovascular Development and Disease 9, no. 2 (January 18, 2022): 31. http://dx.doi.org/10.3390/jcdd9020031.

Full text
Abstract:
Background: Complex aortic anatomy needs careful preoperative planning in which a patient-tailored approach with novel immersive techniques could serve as a valuable addition to current preoperative imaging. This pilot study aimed to investigate the technical feasibility of virtual reality (VR) as an additional imaging tool for preoperative planning in ascending aortic surgery. Methods: Ten cardiothoracic surgeons were presented with six patients who had each undergone a recent repair of the ascending aorta. Two-dimensional computed tomography images of each patient were assessed prior to the VR session. After three-dimensional (3D) VR rendering and 3D segmentation of the ascending aorta and aortic arch, the reconstructions were analyzed by each surgeon in VR via a head-mounted display. Each cardiothoracic surgeon completed a questionnaire after each planning procedure. The results of their assessments were compared to the performed operations. The primary endpoint of the present study was a change of surgical approach from open to clamped distal anastomosis, and vice versa. Results: Compared with conventional imaging, 80% of surgeons found that VR prepared them better for surgery. In 33% of cases (two out of six), the preoperative decision was adjusted due to the 3D VR-based evaluation of the anatomy. Surgeons rated CardioVR usefulness, user-friendliness, and satisfaction with median scores of 3.8 (IQR: 3.5–4.1), 4.2 (IQR: 3.8–4.6), and 4.1 (IQR: 3.8–4.7) on a five-point Likert scale, respectively. Conclusions: Three-dimensional VR imaging was associated with improved anatomical understanding among surgeons and could be helpful in the future preoperative planning of ascending aortic surgery.
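The usefulness, user-friendliness, and satisfaction figures above are medians with interquartile ranges over the surgeons' Likert ratings; a minimal Python sketch of that summary, using invented ratings rather than the study's data, would look like this.

import numpy as np

# Hypothetical five-point Likert ratings from ten surgeons for one dimension (e.g., usefulness)
ratings = np.array([3.4, 3.5, 3.6, 3.8, 3.8, 3.9, 4.0, 4.1, 4.1, 4.3])

median = np.median(ratings)
q1, q3 = np.percentile(ratings, [25, 75])  # 25th and 75th percentiles bound the IQR
print(f"median = {median:.2f} (IQR: {q1:.2f}-{q3:.2f})")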
APA, Harvard, Vancouver, ISO, and other styles
50

Deng, Yilin, Sang-Yun Han, Jianyi Li, Jinjin Rong, Wenyu Fan, and Tiancong Sun. "The design of tourism product CAD three-dimensional modeling system using VR technology." PLOS ONE 15, no. 12 (December 28, 2020): e0244205. http://dx.doi.org/10.1371/journal.pone.0244205.

Full text
Abstract:
In view of the high homogeneity of tourism products across the country, an attempt is made to design virtual-visit tourism products with a cultural experience background that reflect the characteristics of culture + tourism in different scenic spots, so that tourists can deeply experience the local culture. Combined with computer-aided design (CAD), a virtual three-dimensional (3D) modeling system for scenic spots is designed, and interactive VR real-scene visit tourism products suitable for different scenic spots are developed. 360° VR panoramic display technology is used to shoot 360° VR panoramic video of the Elephant Trunk Hill park scenery and to produce the visiting display system. A total of 157 images are collected, and the 720 cloud panoramic interactive H5 tool is selected to produce a display system suitable for 360° VR panoramic presentation of scenic spots. Meanwhile, based on single-view RGB-D images, a recent convolutional neural network (CNN) algorithm and a point cloud processing algorithm are used to design an indoor 3D scene reconstruction algorithm based on semantic understanding. Experiments show that the pixel accuracy and mean intersection over union of the indoor scene layout segmentation network's results are 89.5% and 60.9%, respectively, indicating high accuracy. The interactive VR real-scene visit tourism product gives tourists a more immersive sense of interaction and experience before, during, and after the tour.
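The reported pixel accuracy (89.5%) and mean intersection over union (60.9%) are standard semantic-segmentation metrics. A minimal Python sketch of how such figures are typically computed from a class confusion matrix follows; the matrix values are illustrative only, and the function is a generic implementation rather than the authors' code.

import numpy as np

def segmentation_metrics(conf):
    """Pixel accuracy and mean IoU from a confusion matrix.

    conf[i, j] counts pixels whose ground-truth class is i and predicted class is j.
    """
    tp = np.diag(conf).astype(float)                  # correctly classified pixels per class
    pixel_accuracy = tp.sum() / conf.sum()
    union = conf.sum(axis=1) + conf.sum(axis=0) - tp  # ground truth + prediction - intersection
    mean_iou = np.mean(tp / np.maximum(union, 1))
    return pixel_accuracy, mean_iou

# Illustrative 3-class confusion matrix (e.g., wall / floor / furniture pixels)
conf = np.array([[900,  50,  50],
                 [ 60, 800, 140],
                 [ 70, 120, 810]])
pa, miou = segmentation_metrics(conf)
print(f"pixel accuracy = {pa:.3f}, mean IoU = {miou:.3f}")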
APA, Harvard, Vancouver, ISO, and other styles