Automating the Integration of 4D Models in Game Engines for a Virtual Reality-Based Construction Simulation

Author(s):  
Simon Bourlon ◽  
Conrad Boton

Author(s):  
Claudia Lindner ◽  
Annette Ortwein ◽  
Kilian Staar ◽  
Andreas Rienow

Elevation and visual data from Chang’E-2, Mars Viking, and MOLA were transformed into 3D models and environments using Unity and Unreal Engine, to be implemented in augmented reality (AR) and virtual reality (VR) applications, respectively. The workflows for the two game development engines and the two purposes overlap, but differ significantly as a result of their intended usage: both are used in educational settings, but while the AR app has to run on the basic smartphones that students from all socio-economic backgrounds might have, the VR app requires high-end PCs and can therefore exploit the full potential of such devices. Hence, the models for the AR app are reduced to the necessary components and sizes of the highest mountains on Luna and Mars, whereas the VR app contains several models of probe landing sites on Mars, a landscape covering the entire planet at multiple levels of detail, and a complex environment. Both applications are enhanced for educational use with annotations and interactive elements. This study focuses on the transfer of scientific data into game development engines for use in educational settings, using the example of scales in extra-terrestrial environments.
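The kind of transformation described here, from gridded elevation data to an engine-ready 3D model, can be illustrated with a minimal sketch. The snippet below triangulates a digital elevation model into a Wavefront OBJ mesh that Unity or Unreal Engine can import; the grid spacing, vertical exaggeration, synthetic test data and file names are assumptions for illustration, not details of the authors' pipeline.

```python
# Illustrative sketch: convert a gridded elevation raster (e.g. a MOLA-style
# DEM) into a triangle mesh and write it as a Wavefront OBJ file for import
# into a game engine. Axis conventions may require a rotation on import.

import numpy as np

def dem_to_obj(dem, cell_size_m, out_path, z_exaggeration=1.0):
    """Triangulate a 2D elevation array into an OBJ mesh."""
    rows, cols = dem.shape
    with open(out_path, "w") as f:
        # One vertex per grid cell; x/y from the grid index, z from elevation.
        for r in range(rows):
            for c in range(cols):
                x = c * cell_size_m
                y = r * cell_size_m
                z = dem[r, c] * z_exaggeration
                f.write(f"v {x} {y} {z}\n")
        # Two triangles per grid quad; OBJ vertex indices are 1-based.
        for r in range(rows - 1):
            for c in range(cols - 1):
                i = r * cols + c + 1
                f.write(f"f {i} {i + 1} {i + cols}\n")
                f.write(f"f {i + 1} {i + cols + 1} {i + cols}\n")

if __name__ == "__main__":
    # Synthetic 200 x 200 "mountain" stands in for real Chang'E-2 / MOLA data.
    y, x = np.mgrid[0:200, 0:200]
    dem = 5000.0 * np.exp(-((x - 100) ** 2 + (y - 100) ** 2) / 2000.0)
    dem_to_obj(dem, cell_size_m=460.0, out_path="mountain.obj")
```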


2019 ◽  
Vol 214 ◽  
pp. 02013
Author(s):  
Bianchi Riccardo Maria ◽  
Claire Adam Bourdarios ◽  
Michael Hovdesven ◽  
Ilija Vukotic

Interactive 3D data visualization plays a key role in HEP experiments, as it is used in many tasks at different levels of the data chain. Outside HEP, for interactive 3D graphics, the game industry makes heavy use of so-called “game engines”, modern software frameworks offering an extensive set of powerful graphics tools and cross-platform deployment. Recently, very strong support for Virtual Reality (VR) technology has been added to such engines. In this talk we explore the usage of game engines and VR for HEP data visualization, discussing the needs, the challenges and the issues of using such technologies. We will also make use of ATLASrift, a VR application developed by the ATLAS experiment, to discuss the lessons learned while developing it with the game engine Unreal Engine, and the feedback on the use of Virtual Reality we received from users at many demonstrations and public events.


Proceedings ◽  
2020 ◽  
Vol 54 (1) ◽  
pp. 47
Author(s):  
João Azevedo ◽  
Paulo Veloso Gomes ◽  
João Donga ◽  
António Marques

Virtual Reality, due to its complexity and technological requirements, comes with a set of frictions that hinder its dissemination. The main ones can be summarized as the need to learn complex development environments, such as game engines, and then the need to install applications that are specific to each operating system and to the means through which they are accessed.


2019 ◽  
Vol 50 (2) ◽  
pp. 243-258 ◽  
Author(s):  
Blake Jones ◽  
Seyed Alireza Rohani ◽  
Nelson Ong ◽  
Tarek Tayeh ◽  
Ahmad Chalabi ◽  
...  

Background and Objectives. Hearing loss is one of the most prevalent chronic conditions and can significantly impact an individual’s quality of life. Cochlear implantation (CI) is a widely applicable treatment for severe to profound hearing loss; however, CI surgery can be difficult for surgical trainees to master. Training environments that are safe, controlled, and affordable are needed. To this end, we present a virtual-reality (VR) cochlear implant surgical simulator developed with a popular, commercial game engine. Method. Unity3D was used to develop the simulator and model the delicate instruments involved. High-resolution models of human cochleae were created from images obtained with synchrotron-radiation phase-contrast imaging (SR-PCI). The physical realism of the simulator was assessed via a comparison with fluoroscopic images of an actual cochlear implant insertion. Different resolutions of cochlear models were used to benchmark the real-time capabilities of the simulator, with the number of frames per second (FPS) serving as the performance metric. Results. Quantitative analysis comparing the simulated procedure to fluoroscopic imaging revealed no significant differences. Qualitatively, the behaviour of the inserted and simulated implants was similar throughout the entire procedure. The simulator was able to maintain 25 FPS even when experiencing an artificially high computational load. Conclusion. VR simulators provide a new and exciting avenue to enhance current medical education. Continued use of widely available and supported game engines in the development of medical simulators will hopefully result in lower costs. Preliminary feedback on the simulator presented here from expert surgeons has been positive, and future work will focus on evaluating face, content and construct validity.
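As an illustration of how anatomical surface models can be extracted from a volumetric image stack such as SR-PCI data, the sketch below runs marching cubes on a 3D intensity volume. The use of scikit-image, the iso-surface threshold and the synthetic test volume are assumptions, not details taken from the paper.

```python
# Illustrative sketch only: extract a surface mesh from a volumetric image
# stack, in the spirit of building cochlear models from SR-PCI images.

import numpy as np
from skimage import measure

def volume_to_mesh(volume, iso_level):
    """Run marching cubes on a 3D intensity volume and return verts/faces."""
    verts, faces, normals, _ = measure.marching_cubes(volume, level=iso_level)
    return verts, faces

if __name__ == "__main__":
    # Synthetic spherical shell stands in for a real SR-PCI volume.
    z, y, x = np.mgrid[-32:32, -32:32, -32:32]
    radius = np.sqrt(x**2 + y**2 + z**2)
    volume = np.exp(-((radius - 20.0) ** 2) / 10.0)
    verts, faces = volume_to_mesh(volume, iso_level=0.5)
    print(f"{len(verts)} vertices, {len(faces)} triangles")
```

Lower-resolution variants for FPS benchmarking, as described in the abstract, could then be produced by decimating the extracted mesh.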


Author(s):  
Brandon J. Newendorp ◽  
Christian Noon ◽  
Joe Holub ◽  
Eliot H. Winer ◽  
Stephen Gilbert ◽  
...  

In order to adapt to an ever-changing set of threats, military forces need to find new methods of training. The prevalence of commercial game engines, combined with virtual reality (VR) and mixed reality environments, can prove beneficial to training. Live, virtual and constructive (LVC) training combines live people, virtual environments and simulated actors to create a better training environment. However, integrating virtual reality displays, software simulations and artificial weapons into a mixed reality environment poses numerous challenges. A mixed reality environment known as The Veldt was constructed to research these challenges. The Veldt consists of numerous independent displays, along with movable walls, doors and windows. This allows The Veldt to simulate numerous training scenarios. Several challenges were encountered in creating this system. Displays were precisely located using the tracking system, then configured using VR Juggler. The ideal viewpoint for each display was configured based on the expected location from which users would view it. Finally, the displays were accurately aligned to the virtual terrain model. This paper describes how the displays were configured in The Veldt, as well as how it was used for two training scenarios.
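Configuring a viewpoint for a fixed display surface is commonly done with an off-axis ("generalized perspective") projection computed from the display's corner positions and an assumed viewer location. The sketch below shows one standard way to do this; the corner coordinates, near/far planes and viewer position are made up, and this is not the VR Juggler configuration used in the paper.

```python
# Hedged illustration: off-axis projection for one display surface, derived
# from its tracked corner positions and an expected viewer position.

import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Projection * view matrix for a screen with corners pa (lower-left),
    pb (lower-right), pc (upper-left), seen from eye position pe."""
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (toward eye)

    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                               # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    proj = np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])
    rot = np.eye(4); rot[0, :3], rot[1, :3], rot[2, :3] = vr, vu, vn
    trans = np.eye(4); trans[:3, 3] = -pe
    return proj @ rot @ trans

if __name__ == "__main__":
    # A 2 m x 1.5 m wall display, viewer standing 1.8 m in front of its centre.
    pa = np.array([-1.0, 0.0, 0.0])   # lower-left corner
    pb = np.array([ 1.0, 0.0, 0.0])   # lower-right corner
    pc = np.array([-1.0, 1.5, 0.0])   # upper-left corner
    pe = np.array([ 0.0, 0.75, 1.8])  # expected viewer position
    print(off_axis_projection(pa, pb, pc, pe, near=0.1, far=100.0))
```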


2012 ◽  
Vol 256-259 ◽  
pp. 2849-2853
Author(s):  
Bing Yu Ren ◽  
Tao Guan

Combining discrete event simulation and virtual-reality techniques, research on real-time interactive simulation of the concrete dam construction process is conducted within a virtual-reality environment. The key technologies are discussed, including DEM modelling, optimization of real-time 3D scene display, collision detection, and real-time interaction. A framework for a Vega-based virtual-reality simulation system for concrete dam construction operations is presented. Virtual reality provides construction simulation with a real-time interactive 3D environment in which to inspect the validity of the simulation model, which improves simulation credibility and makes the system a helpful tool for real-time decision making in construction operations. An example is presented to illustrate the feasibility of the system.
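A discrete event simulation of this kind is typically driven by a future-event list ordered by time. The minimal sketch below shows that event-loop structure; the activity names, durations and number of dam blocks are assumptions for illustration, not the authors' construction model.

```python
# Minimal discrete-event simulation sketch: each dam block goes through
# transport, pour and cure activities scheduled on a time-ordered event queue.

import heapq
import random

def simulate_pouring(n_blocks=5, seed=1):
    """Event-driven loop over hypothetical dam-block construction activities."""
    random.seed(seed)
    events = []
    for block in range(n_blocks):
        # All blocks are ready for transport at time 0.
        heapq.heappush(events, (0.0, block, "transport"))

    next_stage = {"transport": "pour", "pour": "cure", "cure": None}
    duration = {"transport": (0.5, 1.0), "pour": (1.0, 2.0), "cure": (8.0, 12.0)}

    while events:
        clock, block, stage = heapq.heappop(events)
        finish = clock + random.uniform(*duration[stage])
        print(f"t={clock:6.2f} h  block {block}: start {stage}, done at {finish:.2f} h")
        if next_stage[stage]:
            heapq.heappush(events, (finish, block, next_stage[stage]))

if __name__ == "__main__":
    simulate_pouring()
```

In a VR-coupled system, each processed event would additionally update the corresponding objects in the 3D scene so the construction state can be inspected interactively.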


Author(s):  
Thomas Kersten ◽  
Daniel Drenkhan ◽  
Simon Deggim

Technological advancements in the area of Virtual Reality (VR) in the past years have the potential to fundamentally impact our everyday lives. VR makes it possible to explore a digital world with a Head-Mounted Display (HMD) in an immersive, embodied way. In combination with current tools for 3D documentation and modelling and software for creating interactive virtual worlds, VR has the means to play an important role in the conservation and visualisation of cultural heritage (CH) for museums, educational institutions and other cultural areas. Corresponding game engines offer tools for interactive 3D visualisation of CH objects, which makes a new form of knowledge transfer possible with the direct participation of users in the virtual world. However, to ensure smooth and optimal real-time visualisation of the data in the HMD, VR applications should run at 90 frames per second (fps). This frame rate depends on several criteria, including the amount of data and the number of dynamic objects. In this contribution, the performance of a VR application has been investigated using different digital 3D models of the fortress Al Zubarah in Qatar at various resolutions. We demonstrate the influence of the amount of data and the hardware equipment on real-time performance, and show that developers of VR applications should find a compromise between the amount of data and the available computer hardware to guarantee smooth real-time visualisation at approximately 90 fps. Consequently, CAD models offer better performance for real-time VR visualisation than meshed models due to their significantly reduced data volume.
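One common way to bring a dense meshed model within a frame-rate budget is to decimate it to a target triangle count before loading it into the game engine. The sketch below shows this with Open3D's quadric decimation; the library choice, file names and target triangle count are assumptions, not the tools or figures used for the Al Zubarah models.

```python
# Hedged sketch: reduce a dense photogrammetric mesh to a target triangle
# budget so it can be rendered within the ~11.1 ms frame budget at 90 fps.

import open3d as o3d

FRAME_BUDGET_MS = 1000.0 / 90.0   # time available per frame at 90 fps

def decimate(in_path, out_path, target_triangles):
    """Simplify a mesh with quadric edge collapse and report the reduction."""
    mesh = o3d.io.read_triangle_mesh(in_path)
    before = len(mesh.triangles)
    simplified = mesh.simplify_quadric_decimation(
        target_number_of_triangles=target_triangles)
    simplified.compute_vertex_normals()
    o3d.io.write_triangle_mesh(out_path, simplified)
    print(f"{before} -> {len(simplified.triangles)} triangles "
          f"(frame budget at 90 fps: {FRAME_BUDGET_MS:.1f} ms)")

if __name__ == "__main__":
    # Hypothetical file names; substitute the project's own meshed model.
    decimate("fortress_mesh.obj", "fortress_mesh_lod1.obj",
             target_triangles=500_000)
```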

