A Liquid Metal Based Multimodal Sensor and Haptic Feedback Device for Thermal and Tactile Sensation Generation in Virtual Reality

2020, pp. 2007772
Author(s): Jinhyeok Oh, Suin Kim, Sangyeop Lee, Seongmin Jeong, Seung Hwan Ko, ...

2021, Vol 31 (39), pp. 2170285
Author(s): Jinhyeok Oh, Suin Kim, Sangyeop Lee, Seongmin Jeong, Seung Hwan Ko, ...

Author(s): Benjamin Williams, Alexandra E. Garton, Christopher J. Headleand

2006, Vol 10 (1), pp. 24-30
Author(s): David Swapp, Vijay Pawar, Céline Loscos

2018, Vol 35 (2), pp. 149-160
Author(s): Mustufa H. Abidi, Abdulrahman M. Al-Ahmari, Ali Ahmad, Saber Darmoul, Wadea Ameen

Abstract The design and verification of assembly operations is essential for planning product production operations. Virtual prototyping has recently witnessed tremendous progress and has reached a stage where current environments enable rich, multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. This paper discusses the benefits of building and using Virtual Reality (VR) models for assembly process verification and presents the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system that enables stereoscopic visuals, surround sound, and rich, intuitive interaction with the developed models. A dedicated software architecture is proposed to describe the assembly parts and assembly sequence in VR, and a collision detection mechanism provides visual feedback on interference between components. The system is tested on the virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, tactile, and force feedback, and that it is effective and efficient for validating assembly design, part design, and operations planning.
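The abstract does not detail the collision detection algorithm used. As a minimal sketch, assuming a broad-phase interference test between axis-aligned bounding boxes of the parts (the AABB class and the intersects and check_assembly_step names are hypothetical, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box of an assembly part (meters)."""
    min_pt: tuple  # (x, y, z) lower corner
    max_pt: tuple  # (x, y, z) upper corner

def intersects(a: AABB, b: AABB) -> bool:
    # Two boxes interfere only if their extents overlap on all three axes.
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

def check_assembly_step(moving_part: AABB, placed_parts: list) -> list:
    """Indices of already-placed parts the moving part interferes with,
    so the VR system can highlight them as visual feedback."""
    return [i for i, p in enumerate(placed_parts)
            if intersects(moving_part, p)]
```

In practice such a broad-phase test would be followed by an exact mesh-level check, but it illustrates how per-frame interference feedback can be computed cheaply during assembly sequence verification.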


Author(s): Yuwei Li, David Donghyun Kim, Brian Anthony

Abstract We present HapticWall, an encountered-type, motor-actuated vertical two-dimensional system that enables both small- and large-scale physical interactions in virtual reality. HapticWall consists of a motor-actuated vertical two-dimensional gantry that drives a physical proxy for its virtual counterpart. Combined with the gantry, the physical proxy can provide both small- and large-scale haptic feedback in the vertical space, including wall-like haptic feedback and interactions. We created two virtual reality applications to demonstrate the HapticWall system and collected preliminary user feedback to evaluate its performance and limitations. The results of our study are presented in this paper. The outcome of this research will provide a better understanding of multi-scale haptic interfaces in the vertical space for virtual reality and guide the future development of the HapticWall system.
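The abstract does not give the control loop that keeps the proxy aligned with the user's hand. A minimal per-frame sketch, assuming a two-axis gantry addressed in wall-plane coordinates (the function names and workspace bounds are hypothetical):

```python
def clamp(v: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, v))

def target_gantry_position(hand_xy, workspace):
    """Map the tracked hand position (meters, in the wall plane) to a
    gantry setpoint, clamped to the reachable workspace
    (x_min, x_max, y_min, y_max)."""
    x_min, x_max, y_min, y_max = workspace
    return (clamp(hand_xy[0], x_min, x_max),
            clamp(hand_xy[1], y_min, y_max))

# Encountered-type haptics: every frame, move the proxy so it is
# already in place when the hand reaches the virtual wall.
print(target_gantry_position((0.42, 1.35), (0.0, 1.0, 0.2, 2.0)))
# -> (0.42, 1.35)
```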


2019, Vol 121 (4), pp. 1398-1409
Author(s): Vonne van Polanen, Robert Tibold, Atsuo Nuruki, Marco Davare

Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arises that must be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms underlying serial multisensory integration at transient sensorimotor events, a general feature of actions requiring hand-object interactions, are not yet understood. In this study we tested the relative weighting between haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contribution of haptics and vision in driving participants’ movements when they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events to control object-lifting dynamics and further highlight how multisensory signals are organized online to control action and perception. NEW & NOTEWORTHY Dexterous hand movements require rapid integration of information from different senses, in particular touch and vision, at key time points as movement unfolds. The relative weighting between vision and haptics for object manipulation is unknown. We used object lifting in virtual reality to desynchronize visual and haptic feedback and determine their relative weightings. Our findings shed light on how rapid multisensory integration is processed over a series of discrete sensorimotor control points.
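The paper's actual model is not reproduced in this abstract. As an illustrative sketch only, a standard way to formalize such relative weighting is reliability-weighted (maximum-likelihood) cue combination, assuming Gaussian haptic and visual estimates; the variances below are made up, not the paper's fitted values:

```python
def fuse_cues(haptic_est, haptic_var, visual_est, visual_var):
    """Maximum-likelihood fusion of two Gaussian cues: each cue is
    weighted by its reliability (inverse variance)."""
    w_h = (1.0 / haptic_var) / (1.0 / haptic_var + 1.0 / visual_var)
    fused = w_h * haptic_est + (1.0 - w_h) * visual_est
    fused_var = 1.0 / (1.0 / haptic_var + 1.0 / visual_var)
    return fused, fused_var

# A reliable haptic contact event (low variance) dominates a delayed,
# less reliable visual event, biasing the fused estimate toward touch.
print(fuse_cues(haptic_est=0.20, haptic_var=0.01,
                visual_est=0.35, visual_var=0.04))
```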


2019, Vol 25 (11), pp. 3169-3177
Author(s): Dennis Wolf, Michael Rietzler, Leo Hnatek, Enrico Rukzio
