Towards Smart Gaming Olfactory Displays

Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1002
Author(s):  
Georgios Tsaramirsis ◽  
Michail Papoutsidakis ◽  
Morched Derbali ◽  
Fazal Qudus Khan ◽  
Fotis Michailidis

Olfaction can enhance the experience of music, films, computer games and virtual reality applications. However, this area is less explored than areas such as computer graphics and audio. Most advanced olfactory displays are designed for a specific experiment; they are hard to modify and extend, expensive, and/or can deliver only a very limited number of scents. Additionally, current-generation olfactory displays make no decisions about whether and when a scent should be released. This paper proposes a low-cost, easy-to-build, powerful smart olfactory display that can release up to 24 different aromas and allows control of the quantity of aroma released. The display is capable of reabsorbing the released aroma, in an attempt to clean the air prior to releasing a new one. Additionally, the display includes a smart algorithm that decides when to release certain aromas. The device controller application can release scents based on a timer, on text in English subtitles, or on input from external software applications. This allows applications such as games to decide when to release a scent, making the display ideal for gaming. The device also supports native connectivity with games built using a game development asset created as part of this project. The system was evaluated by 15 subjects and was shown to have high accuracy when the scents were released with a 1.5-minute delay between them.
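
As an illustration of subtitle-driven triggering, the following is a minimal sketch, not the authors' implementation: it assumes the display is driven over a serial port, that a keyword-to-channel mapping is matched against each subtitle line, and an invented command format. Names such as SCENT_CHANNELS and the "R,<channel>,<quantity>" protocol are hypothetical.

```python
# Hypothetical sketch: trigger a scent channel when a subtitle line contains
# a mapped keyword. The serial port, command format, and keyword-to-channel
# mapping are illustrative assumptions, not the paper's protocol.
import serial

SCENT_CHANNELS = {   # assumed mapping: keyword -> device channel (1-24)
    "coffee": 3,
    "ocean": 7,
    "forest": 12,
}

def release_for_subtitle(ser: serial.Serial, subtitle: str, quantity: int = 1) -> None:
    """Scan one subtitle line and ask the display to release a matching aroma."""
    text = subtitle.lower()
    for keyword, channel in SCENT_CHANNELS.items():
        if keyword in text:
            # assumed command format: "R,<channel>,<quantity>\n"
            ser.write(f"R,{channel},{quantity}\n".encode("ascii"))
            break

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as ser:
        release_for_subtitle(ser, "She poured a cup of coffee.")
```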

Author(s):  
Lorenzo Micaroni ◽  
Marina Carulli ◽  
Francesco Ferrise ◽  
Monica Bordegoni ◽  
Alberto Gallace

This research aims to design and develop an innovative system, based on an olfactory display, to be used for investigating the directionality of the sense of olfaction. In particular, the paper describes the design of an experimental setup to understand and determine to what extent the sense of olfaction is directional, and whether the sense of vision prevails over the sense of smell when determining the direction of an odor. The experimental setup is based on low-cost Virtual Reality (VR) technologies. In particular, the system is based on a custom directional olfactory display, an Oculus Rift Head-Mounted Display (HMD) to deliver both visual and olfactory cues, and an input device to register subjects' answers. The VR environment is developed in Unity3D. The paper describes the design of the olfactory interface as well as its integration with the overall system. Finally, the results of the initial testing are reported.
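
To make the directionality measure concrete, here is a minimal sketch (not the authors' code) of how such trials could be scored: the odor's true bearing is compared with the direction a subject reports through the HMD or input device. The trial tuples and the analysis itself are illustrative assumptions.

```python
# Illustrative sketch: score directional-olfaction trials by the angular
# difference between the odor's true bearing and the reported bearing.
def angular_error(true_deg: float, reported_deg: float) -> float:
    """Smallest unsigned difference between two bearings, 0-180 degrees."""
    diff = abs(true_deg - reported_deg) % 360.0
    return min(diff, 360.0 - diff)

# hypothetical trials: (true odor bearing, reported bearing, visual cue bearing)
trials = [(0, 15, 0), (90, 60, 45), (180, 200, 180)]
errors = [angular_error(t, r) for t, r, _ in trials]
print(f"mean absolute error: {sum(errors) / len(errors):.1f} deg")
```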


Author(s):  
Ryan A. Pavlik ◽  
Judy M. Vance

Virtual reality (VR) environments based on interactive rendering of 3D computer graphics often incorporate position and orientation tracking of the user's head, hands, and control devices. The Wii Remote game controller is a mass-market peripheral that can provide a low-cost source of infrared point tracking and accelerometer data, making it attractive as a PC-based virtual reality head-tracking system. This paper describes the development of an extension to the Virtual Reality Peripheral Network (VRPN) software to support the use of the Wii Remote game controller as a standard tracker object in a wide range of VR software applications. This implementation permits Wii Remote-based head tracking to directly substitute for more costly commercial trackers through the VRPN and VR Juggler Gadgeteer tracker interfaces. The head tracker provides up to 100 Hz of head-tracking input. It has been tested in a variety of VR applications on both Windows and Linux, and the solution has been released as open-source software.
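
The sketch below illustrates only the underlying geometry of infrared-point head tracking, not the VRPN extension itself: it assumes a fixed Wii Remote camera watching two head-worn IR LEDs a known distance apart, and the camera field of view, LED spacing, and small-angle approximations are assumptions made for illustration.

```python
# Illustrative geometry sketch (not the VRPN code): estimate head position
# from two infrared dots seen by a fixed Wii Remote camera.
import math

CAM_W, CAM_H = 1024, 768      # Wii Remote IR camera resolution (dots)
FOV_X = math.radians(45.0)    # assumed horizontal field of view
LED_SPACING_M = 0.20          # assumed distance between the two head-worn LEDs

def head_position(p1, p2):
    """Return (x, y, z) of the head in metres, camera at the origin."""
    pixel_dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    # angle subtended by the LED pair, from its apparent size in pixels
    angle = pixel_dist / CAM_W * FOV_X
    z = (LED_SPACING_M / 2.0) / math.tan(angle / 2.0)   # depth from the camera
    mid_x = (p1[0] + p2[0]) / 2.0 - CAM_W / 2.0
    mid_y = (p1[1] + p2[1]) / 2.0 - CAM_H / 2.0
    x = z * math.tan(mid_x / CAM_W * FOV_X)             # lateral offset
    y = z * math.tan(mid_y / CAM_W * FOV_X)             # vertical offset
    return x, y, z

print(head_position((400, 380), (620, 384)))
```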


Author(s):  
Lorenzo Micaroni ◽  
Marina Carulli ◽  
Francesco Ferrise ◽  
Alberto Gallace ◽  
Monica Bordegoni

The paper describes the design of an innovative virtual reality (VR) system, based on a combination of an olfactory display and a visual display, to be used for investigating the directionality of the sense of olfaction. In particular, it describes the design of an experimental setup to understand and determine to what extent the sense of olfaction is directional, and whether the sense of vision prevails over the sense of smell when determining the direction of an odor. The experimental setup is based on low-cost VR technologies. In particular, the system is based on a custom directional olfactory display (OD), a head-mounted display (HMD) to deliver both visual and olfactory cues, and an input device to register subjects' answers. The paper reports the design of the olfactory interface as well as its integration with the overall system.


2020 ◽  
Author(s):  
Daniel Janos ◽  
Justyna Ruchała ◽  
Edyta Puniach ◽  
Paweł Ćwiąkała

Representatives of the scientific community collect and store huge amounts of spatial data resulting from years of study. However, there is a common problem: how to visualize these data in a way that is understandable and engaging for recipients from outside the field, and that follows current trends. Many spheres of modern life have moved into virtual reality, which is why representatives of industry, science, culture and art need to deal with representing the real world in 3D.

This work addresses the current issue of visualizing spatial data collected by surveyors as well as by representatives of many other fields. The proposed method of presenting research data is not only low-cost to prepare but is also distinguished by its simplicity of implementation. Its functionality is demonstrated using the example of the Agora area in the Archaeological Park of Kato Paphos in Cyprus. The park was created to protect and promote the archaeological sites, as well as the artefacts from earlier epochs found in the area. Such historic places are often not fully accessible to visitors, which is why documenting and visualizing them in 3D can be extremely helpful. This kind of activity not only contributes to the popularization of archaeological research but also meets the expectations of a modern audience that increasingly uses virtual reality to learn about new places. It is worth mentioning that the presented visualization of measurement data is a versatile method intended for use in many different scientific and research areas.

From a technical point of view, the work guides the reader through the complete process of developing an advanced animation in an environment used to create 3D computer games, the Unity game engine. The first part of the article assesses the suitability of data obtained by digital photogrammetry and laser scanning for the presented method. The work also discusses the limitations of free software and the methods that allow its requirements to be met with minimal loss of data quality and accuracy. The next step presents the method of importing the data (a mesh model and a high-resolution texture). The operating mechanism in Unity, as well as the transfer of the interactive visualization to the online browser via Unity Connect, is discussed later in the article. It is worth mentioning that, thanks to the First Person Perspective (FPP) technique, the developed visualization allows a user to be transported right into the centre of archaeological sites where access by third parties is usually significantly restricted.
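
As an illustration of reducing a scanned model with minimal loss of quality before importing it into a game engine, the following is a minimal sketch assuming the open-source Open3D library; the file names and the 500,000-triangle target are hypothetical, and this is not the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): decimate a photogrammetric or
# laser-scanned mesh so it fits a real-time polygon budget, then export it
# for import into Unity. File names and the target count are assumptions.
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("agora_scan.obj")   # assumed input file
print(f"input triangles: {len(mesh.triangles)}")

# Quadric decimation preserves the overall shape while cutting triangle count.
simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=500_000)
simplified.compute_vertex_normals()                  # needed for shading in Unity
print(f"output triangles: {len(simplified.triangles)}")

o3d.io.write_triangle_mesh("agora_scan_lod0.obj", simplified)
```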


2003 ◽  
Author(s):  
David Walshe ◽  
Elizabeth Lewis ◽  
Kathleen O'Sullivan ◽  
Brenda K. Wiederhold ◽  
Sun I. Kim

2021 ◽  
Author(s):  
Polona Caserman ◽  
Augusto Garcia-Agundez ◽  
Alvar Gámez Zerban ◽  
Stefan Göbel

Cybersickness (CS) is a term used to refer to symptoms, such as nausea, headache, and dizziness, that users experience during or after virtual reality immersion. Initially discovered in flight simulators, commercial virtual reality (VR) head-mounted displays (HMDs) of the current generation also seem to cause CS, albeit in a different manner and severity. The goal of this work is to summarize recent literature on CS with modern HMDs, to determine the specificities and profile of CS caused by immersive VR, and to provide an outlook for future research areas. A systematic review was performed on the databases IEEE Xplore, PubMed, ACM, and Scopus from 2013 to 2019, and 49 publications were selected. The summary describes how different VR HMDs affect CS, how the nature of movement in VR HMDs contributes to CS, and how biosensors can be used to detect CS. The results of the meta-analysis show that although current-generation VR HMDs cause significantly less CS (p < 0.001), some symptoms remain just as intense. Further results show that the nature of movement, in particular sensory mismatch as well as perceived motion, has been the leading cause of CS. We suggest an outlook on future research, including the use of galvanic skin response to evaluate CS in combination with the gold standard (Simulator Sickness Questionnaire, SSQ), as well as an update of the subjective evaluation scores of the SSQ.
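
For readers unfamiliar with the SSQ, the following is a minimal sketch of how its scores are conventionally computed (weights per Kennedy et al., 1993): 16 symptoms rated 0-3 are grouped into nausea, oculomotor and disorientation clusters, each raw cluster sum is scaled by a fixed weight, and the total is the sum of the raw cluster sums times 3.74. The example ratings are hypothetical, and the exact item-to-cluster assignment should be taken from the original scale.

```python
# Sketch of conventional SSQ scoring; example inputs are hypothetical.
WEIGHTS = {"N": 9.54, "O": 7.58, "D": 13.92}   # nausea, oculomotor, disorientation
TOTAL_WEIGHT = 3.74

def ssq_scores(raw: dict) -> dict:
    """raw: cluster name -> sum of that cluster's item ratings (each item 0-3)."""
    scores = {k: raw[k] * w for k, w in WEIGHTS.items()}
    scores["Total"] = sum(raw.values()) * TOTAL_WEIGHT
    return scores

# hypothetical participant: mild nausea cluster, stronger disorientation cluster
print(ssq_scores({"N": 4, "O": 3, "D": 6}))
```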


2020 ◽  
Author(s):  
Derek Schulte ◽  
Kyam Krieger ◽  
Carl W. Chin ◽  
Alexander Sonn
Keyword(s):  
Low Cost ◽  

Author(s):  
Wilver Auccahuasi ◽  
Mónica Diaz ◽  
Fernando Sernaque ◽  
Edward Flores ◽  
Justiniano Aybar ◽  
...  

Author(s):  
Jonas Austerjost ◽  
Robert Söldner ◽  
Christoffer Edlund ◽  
Johan Trygg ◽  
David Pollard ◽  
...  

Machine vision is a powerful technology that has become increasingly popular and accurate during the last decade due to rapid advances in the field of machine learning. The majority of machine vision applications are currently found in consumer electronics, automotive applications, and quality control, yet the potential for bioprocessing applications is tremendous. For instance, detecting and controlling foam emergence is important for all upstream bioprocesses, but the lack of robust foam sensing often leads to batch failures from foam-outs or overaddition of antifoam agents. Here, we report a new low-cost, flexible, and reliable foam sensor concept for bioreactor applications. The concept applies convolutional neural networks (CNNs), a state-of-the-art machine learning system for image processing. The implemented method shows high accuracy for both binary foam detection (foam/no foam) and fine-grained classification of foam levels.
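
To show the shape of such a classifier, here is a minimal sketch of a small convolutional network for binary foam/no-foam classification, assuming TensorFlow/Keras; the input size, layer widths, and the frames/{foam,no_foam} directory layout are illustrative assumptions, not the authors' model.

```python
# Hedged sketch (not the authors' model): a small CNN for binary
# foam / no-foam classification of bioreactor camera frames.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # foam probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# assumed folder layout: frames/{foam,no_foam}/*.png
train_ds = tf.keras.utils.image_dataset_from_directory(
    "frames", image_size=(128, 128), batch_size=32, label_mode="binary")
model.fit(train_ds, epochs=5)
```

For fine-grained foam-level classification rather than binary detection, the final layer would instead be a softmax over the number of foam-level classes, with a categorical loss.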

