Insect-Computer Hybrid System for Autonomous Search and Rescue Mission

Author(s):  
Hirotaka Sato ◽  
P. Thanh Tran-Ngoc ◽  
Le Duc Long ◽  
Bing Sheng Chong ◽  
H. Duoc Nguyen ◽  
...  

Abstract There is still a long way to go before artificial mini robots are really used for search and rescue missions in disaster-hit areas, owing to the power consumption and computation load of their locomotion and obstacle-avoidance systems. The insect–computer hybrid system, a fusion of a living insect platform and a microcontroller, emerges as an alternative solution. This study demonstrates the first-ever insect–computer hybrid system conceived for search and rescue missions, capable of autonomous navigation and human presence detection in an unstructured environment. A customized navigation control algorithm utilizing the insect’s intrinsic navigation capability achieved exploration and negotiation of complex terrains. On-board, high-accuracy human presence detection with an infrared camera was achieved using a custom machine learning model. Low power consumption suggests the system’s suitability for hour-long operations and its potential for deployment in real-life missions.
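The navigation scheme the abstract describes — letting the insect walk freely and intervening only when it drifts off course — can be sketched as one minimal control step (the function name, deadband value, and command encoding below are illustrative assumptions, not the authors' implementation):

```python
def steering_command(heading_deg, target_bearing_deg, deadband_deg=15.0):
    """Return 'left', 'right', or None (no stimulation) for one control step.

    Hypothetical sketch: the insect walks on its own, and a steering stimulus
    is applied only when the heading error exceeds the deadband, exploiting
    the animal's intrinsic locomotion capability.
    """
    # Wrap the heading error into (-180, 180] degrees
    error = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= deadband_deg:
        return None  # on course: let the insect walk freely
    return "left" if error > 0 else "right"
```

Withholding stimulation inside the deadband is what keeps power consumption low: the controller intervenes only when the heading error is large.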

Designs ◽  
2021 ◽  
Vol 5 (1) ◽  
pp. 17
Author(s):  
Nur-A-Alam ◽  
Mominul Ahsan ◽  
Md. Abdul Based ◽  
Julfikar Haider ◽  
Eduardo M. G. Rodrigues

In the era of Industry 4.0, remotely monitoring and controlling appliances/equipment at home, in an institute, or in industry over a long distance with low power consumption remains challenging. At present, smartphones are being actively used to control home or institute appliances through Internet of Things (IoT) systems. This paper presents a novel smart automation system using long range (LoRa) technology. The proposed LoRa-based system consists of a wireless communication system and different types of sensors, operated by a smartphone application and powered by a low-power battery, with an operating range of 3–12 km. The system established a connection between an Android phone and a microprocessor (ESP32) through Wi-Fi at the sender end; the ESP32 module was connected to a LoRa module. At the receiver end, an ESP32 module and a LoRa module without Wi-Fi were employed. A Wide Area Network (WAN) communication protocol was used on the LoRa module to provide switching functionality for the targeted area. The performance of the system was evaluated in three real-life case studies: measuring environmental temperature and humidity, detecting fire, and controlling the switching of appliances. Obtaining correct environmental data, detecting fire with 90% accuracy, and switching appliances with 92.33% accuracy at distances up to 12 km demonstrated the high performance of the system. The proposed smart system with its modular design proved highly effective in controlling and monitoring home appliances from a long distance with relatively low power consumption.
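The short, fixed-size payloads that make long-range, low-power links like LoRa practical can be illustrated with a sketch of a sensor report frame (the field layout, scaling factors, and XOR checksum are assumptions for illustration; the abstract does not specify the system's actual packet format):

```python
import struct

def pack_report(node_id, temp_c, humidity_pct, fire_flag):
    """Pack one sensor report into a compact payload (illustrative format).

    Layout: node id (1 B), temperature x100 (int16), humidity x100 (uint16),
    fire flag (1 B), XOR checksum (1 B) -- 7 bytes total.
    """
    body = struct.pack(">BhHB", node_id, round(temp_c * 100),
                       round(humidity_pct * 100), 1 if fire_flag else 0)
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def unpack_report(payload):
    """Verify the checksum and recover the original readings."""
    body, checksum = payload[:-1], payload[-1]
    calc = 0
    for b in body:
        calc ^= b
    if calc != checksum:
        raise ValueError("checksum mismatch")
    node_id, t, h, flag = struct.unpack(">BhHB", body)
    return node_id, t / 100.0, h / 100.0, bool(flag)
```

Keeping the payload to a handful of bytes minimizes airtime per transmission, which is the dominant factor in a LoRa radio's energy budget.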


Sensors ◽  
2021 ◽  
Vol 21 (1) ◽  
pp. 297
Author(s):  
Ali Marzoughi ◽  
Andrey V. Savkin

We study the problems of intercepting single and multiple invasive intruders on the boundary of a planar region by employing a team of autonomous unmanned surface vehicles. First, the problem of intercepting a single intruder is studied, and the proposed strategy is then applied to intercepting multiple intruders on the region boundary. Based on the proposed decentralised motion control algorithm and decision-making strategy, each autonomous vehicle intercepts any intruder attempting to leave the region by detecting the most vulnerable point of the boundary. An efficient and simple rule-based control algorithm for navigating the autonomous vehicles on the boundary of the sea region is developed. The proposed algorithm is computationally simple and easily implementable in real-life intruder interception applications. In this paper, we obtain necessary and sufficient conditions for the existence of a real-time solution to the considered intruder interception problem. The effectiveness of the proposed method is confirmed by computer simulations with both single and multiple intruders.
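The notion of detecting the most vulnerable point of the boundary can be sketched for the simplest case of a straight-moving intruder and a circular region (the geometry and the arc-distance assignment rule below are illustrative assumptions; the paper's actual algorithm and its conditions are more general):

```python
import math

def predicted_exit_point(pos, vel, radius):
    """Where a straight-moving intruder will cross a circular boundary.

    Solves |pos + t*vel| = radius for the smallest t >= 0 -- a simplified
    stand-in for the "most vulnerable point" on the boundary.
    """
    px, py = pos
    vx, vy = vel
    a = vx * vx + vy * vy
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py - radius * radius
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return (px + t * vx, py + t * vy)

def assign_interceptor(vehicle_angles, exit_point):
    """Pick the vehicle (by index) with the shortest arc to the exit point.

    Each vehicle patrols the boundary, so its position is given as an angle;
    the decision rule is decentralised in spirit: every vehicle can evaluate
    it locally from the same broadcast exit-point estimate.
    """
    target = math.atan2(exit_point[1], exit_point[0])

    def arc(theta):
        # Absolute angular separation, wrapped into [0, pi]
        return abs((theta - target + math.pi) % (2 * math.pi) - math.pi)

    return min(range(len(vehicle_angles)), key=lambda i: arc(vehicle_angles[i]))
```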


2004 ◽  
Vol 37 (8) ◽  
pp. 986-991
Author(s):  
Iñaki Rañó ◽  
Bogdan Raducanu ◽  
Sriram Subramanian

2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
Bambang A. B. Sarif ◽  
Mahsa Pourazad ◽  
Panos Nasiopoulos ◽  
Victor C. M. Leung

There is increasing interest in using video sensor networks (VSNs) as an alternative to existing video monitoring/surveillance applications. Because of the limited energy resources available in VSNs, power consumption efficiency is one of their most important design challenges. Video encoding contributes a significant portion of the overall power consumption at the VSN nodes. In this regard, the encoding parameter settings used at each node determine the coding complexity and bitrate of the video, which in turn determine the encoding and transmission power consumption of the node and of the VSN overall. Therefore, to calculate the nodes’ power consumption, we need to be able to estimate the coding complexity and bitrate of the video. In this paper, we model the coding complexity and bitrate of the H.264/AVC encoder based on the encoding parameter settings used. We also propose a method to reduce the model’s estimation error for videos whose content changes within a specified period of time. We conducted our experiments using a large video dataset captured from real-life applications. Using the proposed model, we show how to estimate the VSN power consumption for a given topology.
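The estimation chain the abstract describes — encoding parameters determine complexity and bitrate, which determine node power, which sums to network power — can be sketched with a linear toy model (the linear form and the coefficients are invented for illustration; the paper fits its own models to H.264/AVC measurements):

```python
def node_power(complexity_mcycles_s, bitrate_kbps,
               e_per_mcycle_mw=0.8, e_per_kbps_mw=1.5):
    """Estimate one node's power draw (mW) from coding complexity and bitrate.

    Toy stand-in for the paper's models: encoding power scales with coding
    complexity, transmission power with bitrate (coefficients hypothetical).
    """
    return (complexity_mcycles_s * e_per_mcycle_mw
            + bitrate_kbps * e_per_kbps_mw)

def vsn_power(nodes):
    """Total VSN power for a list of (complexity, bitrate) node settings."""
    return sum(node_power(c, r) for c, r in nodes)
```

The point of such a model is that it lets a designer compare encoder parameter settings on paper: a setting that halves bitrate but doubles complexity may still raise the network's total power.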


2020 ◽  
Vol 4 (Supplement_2) ◽  
pp. 1308-1308
Author(s):  
James Hollis ◽  
James Oliver

Abstract Objectives The objective of this study was to (a) determine the feasibility of eating in a virtual reality (VR) environment while wearing a head-mounted display (HMD) and (b) determine the effect of eating in a virtual restaurant on food intake, sensory evaluation of the test food, and masticatory parameters. Methods Fifteen adults were asked to report to the laboratory on two occasions, separated by at least one week, at their usual lunchtime. On reporting to the laboratory, surface electrodes were attached to the left and right masseter muscles to measure masticatory activity, and a wristband was placed on the non-dominant wrist to collect physiological data. The participant sat quietly for 5 minutes before an HMD was placed on their head. The HMD displayed either a virtual restaurant (pizzeria) or a blank scene (consisting of a white background and a table). The participant's hand movements were captured using an infrared camera mounted on the HMD, so that when the participant moved their hands this was represented by computer-generated model hands in the VR scene. The test food (pizza bites) was represented in VR using a 3D model of pizza bites. The test food was arranged so that when the participant touched the test food model in the VR scene they touched the test food in real life, allowing them to locate and pick it up. The participant was instructed to eat the test food until they felt comfortably full. When the participant finished eating, the equipment was removed and they completed questionnaires regarding their feelings of presence and experiences in the VR environment and their ratings of the test food attributes. Results Participants were able to successfully locate and eat the pizza bites while in the VR environment. The participants' feeling of presence was higher in the restaurant scene than in the blank scene (P < 0.05). Heart rate and skin temperature were higher in the restaurant scene (P < 0.05). Differences in masticatory parameters were found, with participants using fewer masticatory cycles before swallowing in the restaurant scene (P < 0.05). There were no differences between scenes in the sensory evaluation of the test food, and no difference in food intake between the treatments. Conclusions Eating in VR is feasible and may provide a new method to understand eating behavior in different contexts. Funding Sources None.

