Design of Indoor Large-Scale Multi-Target Precise Positioning and Tracking System

2014 ◽  
Vol 1049-1050 ◽  
pp. 1233-1236
Author(s):  
Hui Qin Sun

This paper sets out to build an indoor large-scale multi-target precise positioning and tracking system. It conducts in-depth research on vision-based optical tracking, inertial tracking, and multi-sensor data fusion, tackling key technologies in graphics, image processing, data fusion, and tracking. The aim is to develop a highly versatile, real-time, robust, wide-range, high-precision optical tracking system for indoor large-scale multi-target precise positioning and tracking.
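The abstract does not specify the fusion scheme; as a purely illustrative sketch, the snippet below shows one common way to combine the two modalities it names, blending a high-rate inertially propagated position with a lower-rate optical fix in a complementary-filter style. The function name, state layout, and the weight `alpha` are assumptions, not details from the paper.

```python
# Illustrative sketch only (the paper does not give its fusion algorithm):
# a complementary blend of a low-rate optical position fix with a high-rate
# inertially propagated position.
import numpy as np

def fuse_optical_inertial(pos_prev, vel, accel, dt, optical_fix=None, alpha=0.98):
    """Propagate position from inertial data, then correct with an optical fix.

    pos_prev, vel, accel : 3-vectors (previous position, velocity, acceleration).
    dt                   : inertial sample period in seconds.
    optical_fix          : 3-vector from the optical tracker, or None if no
                           optical frame arrived this step.
    alpha                : trust placed in the inertial prediction (assumed value).
    """
    vel = vel + accel * dt
    pos_pred = pos_prev + vel * dt                        # dead-reckoned position
    if optical_fix is None:
        return pos_pred, vel
    pos = alpha * pos_pred + (1.0 - alpha) * optical_fix  # complementary blend
    return pos, vel
```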

2009 ◽  
Author(s):  
Jin Zhang ◽  
Yu Lu ◽  
Wanping Wang ◽  
Yong Zhang ◽  
Qinzhang Wu

Author(s):  
Filipe Gaspar ◽  
Rafael Bastos ◽  
Miguel Sales

In large-scale immersive virtual reality (VR) environments, such as a CAVE, one of the most common problems is tracking the position of the user’s head while he or she is immersed in the environment, in order to reflect perspective changes in the synthetic stereoscopic images. In this paper, the authors describe the theoretical foundations and engineering approach adopted in the development of an infrared-optical tracking system designed for large-scale immersive Virtual Environments (VE) or Augmented Reality (AR) settings. The system is capable of tracking independent retro-reflective markers arranged in a 3D structure in real time, recovering all six degrees of freedom (6DOF). These artefacts can be attached to the user’s stereo glasses to track his or her head while immersed, or used as a 3D input device for rich human-computer interaction (HCI). The hardware configuration consists of four shutter-synchronized cameras fitted with band-pass infrared filters, with the scene illuminated by infrared array emitters. Pilot lab results have shown a latency of 40 ms when simultaneously tracking the pose of two artefacts with four infrared markers, achieving a frame rate of 24.80 fps and showing a mean accuracy of 0.93 mm/0.51° and a mean precision of 0.19 mm/0.04°, respectively, in overall translation/rotation, fulfilling the requirements initially defined.
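The geometric core of such a multi-camera marker tracker is triangulating each retro-reflective blob from at least two calibrated, synchronized views. The sketch below illustrates standard linear (DLT) triangulation of a single marker and is not the authors' implementation; the projection matrices `P1`, `P2` and the pixel detections `uv1`, `uv2` are assumed to come from prior calibration and blob detection.

```python
# Minimal sketch: linear (DLT) triangulation of one marker from two
# calibrated, shutter-synchronized cameras. Inputs are assumed given.
import numpy as np

def triangulate_marker(P1, P2, uv1, uv2):
    """Recover the 3D marker position from two 2D detections.

    P1, P2   : (3, 4) camera projection matrices (intrinsics @ extrinsics).
    uv1, uv2 : (u, v) pixel coordinates of the marker blob in each image.
    Returns the marker position as a 3-vector in world coordinates.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0 in the least-squares sense via SVD.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```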


2014 ◽  
Vol 602-605 ◽  
pp. 2491-2494
Author(s):  
Yu Lu ◽  
Rong Shun Huang ◽  
Zi Li Xu

ADS-B (Automatic Dependent Surveillance - Broadcast) and MLAT (Multilateration) are the main surveillance techniques for ATC (Air Traffic Control) and will play an important role in future tracking systems. Fusing ADS-B and MLAT can yield more accurate tracking. Considering the data characteristics of the two techniques, this paper designs a concrete multi-level fusion framework based on federated Kalman filters to fuse ADS-B and MLAT data in the approach phase. Under this framework, both ADS-B and MLAT data are processed intensively to achieve high accuracy. Experimental results on simulated and real data show that the algorithm achieves high precision.
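The abstract does not reproduce the filter equations; the sketch below only illustrates the general federated architecture it refers to, with one local Kalman filter per sensor (ADS-B and MLAT) and a master step that fuses the local estimates by inverse-covariance weighting. The 1D constant-velocity model and all noise values are assumptions for illustration, not the paper's design.

```python
# Illustrative federated Kalman sketch: one local filter per surveillance
# source, plus information-weighted fusion at a master step.
import numpy as np

class LocalKF:
    def __init__(self, q, r, dt=1.0):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
        self.H = np.array([[1.0, 0.0]])             # position-only measurement
        self.Q = q * np.eye(2)                      # process noise (assumed)
        self.R = np.array([[r]])                    # measurement noise (assumed)
        self.x = np.zeros(2)
        self.P = np.eye(2) * 100.0

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with this sensor's position measurement z
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.array([z]) - self.H @ self.x)
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x, self.P

def master_fuse(estimates):
    """Fuse local estimates (x_i, P_i) by inverse-covariance weighting."""
    info = sum(np.linalg.inv(P) for _, P in estimates)
    P_g = np.linalg.inv(info)
    x_g = P_g @ sum(np.linalg.inv(P) @ x for x, P in estimates)
    return x_g, P_g

# Hypothetical usage: adsb_kf tracks ADS-B reports, mlat_kf tracks MLAT fixes;
# the fused track is recomputed at every common epoch.
adsb_kf = LocalKF(q=0.01, r=25.0)    # noise values are placeholders
mlat_kf = LocalKF(q=0.01, r=100.0)
for z_adsb, z_mlat in [(101.0, 98.5), (203.2, 205.9), (305.1, 301.4)]:
    est_a = adsb_kf.step(z_adsb)
    est_m = mlat_kf.step(z_mlat)
    x_fused, P_fused = master_fuse([est_a, est_m])
```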


2010 ◽  
Vol 36 (9) ◽  
pp. 1239-1249 ◽  
Author(s):  
Bin LUO ◽  
Yong-Tian WANG ◽  
Yue LIU


Author(s):  
Geoffrey Ho ◽  
Erin Kim ◽  
Shahzaib Khattak ◽  
Stephanie Penta ◽  
Tharmarasa Ratnasingham ◽  
...  
