To validate the accuracy and reliability of onboard sensors for object detection and localization in driver assistance and autonomous driving applications under realistic conditions (indoors and outdoors), a novel tracking system is presented. The tracking system determines the position and orientation of a slow-moving vehicle during test maneuvers within a reference environment (e.g., a car during parking maneuvers), independently of the onboard sensors. The requirements are a 6 degree of freedom (DoF) pose with a position uncertainty below 5 mm (3σ) and an orientation uncertainty below 0.3° (3σ), delivered at a rate above 20 Hz with a latency below 500 ms. To compare the results of the reference system with those of the vehicle’s onboard system, the two systems are synchronized via the Precision Time Protocol (PTP) and made interoperable through the Robot Operating System (ROS). The developed system combines motion capture cameras, mounted on the vehicle in a 360° panoramic setup, which measure retroreflective markers with known coordinates distributed over the test site, with robotic total stations that measure a prism on the vehicle. A point cloud of the test site serves as a digital twin of the environment, in which the movement of the vehicle is visualized. The results show that the fused measurements of these sensors complement each other, so that the accuracy requirements for the 6 DoF pose are met while allowing flexible installation in different environments.
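To illustrate how two complementary position measurements of this kind can be combined, the following minimal Python sketch fuses a camera-derived and a total-station-derived position by per-axis inverse-variance weighting. The function name and all numeric values are purely hypothetical and are not taken from the presented system; the sketch only shows the general principle that fusing two independent estimates reduces the combined uncertainty.

    import numpy as np

    def fuse_positions(p_mocap, sigma_mocap, p_ts, sigma_ts):
        # Inverse-variance (weighted least squares) fusion of two 3D position
        # estimates: one from the vehicle-mounted motion capture cameras, one
        # from a robotic total station. Returns the fused position and its
        # per-axis 1-sigma uncertainty. All inputs are length-3 arrays.
        w_mocap = 1.0 / np.square(sigma_mocap)
        w_ts = 1.0 / np.square(sigma_ts)
        p_fused = (w_mocap * p_mocap + w_ts * p_ts) / (w_mocap + w_ts)
        sigma_fused = np.sqrt(1.0 / (w_mocap + w_ts))
        return p_fused, sigma_fused

    # Illustrative numbers only: positions in meters, per-axis 1-sigma in meters.
    p_cam = np.array([12.304, 5.118, 0.352])   # hypothetical camera-derived position
    s_cam = np.array([0.002, 0.002, 0.004])
    p_tps = np.array([12.306, 5.117, 0.351])   # hypothetical total-station position
    s_tps = np.array([0.001, 0.001, 0.002])

    p, s = fuse_positions(p_cam, s_cam, p_tps, s_tps)
    print("fused position [m]:", p)
    print("fused 1-sigma  [m]:", s)

In this toy example the fused per-axis uncertainty is smaller than that of either individual sensor, which is the sense in which complementary measurements can help meet a tight accuracy budget.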