The International Journal of Robotics Research, Ahead of Print.
This paper presents a novel outdoor aerial visual-inertial-LiDAR dataset captured using a multi-sensor payload to promote global navigation satellite system (GNSS)-denied navigation research. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI M600 hexacopter drone and the National Research Council (NRC) Bell 412 Advanced Systems Research Aircraft (ASRA). The dataset consists of hardware-synchronized monocular images, inertial measurement unit (IMU) measurements, 3D light detection and ranging (LiDAR) point clouds, and high-precision real-time kinematic (RTK)-GNSS-based ground truth. Nine data sequences were collected as robot operating system (ROS) bags over 100 minutes of outdoor flight, spanning urban areas, highways, airports, hillsides, prairies, and waterfronts. The dataset was collected to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing zone detection algorithms based on real-world drone and full-scale helicopter data. All data sequences contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. The intrinsic and extrinsic calibrations of the sensors are also provided, along with the raw calibration datasets. A performance summary of state-of-the-art methods applied to the data sequences is also provided.
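Since the sequences ship with hardware timestamps and RTK-GNSS ground truth, a typical first step when consuming such a dataset is associating each camera frame with the temporally nearest ground-truth pose. The following is a minimal sketch of that association, not code from the dataset's tooling; the function name, the 50 ms tolerance, and the list-based pose representation are all illustrative assumptions.

```python
from bisect import bisect_left

def nearest_ground_truth(image_stamps, gt_stamps, gt_poses, max_dt=0.05):
    """Match each image timestamp to the nearest ground-truth pose.

    image_stamps: image timestamps in seconds (any order).
    gt_stamps:    sorted ground-truth timestamps in seconds.
    gt_poses:     poses aligned index-for-index with gt_stamps.
    max_dt:       reject matches farther than this gap (hypothetical 50 ms).
    Returns one pose (or None, if no close match) per image timestamp.
    """
    matches = []
    for t in image_stamps:
        i = bisect_left(gt_stamps, t)
        # Candidate neighbours: the entries just below and at/above t.
        cands = [j for j in (i - 1, i) if 0 <= j < len(gt_stamps)]
        j = min(cands, key=lambda k: abs(gt_stamps[k] - t))
        matches.append(gt_poses[j] if abs(gt_stamps[j] - t) <= max_dt else None)
    return matches

# Toy usage: two frames against a 10 Hz ground-truth track.
poses = nearest_ground_truth([0.09, 0.50], [0.0, 0.1, 0.2], ["p0", "p1", "p2"])
# First frame matches the 0.1 s pose; second has no pose within 50 ms.
```

Because the dataset's ground truth is already spatio-temporally aligned, a helper like this is mainly useful when subsampling frames or mixing in derived topics with their own clocks.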