MUN-FRL: A Visual-Inertial-LiDAR Dataset for Aerial Autonomous Navigation and Mapping

The International Journal of Robotics Research, Ahead of Print.
This paper presents a novel outdoor aerial visual-inertial-LiDAR dataset captured using a multi-sensor payload to promote global navigation satellite system (GNSS)-denied navigation research. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI M600 hexacopter drone and the National Research Council (NRC) Bell 412 Advanced Systems Research Aircraft (ASRA). The dataset consists of hardware-synchronized monocular images, inertial measurement unit (IMU) measurements, 3D light detection and ranging (LiDAR) point clouds, and high-precision real-time kinematic (RTK)-GNSS-based ground truth. Nine data sequences were collected as Robot Operating System (ROS) bags over 100 minutes of outdoor footage spanning urban areas, highways, airports, hillsides, prairies, and waterfronts. The dataset was collected to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing zone detection algorithms based on real-world drone and full-scale helicopter data. All data sequences contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. The intrinsic and extrinsic calibrations of the sensors are also provided, along with the raw calibration datasets. A performance summary of state-of-the-art methods applied to the data sequences is also provided.
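Since the sequences are distributed as ROS bags containing image, IMU, and LiDAR messages, a typical first step is to iterate over a bag and inspect the hardware timestamps. The snippet below is a minimal sketch of that workflow using the ROS 1 `rosbag` Python API; the bag file name and topic names (`/camera/image_raw`, `/imu/data`, `/lidar/points`) are placeholders assumed for illustration, not the dataset's actual naming.

```python
# Minimal sketch: iterate over a MUN-FRL sequence stored as a ROS 1 bag.
# NOTE: file and topic names below are assumptions for illustration only;
# consult the dataset documentation for the actual names.
import rosbag

TOPICS = ["/camera/image_raw", "/imu/data", "/lidar/points"]  # hypothetical

with rosbag.Bag("sequence_01.bag") as bag:  # hypothetical file name
    for topic, msg, t in bag.read_messages(topics=TOPICS):
        # Each message header carries the hardware-synchronized timestamp.
        stamp = msg.header.stamp.to_sec()
        if topic == "/imu/data":
            print(f"IMU   {stamp:.6f}")
        elif topic == "/camera/image_raw":
            print(f"Image {stamp:.6f}  {msg.width}x{msg.height}")
        else:
            print(f"LiDAR {stamp:.6f}")
```

From the per-topic timestamps one can verify synchronization and align the sensor streams with the spatio-temporally aligned RTK-GNSS ground truth before running odometry or mapping pipelines.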
