Panoramic-LDSO (Panoramic Direct LiDAR-assisted Visual Odometry) is designed to fully associate 360-degree field-of-view (FOV) LiDAR points with 360-degree FOV panoramic images. 360-degree FOV panoramic images provide more usable information, which can compensate for inaccurate pose estimation caused by insufficient texture or motion blur in a single view.
Panoramic Direct LiDAR-assisted Visual Odometry
Authors: Zikang Yuan, Tianle Xu, Xiaoxiang Wang, Jinni Geng and Xin Yang
The 16x real-time performance (top) and the final trajectory and sparse map (bottom) on a segment of sequence 2012-01-08 from the NCLT dataset.
GCC >= 7.5.0
CMake >= 3.16.0
Eigen3 >= 3.3.4
OpenCV >= 3.3
PCL == 1.8 for Ubuntu 18.04, and == 1.10 for Ubuntu 20.04
Pangolin == 0.5 or 0.6 for Ubuntu 20.04
OS | GCC | CMake | Eigen3 | OpenCV | PCL | Pangolin |
---|---|---|---|---|---|---|
Ubuntu 20.04 | 9.4.0 | 3.16.3 | 3.3.7 | 4.2.0 | 1.10.0 | 0.5 |
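If you are unsure which versions are installed, a rough check along the following lines can help (a sketch only; the pkg-config module names and the libpcl-dev package name are assumptions for a stock Ubuntu install and may differ on your system):
gcc --version | head -n 1
cmake --version | head -n 1
pkg-config --modversion eigen3
pkg-config --modversion opencv4 2>/dev/null || pkg-config --modversion opencv
dpkg -s libpcl-dev 2>/dev/null | grep Version
# Pangolin is usually built from source; check the tag of the checkout you built instead.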
git clone https://github.com/ZikangYuan/panoramic_lidar_dso.git
cd panoramic_lidar_dso
mkdir build
cd build
cmake ..
make
Notes:
- Before running, please download the calib folder from Google Drive and put it under <PATH_OF_PROJECT_FOLDER> (a quick check of the expected files follows these notes).
- Currently the package only provides interfaces for the NCLT and IJRR datasets. To run on other datasets, you will need to modify the code yourself.
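As a sanity check that the calibration data and the sensor parameters referenced by the run commands below are in place, something like the following can be used (a sketch; it only lists the folders and does not validate their contents):
ls <PATH_OF_PROJECT_FOLDER>/calib/nclt <PATH_OF_PROJECT_FOLDER>/calib/ijrr
ls <PATH_OF_PROJECT_FOLDER>/sensor/nclt <PATH_OF_PROJECT_FOLDER>/sensor/ijrr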
1. Run on NCLT
Before running, please ensure the dataset is laid out as follows (a quick layout check is sketched after the tree):
<PATH_OF_NCLT_FOLDER>
    |____________2012-01-08
                    |____________lb3
                    |____________velodyne_sync
    |____________2012-09-28
                    |____________lb3
                    |____________velodyne_sync
    |____________2012-11-04
                    |____________lb3
                    |____________velodyne_sync
    |____________2012-12-01
                    |____________lb3
                    |____________velodyne_sync
    |____________2013-02-23
                    |____________lb3
                    |____________velodyne_sync
    |____________2013-04-05
                    |____________lb3
                    |____________velodyne_sync
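A quick way to confirm this layout from a terminal (a sketch; it assumes all six sequences listed above are present and only prints the first entry of each folder):
for seq in 2012-01-08 2012-09-28 2012-11-04 2012-12-01 2013-02-23 2013-04-05; do
    ls <PATH_OF_NCLT_FOLDER>/$seq/lb3 | head -n 1
    ls <PATH_OF_NCLT_FOLDER>/$seq/velodyne_sync | head -n 1
done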
Then open a terminal in <PATH_OF_PROJECT_FOLDER>/build and run:
./dso_dataset dataset=<PATH_OF_NCLT_FOLDER> sequence=<SEQUENCE_NAME> seg=<SEGMENT_NUMBER> calib=<PATH_OF_PROJECT_FOLDER>/calib/nclt/calib undistort=<PATH_OF_PROJECT_FOLDER>/calib/nclt/U2D_Cam pathSensorPrameter=<PATH_OF_PROJECT_FOLDER>/sensor/nclt/x_lb3_c resultPath=<PATH_OF_PROJECT_FOLDER>/output/pose.txt mode=1 quiet=0 IJRR=0
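For example, for the sequence shown in the figure above, a concrete call could look like the following (a sketch: seg=1 is an assumed placeholder value, and the output folder is created first as a precaution in case it does not exist):
mkdir -p <PATH_OF_PROJECT_FOLDER>/output
./dso_dataset dataset=<PATH_OF_NCLT_FOLDER> sequence=2012-01-08 seg=1 calib=<PATH_OF_PROJECT_FOLDER>/calib/nclt/calib undistort=<PATH_OF_PROJECT_FOLDER>/calib/nclt/U2D_Cam pathSensorPrameter=<PATH_OF_PROJECT_FOLDER>/sensor/nclt/x_lb3_c resultPath=<PATH_OF_PROJECT_FOLDER>/output/pose.txt mode=1 quiet=0 IJRR=0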
2. Run on IJRR
Before running, please ensure the dataset is laid out as follows:
<PATH_OF_IJRR_FOLDER>
    |____________ford_1
                    |____________Timestamp.log
                    |____________IMAGES
                    |____________pcd
    |____________ford_2
                    |____________Timestamp.log
                    |____________IMAGES
                    |____________pcd
Then open a terminal in <PATH_OF_PROJECT_FOLDER>/build and run:
./dso_dataset dataset=<PATH_OF_IJRR_FOLDER> sequence=<SEQUENCE_NAME> seg=<SEGMENT_NUMBER> calib=<PATH_OF_PROJECT_FOLDER>/calib/ijrr/calib undistort=<PATH_OF_PROJECT_FOLDER>/calib/ijrr/U2D_Cam pathSensorPrameter=<PATH_OF_PROJECT_FOLDER>/sensor/ijrr/x_lb3_c resultPath=<PATH_OF_PROJECT_FOLDER>/output/pose.txt mode=1 quiet=0 IJRR=1
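After either run completes, the estimated poses are written to the file given by resultPath (output/pose.txt in the commands above). A quick way to confirm that a trajectory was produced (a sketch; it makes no assumption about the file format beyond it being plain text):
wc -l <PATH_OF_PROJECT_FOLDER>/output/pose.txt
head -n 3 <PATH_OF_PROJECT_FOLDER>/output/pose.txt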
If you use our work in your research project, please consider citing:
@article{yuan2024panoramic,
title={Panoramic Direct LiDAR-assisted Visual Odometry},
author={Yuan, Zikang and Xu, Tianle and Wang, Xiaoxiang and Geng, Jinni and Yang, Xin},
journal={arXiv preprint arXiv:2409.09287},
year={2024}
}
Thanks to DSO (Direct Sparse Odometry).