1. Set up the ROS environment
mkdir -p orbslam_ws/src
cd orbslam_ws/src
catkin_init_workspace
cd ..
catkin_make
In your home folder, press Ctrl+H to show hidden files and open .bashrc. Append at the end of .bashrc:
source /opt/ros/kinetic/setup.bash
source /home/hong/ROS/orbslam_ws/devel/setup.bash
Or append it from the command line:
echo "source /home/hong/ROS/orbslam_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
gedit ~/.bashrc # check that the line was added
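If these setup steps get re-run, the echo line above appends a duplicate entry each time. A small sketch of an idempotent variant (the BASHRC variable is only a hook so the snippet can be tested against a scratch file):

```shell
# Append the overlay's setup.bash to ~/.bashrc only if it is not already there
BASHRC="${BASHRC:-$HOME/.bashrc}"
LINE='source /home/hong/ROS/orbslam_ws/devel/setup.bash'
grep -qxF "$LINE" "$BASHRC" 2>/dev/null || echo "$LINE" >> "$BASHRC"
```

Running it a second time leaves .bashrc unchanged.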
2. Install the ROS usb_cam package
cd src
git clone https://github.com/bosch-ros-pkg/usb_cam
cd ..
catkin_make
source devel/setup.bash
To test, open a new terminal and run: roscore
Then in another terminal run: roslaunch usb_cam usb_cam-test.launch and a camera window should appear.
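If no window appears, first confirm the camera topic actually exists. In a real session you would run `rostopic list` and filter for image topics; the pipeline looks like this (the printf stands in for rostopic, which needs a running ROS master):

```shell
# On a machine with roscore and the usb_cam node running,
# replace the printf with: rostopic list
printf '%s\n' /rosout /usb_cam/camera_info /usb_cam/image_raw | grep image
# prints: /usb_cam/image_raw
```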
3. Build ORB-SLAM2
# Copy ORB_SLAM2-master into orbslam_ws/src first
cd orbslam_ws/src/ORB_SLAM2
source /home/hong/ROS/orbslam_ws/devel/setup.bash
chmod +x build.sh
./build.sh
chmod +x build_ros.sh
./build_ros.sh
3.1 Note: if the build fails with the Boost linker error "undefined reference to symbol '_ZN5boost6system15system_categoryEv'", see https://github.com/raulmur/ORB_SLAM2/issues/494
Add -lboost_system to /home/hong/ROS/orbslam_ws/src/ORB_SLAM2-master/Examples/ROS/ORB_SLAM2/CMakeLists.txt:
set(LIBS
${OpenCV_LIBS}
${EIGEN3_LIBS}
${Pangolin_LIBRARIES}
${PROJECT_SOURCE_DIR}/../../../Thirdparty/DBoW2/lib/libDBoW2.so
${PROJECT_SOURCE_DIR}/../../../Thirdparty/g2o/lib/libg2o.so
${PROJECT_SOURCE_DIR}/../../../lib/libORB_SLAM2.so
# add this line
-lboost_system
)
3.2 Modify ros_mono.cc (or ros_mono_ar.cc)
Go into ~/orbslam_ws/src/ORB_SLAM2/Examples/ROS/ORB_SLAM2/src
Open ros_mono.cc and change the subscribed topic to /usb_cam/image_raw
(check the actual topic name on your own machine, e.g. in the camera window opened in the previous step, or with rostopic list)
Then go back to ~/orbslam_ws/src/ORB_SLAM2 and rebuild with ./build_ros.sh
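The topic change can also be scripted. A sketch with sed, assuming your copy of ros_mono.cc still contains the stock /camera/image_raw topic (the SRC variable is only a hook so the sketch can be tested; run it inside Examples/ROS/ORB_SLAM2/src):

```shell
SRC="${SRC:-ros_mono.cc}"
if [ -f "$SRC" ]; then
    # Swap the subscribed topic in place, keeping a .bak backup
    sed -i.bak 's#/camera/image_raw#/usb_cam/image_raw#' "$SRC"
    grep -n '/usb_cam/image_raw' "$SRC"   # confirm the change
fi
```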
3.3 RGBD
Modify ros_rgbd.cc
Change the subscribed rgb_topic and depth_topic to
"/camera/color/image_raw";
"/camera/aligned_depth_to_color/image_raw";
# Rebuild:
source /home/hong/ROS/orbslam_map_ws/devel/setup.bash
chmod +x build.sh
./build.sh
chmod +x build_ros.sh
./build_ros.sh
4. Launch Mono (or MonoAR)
(1) Terminal 1: roscore
(2) Terminal 2: roslaunch usb_cam usb_cam-test.launch
(3) Terminal 3: rosrun ORB_SLAM2 Mono /home/hong/ROS/orbslam_ws/src/ORB_SLAM2-master/Vocabulary/ORBvoc.txt /home/hong/ROS/orbslam_ws/src/ORB_SLAM2-master/Examples/ROS/ORB_SLAM2/mono.yaml
5. Launch the RGB-D camera (RealSense D415)
5.1 Obtain the camera intrinsic matrix
The intrinsics are published on a topic by the RealSense ROS node; we can read them by running rostopic echo /camera/color/camera_info.
Open three terminals and run:
roscore
roslaunch realsense2_camera rs_rgbd.launch
rostopic echo /camera/color/camera_info
The output looks like this:
---
header:
  seq: 2053
  stamp:
    secs: 1561365943
    nsecs: 92858932
  frame_id: "camera_color_optical_frame"
height: 480
width: 640
distortion_model: "plumb_bob"
D: [0.0, 0.0, 0.0, 0.0, 0.0]
K: [616.58837890625, 0.0, 310.9554138183594, 0.0, 616.196044921875, 234.5266876220703, 0.0, 0.0, 1.0]
R: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
P: [616.58837890625, 0.0, 310.9554138183594, 0.0, 0.0, 616.196044921875, 234.5266876220703, 0.0, 0.0, 0.0, 1.0, 0.0]
binning_x: 0
binning_y: 0
roi:
  x_offset: 0
  y_offset: 0
  height: 0
  width: 0
  do_rectify: False
---
K is the intrinsic matrix, flattened row-major into a vector:
K = [fx  0 cx
      0 fy cy
      0  0  1]
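Because the K array is row-major, the four intrinsics sit at fixed positions: fx is element 1, cx element 3, fy element 5, cy element 6 (1-indexed). A quick sketch that pulls them out of the echoed list with awk:

```shell
# The flattened K row from the camera_info output above
K="616.58837890625, 0.0, 310.9554138183594, 0.0, 616.196044921875, 234.5266876220703, 0.0, 0.0, 1.0"
echo "$K" | awk -F', ' '{printf "fx=%.3f fy=%.3f cx=%.3f cy=%.3f\n", $1, $5, $3, $6}'
# prints: fx=616.588 fy=616.196 cx=310.955 cy=234.527
```

These are exactly the Camera.fx/fy/cx/cy values used in the yaml below.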
Next we need the baseline. According to the official datasheet, the D415's baseline is 55 mm, and bf = baseline (in meters) * fx.
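A quick check of the arithmetic. Note that 55 mm with the fx above gives bf ≈ 33.91, while the yaml below uses 30.72, which corresponds to a baseline of roughly 50 mm; verify the baseline figure for your exact unit before copying the value.

```shell
fx=616.588
baseline=0.055   # meters; 55 mm per the datasheet
awk -v b="$baseline" -v f="$fx" 'BEGIN{printf "Camera.bf: %.2f\n", b*f}'
# prints: Camera.bf: 33.91
```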
The d415.yaml file derived from these camera parameters:
%YAML:1.0
#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------
# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 616.588
Camera.fy: 616.196
Camera.cx: 310.955
Camera.cy: 234.527
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.width: 640
Camera.height: 480
# Camera frames per second
Camera.fps: 30.0
# IR projector baseline times fx (approx.)
Camera.bf: 30.72
# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1
# Close/Far threshold. Baseline times.
ThDepth: 50.0
# Depthmap values factor
DepthMapFactor: 1000.0
#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------
# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000
# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2
# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8
# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500
#--------------------------------------------------------------------------------------------
# PointCloud Mapping
#--------------------------------------------------------------------------------------------
PointCloudMapping.Resolution: 0.03
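ORB-SLAM2 reads this file with OpenCV's FileStorage, which requires the %YAML:1.0 directive on the very first line; a missing directive is a common cause of the settings file failing to load. A small sanity check (the CFG variable is only a hook so the snippet can be tested):

```shell
CFG="${CFG:-d415.yaml}"
if [ -f "$CFG" ]; then
    # FileStorage needs the directive as the literal first line
    head -n1 "$CFG" | grep -qx '%YAML:1.0' && echo "header ok" || echo "missing %YAML:1.0 directive"
fi
```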
5.2 Run SLAM
roscore
roslaunch realsense2_camera rs_rgbd.launch
rosrun ORB_SLAM2 RGBD /home/hong/ROS/orbslam_ws/src/ORB_SLAM2-master/Vocabulary/ORBvoc.txt /home/hong/ROS/orbslam_ws/src/ORB_SLAM2-master/Examples/ROS/ORB_SLAM2/d415.yaml
6. Calibration: to be added.
References:
https://blog.csdn.net/subiluo/article/details/88975979
https://blog.csdn.net/qq_36898914/article/details/88780649
https://blog.csdn.net/Carminljm/article/details/86353775
https://blog.csdn.net/zhuoyueljl/article/details/78524602
https://www.jianshu.com/p/dbf39b9e4617
http://www.liuxiao.org/2016/07/ubuntu-orb-slam2-在-ros-上編譯調試/
https://www.cnblogs.com/yepeichu/p/10896201.html
https://blog.csdn.net/Darlingqiang/article/details/78989544