DRE-SLAM

Dynamic RGB-D Encoder SLAM for a Differential-Drive Robot

Authors: Dongsheng Yang, Shusheng Bi, Wei Wang, Chang Yuan, Wei Wang, Xianyu Qi, and Yueri Cai


DRE-SLAM is developed for a differential-drive robot that runs in dynamic indoor scenarios. It takes measurements from an RGB-D camera and two wheel encoders as inputs. The outputs are the 2D pose of the robot and a static background OctoMap.

Video: YouTube or Dropbox or Pan.Baidu


Paper: DRE-SLAM: Dynamic RGB-D Encoder SLAM for a Differential-Drive Robot, Dongsheng Yang, Shusheng Bi, Wei Wang, Chang Yuan, Wei Wang, Xianyu Qi, and Yueri Cai. (Remote Sensing, 2019) PDF, WEB

Prerequisites

1. Ubuntu 16.04

2. ROS Kinetic

Follow the instructions in: http://wiki.ros.org/kinetic/Installation/Ubuntu

3. ROS packages

sudo apt-get install ros-kinetic-cv-bridge ros-kinetic-tf ros-kinetic-message-filters ros-kinetic-image-transport ros-kinetic-octomap ros-kinetic-octomap-msgs ros-kinetic-octomap-ros ros-kinetic-octomap-rviz-plugins ros-kinetic-octomap-server ros-kinetic-pcl-ros ros-kinetic-pcl-msgs ros-kinetic-pcl-conversions ros-kinetic-geometry-msgs

4. OpenCV 4.0

We use the YOLOv3 implementation in OpenCV 4.0.

Follow the instructions in: https://opencv.org/opencv-4-0-0.html
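On Ubuntu 16.04 this usually means building OpenCV 4.0 from source. A minimal sketch, assuming the default CMake options (which include the dnn module used for YOLOv3) and an illustrative download path:

# Minimal source-build sketch; paths are illustrative, adjust to your setup.
cd ~
wget -O opencv-4.0.0.zip https://github.com/opencv/opencv/archive/4.0.0.zip
unzip opencv-4.0.0.zip
cd opencv-4.0.0
mkdir build && cd build
cmake -D CMAKE_BUILD_TYPE=Release -D CMAKE_INSTALL_PREFIX=/usr/local ..
make -j4
sudo make install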

5. Ceres

Follow the instructions in: http://www.ceres-solver.org/installation.html
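A minimal sketch of a source build on Ubuntu 16.04, assuming the dependency packages listed in the Ceres installation guide:

# Dependencies typically required by Ceres (see the Ceres installation guide).
sudo apt-get install cmake libgoogle-glog-dev libgflags-dev libatlas-base-dev libeigen3-dev libsuitesparse-dev
# Build and install Ceres from source; pin a release tag if you need reproducibility.
git clone https://github.com/ceres-solver/ceres-solver.git
cd ceres-solver
mkdir build && cd build
cmake ..
make -j4
sudo make install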

Build DRE-SLAM

1. Clone the repository

cd ~/catkin_ws/src
git clone https://github.com/ydsf16/dre_slam.git

2. Build DBoW2

cd dre_slam/third_party/DBoW2
mkdir build
cd build
cmake ..
make -j4

3. Build Sophus

cd ../../Sophus
mkdir build
cd build
cmake ..
make -j4

4. Build object detector

cd ../../../object_detector
mkdir build
cd build
cmake ..
make -j4

5. Download the YOLOv3 model

cd ../../config
mkdir yolov3
cd yolov3
wget https://pjreddie.com/media/files/yolov3.weights
wget https://github.com/pjreddie/darknet/blob/master/cfg/yolov3.cfg?raw=true -O ./yolov3.cfg
wget https://github.com/pjreddie/darknet/blob/master/data/coco.names?raw=true -O ./coco.names
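Optionally, verify that the three files were downloaded (assuming the workspace path used in step 1):

# All three files should be present and non-empty.
ls -lh ~/catkin_ws/src/dre_slam/config/yolov3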

6. Catkin_make

cd ~/catkin_ws
catkin_make
source ~/catkin_ws/devel/setup.bash
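Optionally, confirm that ROS can locate the package after sourcing the workspace:

# Should print the package path inside ~/catkin_ws/src.
rospack find dre_slam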

Example

Dataset

We collected several data sequences in our lab using our Redbot robot. The dataset is available at Pan.Baidu or Dropbox.

Run

1. Open a terminal and launch dre_slam

roslaunch dre_slam comparative_test.launch

2. Open a terminal and play one rosbag

rosbag play <bag_name>.bag

Run on your own robot

You need to do three things:

  1. Calibrate the camera intrinsic parameters, the robot odometry parameters, and the rigid transformation from the camera to the robot.

  2. Prepare a parameter configuration file; refer to the config folder.

  3. Prepare a launch file; refer to the launch folder (a template sketch is shown below).
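A minimal sketch of reusing the shipped files as templates; the my_robot names below are placeholders, not files in the repository:

# Assuming the workspace layout from the build steps above.
cd ~/catkin_ws/src/dre_slam
cp launch/comparative_test.launch launch/my_robot.launch
# Copy one of the files in config/ as a starting point for your own parameter file,
# fill in your calibration results, and point my_robot.launch at it. Then:
roslaunch dre_slam my_robot.launch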

Contact us

For any issues, please feel free to contact Dongsheng Yang: [email protected]