
Applications

NOTE:

  • These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
  • These instructions are meant to be run on the remote PC. Run the commands below on your remote PC, except for the parts marked [TurtleBot], which run on the SBC of the TurtleBot3.
  • Make sure to run the Bringup instructions before running the instructions below.

TIP: The terminal application can be found with the Ubuntu search icon on the top left corner of the screen. The shortcut key for running the terminal is Ctrl-Alt-T.

This chapter shows some demos using TurtleBot3. In order to implement these demos, you have to install the turtlebot3_applications and turtlebot3_applications_msgs packages.

[Remote PC] Go to the catkin workspace source directory (/home/(user_name)/catkin_ws/src) and clone the turtlebot3_applications and turtlebot3_applications_msgs repositories. Then run catkin_make to build the new packages.

$ sudo apt-get install ros-kinetic-ar-track-alvar
$ sudo apt-get install ros-kinetic-ar-track-alvar-msgs
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_applications.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_applications_msgs.git
$ cd ~/catkin_ws && catkin_make
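
If ROS cannot find the newly built packages, source the workspace overlay in the same terminal (a standard catkin step; adjust the path if your workspace is located elsewhere):

$ source ~/catkin_ws/devel/setup.bash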

TurtleBot Follower Demo

NOTE:

  • The follower demo was implemented using only a 360° Laser Distance Sensor LDS-01. A classification algorithm, previously fitted with samples of person and obstacle positions, decides which action to take. The robot follows a person in front of it within a range of 50 centimeters and 140 degrees.
  • Running the follower demo in an area with obstacles may not work well. Therefore, it is recommended to run the demo in an open area without obstacles.

[TurtleBot] In order to run this demo, a parameter in the LIDAR launch file has to be modified. In the example below, Pluma is used to edit the launch file. In the param tag named frame_id, replace base_scan with odom and save the file.

$ pluma ~/catkin_ws/src/turtlebot3/turtlebot3_bringup/launch/turtlebot3_lidar.launch
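
After the edit, the frame_id param should look roughly like the line below (the exact layout of the launch file may vary between package versions):

<param name="frame_id" value="odom"/>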

NOTE: The TurtleBot follower demo requires the scikit-learn, NumPy, and SciPy packages.

[Remote PC] Install the scikit-learn, NumPy, and SciPy packages with the commands below.

$ sudo apt-get install python-pip
$ sudo pip install -U scikit-learn numpy scipy
$ sudo pip install --upgrade pip
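
As an optional quick check, you can confirm that the Python packages import correctly:

$ python -c "import sklearn, numpy, scipy"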

[Remote PC] When the installation is complete, run roscore on the remote PC with the command below.

$ roscore

[TurtleBot] Launch the bringup

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch
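
Optionally, confirm that the LDS is publishing scan data before starting the follower nodes (the /scan topic is published by the bringup above):

$ rostopic hz /scan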

[Remote PC] Launch turtlebot3_follow_filter with the command below.

$ roslaunch turtlebot3_follow_filter turtlebot3_follow_filter.launch

[Remote PC] Launch turtlebot3_follower with the command below.

$ roslaunch turtlebot3_follower turtlebot3_follower.launch

TurtleBot Panorama Demo

NOTE:

  • The turtlebot3_panorama demo uses pano_ros for taking snapshots and stitching them together to create a panoramic image.
  • The panorama demo requires the raspicam_node package. Instructions for installing this package can be found at the GitHub Link.
  • The panorama demo requires the OpenCV and cv_bridge packages. Instructions for installing OpenCV can be found at the OpenCV Tutorial Link.

[TurtleBot] Launch the turtlebot3_rpicamera launch file.

$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch

[Remote PC] Launch panorama with the command below.

$ roslaunch turtlebot3_panorama panorama.launch

[Remote PC] To start the panorama demo, enter the command below.

$ rosservice call turtlebot3_panorama/take_pano 0 360.0 30.0 0.3

The parameters sent to the rosservice, in the order used in the command above, are:

  • Mode: 0 = snap & rotate (rotate by a given interval, stop, and take a snapshot), 1 = continuous (take snapshots while rotating), 2 = stop taking snapshots and stitch the panoramic image.
  • Pano angle: total angle of the panoramic image, in degrees.
  • Snap interval: angle between snapshots (degrees) in snap & rotate mode, or time between snapshots (seconds) in continuous mode.
  • Rotation velocity: angular velocity of the robot, in rad/s.
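
For example, assuming mode 2 stops the capture and stitches the snapshots collected so far (per the TakePanorama service in turtlebot3_applications_msgs), a run could be ended early with:

$ rosservice call turtlebot3_panorama/take_pano 2 360.0 30.0 0.3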

[Remote PC] To view the resulting image, enter the command below.

$ rqt_image_view image:=/turtlebot3_panorama/panorama

Automatic Parking

NOTE:

  • The turtlebot3_automatic_parking demo uses a 360° Laser Distance Sensor LDS-01 and reflective tape. The LaserScan topic contains intensity and distance data from the LDS, which the TurtleBot3 uses to locate the reflective tape.
  • The turtlebot3_automatic_parking demo requires the NumPy package.

[Remote PC] Install the NumPy package with the commands below. If NumPy is already installed, you can skip this step.

$ sudo apt-get install python-pip
$ sudo pip install -U numpy
$ sudo pip install --upgrade pip

[Remote PC] Run roscore.

$ roscore

[TurtleBot] Bring up basic packages to start TurtleBot3 applications.

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[Remote PC] Specify the model of your TurtleBot3 with the command below.

TIP: Before executing this command, you have to specify the model name of your TurtleBot3. Replace ${TB3_MODEL} with the model you are using: burger, waffle, or waffle_pi. To make the export setting permanent, refer to the Export TURTLEBOT3_MODEL page.

$ export TURTLEBOT3_MODEL=${TB3_MODEL}
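
For example, for TurtleBot3 Burger:

$ export TURTLEBOT3_MODEL=burger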

[Remote PC] Run RViz.

$ roslaunch turtlebot3_bringup turtlebot3_remote.launch
$ rosrun rviz rviz -d `rospack find turtlebot3_automatic_parking`/rviz/turtlebot3_automatic_parking.rviz

[Remote PC] Launch the automatic parking file.

$ roslaunch turtlebot3_automatic_parking turtlebot3_automatic_parking.launch  

Automatic Parking Vision

NOTE:

  • The turtlebot3_automatic_parking_vision demo uses the Raspberry Pi camera, so the default platform for this demo is the TurtleBot3 Waffle Pi. Since the robot parks by detecting an AR marker on a wall, a printed AR marker should be prepared. The whole process relies on the camera image, so if the demo does not work well, adjust camera parameters such as brightness and contrast.
  • The turtlebot3_automatic_parking_vision demo uses a rectified image produced by the image_proc node. To get a rectified image, calibration data for the Raspberry Pi camera is needed. (The downloaded turtlebot3 packages already include default calibration data for the Raspberry Pi Camera v2.)
  • The turtlebot3_automatic_parking_vision package requires the ar_track_alvar package; see the install commands below if you have not installed it yet.
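
If you skipped the package installation at the top of this page, the ar_track_alvar packages can be installed as follows:

$ sudo apt-get install ros-kinetic-ar-track-alvar ros-kinetic-ar-track-alvar-msgs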

[Remote PC] Run roscore.

$ roscore

[TurtleBot] Bring up basic packages to start TurtleBot3 applications.

$ roslaunch turtlebot3_bringup turtlebot3_robot.launch

[TurtleBot] Start the Raspberry Pi camera node.

$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch

[Remote PC] The Raspberry Pi camera node publishes a compressed image for fast communication. However, image rectification in the image_proc node needs a raw image, so the compressed image must be republished as a raw image.

$ rosrun image_transport republish compressed in:=raspicam_node/image raw out:=raspicam_node/image
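
You can check that both the compressed and the republished raw image topics exist (the topic names assume the default raspicam_node namespace used above):

$ rostopic list | grep raspicam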

[Remote PC] Then carry out the image rectification.

$ ROS_NAMESPACE=raspicam_node rosrun image_proc image_proc image_raw:=image _approximate_sync:=true _queue_size:=20
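
Optionally, open rqt_image_view and select the rectified image topic published by image_proc in the raspicam_node namespace to confirm that rectification works:

$ rqt_image_view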

[Remote PC] Now start the AR marker detection. Before running the related launch file, export the TurtleBot3 model used by this example. After the launch file runs, RViz will start automatically with a preset configuration.

$ export TURTLEBOT3_MODEL=waffle_pi
$ roslaunch turtlebot3_automatic_parking_vision turtlebot3_automatic_parking_vision.launch

Load Multiple TurtleBot3s

NOTE: This application requires firmware version 1.2.1 or higher.

[Remote PC] Run roscore.

$ roscore

Bring up multiple TurtleBot3s with different namespaces. We recommend namespaces that share a common word, such as tb3_0 and tb3_1 or my_robot_0 and my_robot_1.

[TurtleBot(tb3_0)] Bring up the basic packages with ROS_NAMESPACE for the nodes, multi_robot_name for the TF prefix, and set_lidar_frame_id for the LiDAR frame ID. These parameters must all use the same name (tb3_0 here).

$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:="tb3_0" set_lidar_frame_id:="tb3_0/base_scan"

[TurtleBot(tb3_1)] Bring up the basic packages with ROS_NAMESPACE for the nodes, multi_robot_name for the TF prefix, and set_lidar_frame_id for the LiDAR frame ID. These parameters must all use the same name, which must differ from the names used by the other robots (tb3_1 here).

$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:="tb3_1" set_lidar_frame_id:="tb3_1/base_scan"

The terminal in which you launched tb3_0 will then show the messages below. Note that the TF messages carry the tb3_0 prefix.

SUMMARY
========

PARAMETERS
 * /rosdistro: kinetic
 * /rosversion: 1.12.13
 * /tb3_0/turtlebot3_core/baud: 115200
 * /tb3_0/turtlebot3_core/port: /dev/ttyACM0
 * /tb3_0/turtlebot3_core/tf_prefix: tb3_0
 * /tb3_0/turtlebot3_lds/frame_id: tb3_0/base_scan
 * /tb3_0/turtlebot3_lds/port: /dev/ttyUSB0

NODES
  /tb3_0/
    turtlebot3_core (rosserial_python/serial_node.py)
    turtlebot3_diagnostics (turtlebot3_bringup/turtlebot3_diagnostics)
    turtlebot3_lds (hls_lfcd_lds_driver/hlds_laser_publisher)

ROS_MASTER_URI=http://192.168.1.2:11311

process[tb3_0/turtlebot3_core-1]: started with pid [1903]
process[tb3_0/turtlebot3_lds-2]: started with pid [1904]
process[tb3_0/turtlebot3_diagnostics-3]: started with pid [1905]
[INFO] [1531356275.722408]: ROS Serial Python Node
[INFO] [1531356275.796070]: Connecting to /dev/ttyACM0 at 115200 baud
[INFO] [1531356278.300310]: Note: publish buffer size is 1024 bytes
[INFO] [1531356278.303516]: Setup publisher on sensor_state [turtlebot3_msgs/SensorState]
[INFO] [1531356278.323360]: Setup publisher on version_info [turtlebot3_msgs/VersionInfo]
[INFO] [1531356278.392212]: Setup publisher on imu [sensor_msgs/Imu]
[INFO] [1531356278.414980]: Setup publisher on cmd_vel_rc100 [geometry_msgs/Twist]
[INFO] [1531356278.449703]: Setup publisher on odom [nav_msgs/Odometry]
[INFO] [1531356278.466352]: Setup publisher on joint_states [sensor_msgs/JointState]
[INFO] [1531356278.485605]: Setup publisher on battery_state [sensor_msgs/BatteryState]
[INFO] [1531356278.500973]: Setup publisher on magnetic_field [sensor_msgs/MagneticField]
[INFO] [1531356280.545840]: Setup publisher on /tf [tf/tfMessage]
[INFO] [1531356280.582609]: Note: subscribe buffer size is 1024 bytes
[INFO] [1531356280.584645]: Setup subscriber on cmd_vel [geometry_msgs/Twist]
[INFO] [1531356280.620330]: Setup subscriber on sound [turtlebot3_msgs/Sound]
[INFO] [1531356280.649508]: Setup subscriber on motor_power [std_msgs/Bool]
[INFO] [1531356280.688276]: Setup subscriber on reset [std_msgs/Empty]
[INFO] [1531356282.022709]: Setup TF on Odometry [tb3_0/odom]
[INFO] [1531356282.026863]: Setup TF on IMU [tb3_0/imu_link]
[INFO] [1531356282.030138]: Setup TF on MagneticField [tb3_0/mag_link]
[INFO] [1531356282.033628]: Setup TF on JointState [tb3_0/base_link]
[INFO] [1531356282.041117]: --------------------------
[INFO] [1531356282.044421]: Connected to OpenCR board!
[INFO] [1531356282.047700]: This core(v1.2.1) is compatible with TB3 Burger
[INFO] [1531356282.051355]: --------------------------
[INFO] [1531356282.054785]: Start Calibration of Gyro
[INFO] [1531356284.585490]: Calibration End

[Remote PC] Launch the robot state publisher with the matching namespace for each robot.

$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_bringup turtlebot3_remote.launch multi_robot_name:=tb3_0
$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_bringup turtlebot3_remote.launch multi_robot_name:=tb3_1

Before starting another application, check the topics and the TF tree by opening rqt.

$ rqt
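
A quick way to confirm that each robot's topics are correctly namespaced is to filter the topic list (assuming the tb3_0 and tb3_1 namespaces used above):

$ rostopic list | grep tb3_0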

With this setup, each TurtleBot3 builds a map using SLAM, and the maps are merged simultaneously by the multi_map_merge package. For more information, see the Virtual SLAM by Multiple TurtleBot3s section.

ROS2

NOTE: This application requires ROS2 firmware version 1.0.0 or higher and must be used with ROS2 only, not ROS.

Installation

[TurtleBot] Burn the specific Raspbian image to your microSD card (>8 GB).

After unzipping the downloaded image, burn it to your microSD card (>8 GB) using gnome-disks. Once the image is written, insert the card into your Raspberry Pi 3 and boot Raspbian. The default username for this image is pi and the password is turtlebot.

NOTE: If you do not want to use the Raspbian image above, please refer to “How to set sbc for turtlebot3 with ros2”.

[TurtleBot] Upload firmware for ROS2.

$ cd ~/turtlebot3
$ rm -rf ./opencr_update.tar.xz
$ wget https://github.com/ROBOTIS-GIT/OpenCR_Binaries/raw/master/turtlebot3/ROS2/latest/opencr_update.tar.xz
$ tar -xf ./opencr_update.tar.xz

$ export OPENCR_PORT=/dev/ttyACM0
$ export OPENCR_MODEL=burger
$ cd ./opencr_update
$ ./update.sh $OPENCR_PORT $OPENCR_MODEL.opencr

If the update succeeds, you will see output like the following in your terminal:

armv7l
arm
OpenCR Update Start..
opencr_ld_shell ver 1.0.0
opencr_ld_main 
[  ] file name   	: burger.opencr 
[  ] file size   	: 168 KB
[  ] fw_name     	: burger 
[  ] fw_ver      	: V180903R1 
[OK] Open port   	: /dev/ttyACM0
[  ]
[  ] Board Name  	: OpenCR R1.0
[  ] Board Ver   	: 0x17020800
[  ] Board Rev   	: 0x00000000
[OK] flash_erase 	: 0.96s
[OK] flash_write 	: 1.92s 
[OK] CRC Check   	: 10E28C8 10E28C8 , 0.006000 sec
[OK] Download 
[OK] jump_to_fw 

[TurtleBot] Reset OpenCR using RESET button.

[Remote PC] Install ROS2 Bouncy

[Remote PC] Add the source line to the bashrc file.

NOTE: If any ROS 1 related lines (such as source /opt/ros/kinetic/setup.bash) are set in the bashrc, comment them out or delete them.

Open the bashrc

$ gedit ~/.bashrc

Add the line below.

source ~/ros2_ws/install/local_setup.bash

Then source the bashrc with the command below.

$ source ~/.bashrc

[Remote PC] Download the dependencies and build.

$ mkdir -p ~/turtlebot3_ws/src
$ cd ~/turtlebot3_ws/src
$ git clone -b ros2 https://github.com/ROBOTIS-GIT/turtlebot3.git
$ git clone -b ros2 https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
$ git clone -b release-latest https://github.com/ros2/cartographer.git
$ git clone -b release-latest https://github.com/ros2/cartographer_ros.git
$ git clone https://github.com/ros2/pcl_conversions.git
$ sudo apt install libpcl-conversions-dev libpcl-dev
$ cd ~/turtlebot3_ws && colcon build

[Remote PC] Add the source line to the bashrc file.

Open the bashrc

$ gedit ~/.bashrc

Add the line below.

source ~/turtlebot3_ws/install/local_setup.bash

Then source the bashrc with the command below.

$ source ~/.bashrc

Bringup TurtleBot3

[TurtleBot, Remote PC] Sync the time between the TurtleBot and the remote PC.

$ sudo apt-get install ntpdate
$ sudo ntpdate ntp.ubuntu.com

[TurtleBot] Run MicroRTPSAgent for OpenCR

$ cd ~/turtlebot3
$ MicroRTPSAgent serial /dev/ttyACM0

[TurtleBot] Run MicroRTPSAgent for Lidar

$ cd ~/turtlebot3
$ MicroRTPSAgent udp 2018

[TurtleBot] Run Lidar application

$ cd ~/turtlebot3
$ ./turtlebot3_lidar

[Remote PC] Run turtlebot3_remote.launch.py

$ ros2 launch turtlebot3_bringup turtlebot3_remote.launch.py

The terminal will then show the messages below.

[INFO] [launch]: process[robot_state_publisher-1]: started with pid [21355]
[INFO] [launch]: process[time_sync-2]: started with pid [21356]
[INFO] [launch]: process[odometry_publisher-3]: started with pid [21357]
[INFO] [launch]: process[tf_publisher-4]: started with pid [21358]
[INFO] [launch]: process[joint_states_publisher-5]: started with pid [21359]
[INFO] [launch]: process[scan_publisher-6]: started with pid [21360]
Initialize urdf model from file: /home/darby/ros2_overlay_ws/install/turtlebot3_description/share/turtlebot3_description/urdf/turtlebot3_burger.urdf
Parsing robot urdf xml string.
Link base_link had 5 children
Link caster_back_link had 0 children
Link imu_link had 0 children
Link base_scan had 0 children
Link wheel_left_link had 0 children
Link wheel_right_link had 0 children
got segment base_footprint
got segment base_link
got segment base_scan
got segment caster_back_link
got segment imu_link
got segment wheel_left_link
got segment wheel_right_link
[INFO] [time_sync]: Init System Time publisher
Adding fixed segment from base_footprint to base_link
Adding fixed segment from base_link to caster_back_link
Adding fixed segment from base_link to imu_link
Adding fixed segment from base_link to base_scan
Adding moving segment from base_link to wheel_left_link
Adding moving segment from base_link to wheel_right_link
[INFO] [joint_states_publisher]: Init joint_states publisher
[INFO] [scan_publisher]: Init scan publisher
[INFO] [tf_publisher]: Init tf publisher
[INFO] [odometry_publisher]: Init Odometry publisher

You can also check the topic list.

$ ros2 topic list
/clock
/cmd_vel
/imu
/joint_states
/motor_power
/odom
/parameter_events
/reset
/robot_description
/scan
/scan_half
/sensor_state
/sound
/tf
/tf_static
/time_sync
/version_info

Run turtlebot3_teleop node

[Remote PC]

$ ros2 run turtlebot3_teleop turtlebot3_teleop_key
Control Your TurtleBot3!
---------------------------
Moving around:
        w
   a    s    d
        x

w/x : increase/decrease linear velocity (Burger : ~ 0.22, Waffle and Waffle Pi : ~ 0.26)
a/d : increase/decrease angular velocity (Burger : ~ 2.84, Waffle and Waffle Pi : ~ 1.82)

space key, s : force stop

CTRL-C to quit

Launch Cartographer

[TurtleBot, Remote PC] Sync the time between the TurtleBot and the remote PC.

$ sudo apt-get install ntpdate
$ sudo ntpdate ntp.ubuntu.com

[Remote PC]

$ launch `ros2 pkg prefix turtlebot3_cartographer`/share/turtlebot3_cartographer/launch/turtlebot3_cartographer.py

[Remote PC] Run Rviz2

$ rviz2