Examples
NOTE: Humble-based examples will be added as quickly as possible. We appreciate your interest and support.
WARNING: This content is a temporary upload of the manual originally written for Kinetic. It will be ported to Noetic, and support for Humble examples is planned.
Make sure to run the Bringup instructions before performing these examples, and be careful when testing the robot on a table or in other areas where it could be damaged by unexpected movement.
NOTE:
- These instructions were tested on Ubuntu 16.04 and ROS Kinetic Kame.
- These instructions are intended to be run on the remote PC.
The contents of the e-Manual are subject to change without prior notice. Some video content may differ from the contents in the e-Manual.
NOTE: This feature is available for ROS Kinetic.
Move using Interactive Markers
This example demonstrates how to control TurtleBot3 in RViz using Interactive Markers. Interactive Markers allow users to move the robot by manipulating on-screen controls in RViz, enabling both linear movement and rotation without the need for a physical joystick or command-line inputs. Please refer to the above tutorial video for more detailed usage.
Understanding the Overall Flow
To successfully use Interactive Markers to control TurtleBot3, several key components must be launched. The interaction between these components enables real-time manual control of the robot in RViz.
- TurtleBot3 Interactive Marker (turtlebot3_interactive_marker node)
  - Manages the interactive markers that allow users to control TurtleBot3 within RViz.
  - Publishes updates via /turtlebot3_interactive_marker/update.
  - Receives user interactions and provides feedback through /turtlebot3_interactive_marker/feedback.
  - Converts interactive marker movements into cmd_vel commands, sending them to the TurtleBot3 Node.
- TurtleBot3 Node (turtlebot3_node)
  - Subscribes to cmd_vel to control the actual movement of TurtleBot3.
- Differential Drive Controller (diff_drive_controller)
  - Receives odometry data and sends it to /odom.
  - Publishes the transformations (/tf) necessary for maintaining the robot’s coordinate system.
  - Works in conjunction with the Robot State Publisher.
- Robot State Publisher (robot_state_publisher)
  - Publishes /robot_description and maintains the robot’s kinematic model in RViz.
These components work together to allow real-time interactive control of the TurtleBot3 directly from RViz, enabling linear movement and rotation through intuitive marker interactions.
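To make the conversion step concrete, the following is a minimal sketch (not the packaged turtlebot3_interactive_marker node) of how marker feedback can be turned into velocity commands. It assumes the marker pose is expressed relative to the robot, so the marker’s x displacement maps to linear velocity and its yaw maps to angular velocity; the topic names follow the list above.
#!/usr/bin/env python3
# Minimal sketch (not the packaged node): convert interactive marker feedback
# into cmd_vel. Assumes the marker pose is expressed relative to the robot,
# so position.x maps to linear speed and the yaw maps to angular speed.
import math

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from visualization_msgs.msg import InteractiveMarkerFeedback


class MarkerToCmdVel(Node):
    def __init__(self):
        super().__init__('marker_to_cmd_vel')
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(
            InteractiveMarkerFeedback,
            '/turtlebot3_interactive_marker/feedback',
            self.feedback_callback, 10)

    def feedback_callback(self, feedback):
        twist = Twist()
        # Forward/backward displacement of the marker -> linear velocity.
        twist.linear.x = feedback.pose.position.x
        # Yaw of the marker (rotation about z) -> angular velocity.
        q = feedback.pose.orientation
        twist.angular.z = 2.0 * math.atan2(q.z, q.w)
        self.cmd_pub.publish(twist)


def main():
    rclpy.init()
    rclpy.spin(MarkerToCmdVel())


if __name__ == '__main__':
    main()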
Running the Interactive Marker Example
1. Bringup TurtleBot3
- Open a new terminal on the remote PC with Ctrl + Alt + T and connect to the Raspberry Pi via SSH using its IP address. Enter the password of the Ubuntu OS on the Raspberry Pi.
[Remote PC]
$ ssh ubuntu@{IP_ADDRESS_OF_RASPBERRY_PI}
- Bring up basic packages to start essential TurtleBot3 applications. You will need to specify your specific TurtleBot3 model.
[TurtleBot3 SBC]
$ export TURTLEBOT3_MODEL=burger
$ ros2 launch turtlebot3_bringup robot.launch.py
Wait until the bringup process finishes and the TurtleBot3 is ready before proceeding.
2. Start the Interactive Marker Server
The interactive marker node must be launched to generate the markers in RViz and process user input. Run the following command to start the interactive marker node:
[Remote PC]
$ ros2 run turtlebot3_example turtlebot3_interactive_marker
This node initializes the interactive markers in RViz, allowing users to control the robot by dragging the visual markers.
3. Visualizing the TurtleBot3 Model in RViz
Before using interactive markers, RViz must be configured properly to visualize the robot model and enable marker-based control. Start RViz2:
[Remote PC]
$ rviz2
Once RViz2 opens, follow these steps to configure the display:
- Add the Robot Model:
  - Click the “Add” button in the bottom-left corner.
  - Select “RobotModel”.
  - Set the Topic to /robot_description.
- Enable Interactive Markers:
  - Click the “Add” button again.
  - Navigate to the “By topic” tab.
  - Locate and select “InteractiveMarkers”.
At this point, TurtleBot3 should appear in RViz, along with interactive markers that allow you to control its movement.
4. Controlling TurtleBot3 with Interactive Markers
With RViz properly set up, you can now control TurtleBot3 using interactive markers:
- Move Forward/Backward: Drag the arrow-shaped marker along its axis.
- Rotate Left/Right: Click and rotate the circular handle.
These actions generate cmd_vel messages, which are sent to the TurtleBot3 Node to execute real movement.
In this example, you have set up interactive marker control for TurtleBot3 in RViz, allowing for direct manual operation without predefined commands. This method provides an intuitive way to move the robot and observe its response in real time. Try exploring different interactions to get familiar with controlling TurtleBot3 through RViz.
The TurtleBot3 can be moved using Interactive Markers in RViz.
[Remote PC] Open a new terminal and launch the remote file.
TIP: Before executing this command, you have to specify the model name of the TurtleBot3 variant you will be using. The ${TB3_MODEL} variable must be set to the name of the model you are using (burger, waffle, or waffle_pi). This can also be set permanently in your ROS environment configuration according to the Export TURTLEBOT3_MODEL instructions.
$ export TURTLEBOT3_MODEL=${TB3_MODEL}
$ roslaunch turtlebot3_bringup turtlebot3_remote.launch
[Remote PC] Launch the interactive markers file.
$ roslaunch turtlebot3_example interactive_markers.launch
[Remote PC] Visualize the model in 3D with RViz.
$ rosrun rviz rviz -d `rospack find turtlebot3_example`/rviz/turtlebot3_interactive.rviz
Obstacle Detection
NOTE: This feature is not supported yet. We are currently working on it.
The TurtleBot3 can move or stop in response to LDS readings. When the TurtleBot3 is moving, it will stop when it detects an obstacle ahead.
[Remote PC] Launch the obstacle file.
$ roslaunch turtlebot3_example turtlebot3_obstacle.launch
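The packaged node implements this behavior; as a rough illustration of the idea only, the sketch below (written with rclpy for ROS 2) drives forward slowly and publishes a zero velocity whenever the closest LDS reading drops below a safety distance. The threshold and speed values are illustrative assumptions, not the example’s actual parameters.
#!/usr/bin/env python3
# Minimal rclpy sketch of the obstacle-stop idea (not the packaged node):
# drive forward slowly and publish a zero velocity whenever the closest
# LDS reading falls below a safety distance.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

SAFETY_DISTANCE = 0.30   # [m], illustrative value
FORWARD_SPEED = 0.10     # [m/s], illustrative value


class ObstacleStop(Node):
    def __init__(self):
        super().__init__('obstacle_stop')
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(LaserScan, 'scan', self.scan_callback, 10)

    def scan_callback(self, scan):
        # Ignore zero/invalid readings reported by the LDS.
        valid = [r for r in scan.ranges if r > scan.range_min]
        twist = Twist()
        if valid and min(valid) > SAFETY_DISTANCE:
            twist.linear.x = FORWARD_SPEED
        # else: leave twist at zero -> the robot stops.
        self.cmd_pub.publish(twist)


def main():
    rclpy.init()
    rclpy.spin(ObstacleStop())


if __name__ == '__main__':
    main()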
Position Control
NOTE: This feature is not supported yet. We are currently working on it.
NOTE: This feature is not supported in ROS 1. We are currently working on it in ROS 2 Humble.
Point Operation
NOTE: This feature is not supported yet. We are currently working on it.
The TurtleBot3 can be moved in a 2D plane using point (x, y) and z-angular instructions. For example, if you send (0.5, 0.3, 60), the TurtleBot3 moves to the point (x = 0.5 m, y = 0.3 m) and then rotates to face 60 degrees.
[Remote PC] Launch the pointop file.
$ roslaunch turtlebot3_example turtlebot3_pointop_key.launch
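As a rough illustration of the geometry behind such a command (not the packaged pointop node), the following turn-drive-turn decomposition shows how a goal like (0.5, 0.3, 60) can be broken into a rotation toward the goal point, a straight-line drive, and a final rotation to the requested heading.
import math

# Illustrative turn-drive-turn decomposition behind an (x, y, angle) command
# like (0.5, 0.3, 60): how far to rotate first, how far to drive straight,
# and the final rotation. Not the packaged node, only the underlying geometry.
def point_operation(x_cur, y_cur, yaw_cur, x_goal, y_goal, yaw_goal_deg):
    heading = math.atan2(y_goal - y_cur, x_goal - x_cur)
    first_turn = heading - yaw_cur                           # rotate toward the goal point
    distance = math.hypot(x_goal - x_cur, y_goal - y_cur)    # drive straight
    last_turn = math.radians(yaw_goal_deg) - heading         # rotate to the requested heading
    return first_turn, distance, last_turn

# Starting at the origin facing +x, the example command (0.5, 0.3, 60):
print(point_operation(0.0, 0.0, 0.0, 0.5, 0.3, 60.0))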
Patrol
This section explains how to configure and execute the TurtleBot3 patrol function, enabling autonomous movement along predefined and custom routes using ROS 2 Humble. The Action Client sends patrol parameters (e.g., shape, area, or iteration count) to the Action Server, which then translates these parameters into cmd_vel commands to move the TurtleBot3. Please refer to the above tutorial video for more detailed usage.
Understanding the Overall Flow
In this setup, different nodes work together to accomplish the patrol behavior:
- TurtleBot3 Patrol Client (turtlebot3_patrol_client)
  - Accepts user input (such as the selected patrol shape and number of iterations).
  - Sends the patrol parameters to the Action Server.
  - Receives feedback on the progress of the patrol and displays status messages.
- TurtleBot3 Patrol Server (turtlebot3_patrol_server)
  - Receives patrol instructions from the Action Client.
  - Converts the instructions into motion commands (cmd_vel) for the TurtleBot3.
  - Sends back feedback (such as completion status) to the client.
- TurtleBot3 Node (turtlebot3_node)
  - Executes movements based on cmd_vel messages.
  - Publishes sensor data (e.g., IMU, battery state, LIDAR scans) and transformation data (tf).
  - Interfaces with hardware drivers so that the robot can perform physical movements.
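The sketch below shows the general shape of such an action client in rclpy: it sends a goal, prints feedback while the server drives the robot, and prints the result when the patrol finishes. The action name, interface import, and goal field used here are assumptions for illustration; the actual definitions live in the TurtleBot3 packages and may differ.
#!/usr/bin/env python3
# Minimal rclpy action-client sketch of the patrol flow (goal -> feedback ->
# result). The import below is a PLACEHOLDER: the real patrol action type and
# its field names are defined in the TurtleBot3 packages and may differ.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node

from turtlebot3_msgs.action import Patrol  # assumed/placeholder interface


class PatrolClient(Node):
    def __init__(self):
        super().__init__('patrol_client_sketch')
        self._client = ActionClient(self, Patrol, 'patrol')

    def send_goal(self, radius):
        goal = Patrol.Goal()
        goal.radius = radius  # assumed goal field
        self._client.wait_for_server()
        future = self._client.send_goal_async(
            goal, feedback_callback=self.feedback_callback)
        future.add_done_callback(self.response_callback)

    def feedback_callback(self, msg):
        # Progress reported by the patrol server while the robot is moving.
        self.get_logger().info(f'feedback: {msg.feedback}')

    def response_callback(self, future):
        goal_handle = future.result()
        if goal_handle.accepted:
            goal_handle.get_result_async().add_done_callback(
                lambda f: self.get_logger().info(f'result: {f.result().result}'))


def main():
    rclpy.init()
    client = PatrolClient()
    client.send_goal(1.0)
    rclpy.spin(client)


if __name__ == '__main__':
    main()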
Running the Patrol Example
1. Bringup TurtleBot3
- Open a new terminal on the remote PC with Ctrl + Alt + T and connect to the Raspberry Pi via SSH using its IP address. Enter the password of the Ubuntu OS on the Raspberry Pi.
[Remote PC]
$ ssh ubuntu@{IP_ADDRESS_OF_RASPBERRY_PI}
- Bring up basic packages to start essential TurtleBot3 applications. You will need to specify your specific TurtleBot3 model.
[TurtleBot3 SBC]
$ export TURTLEBOT3_MODEL=burger
$ ros2 launch turtlebot3_bringup robot.launch.py
Wait until the bringup process finishes and the TurtleBot3 is ready before proceeding.
2. Start the Patrol Server
Open a second terminal on the remote PC and run the patrol server node.
[Remote PC]
$ export TURTLEBOT3_MODEL=burger
$ ros2 run turtlebot3_example turtlebot3_patrol_server
3. Start the Patrol Client
Open a third terminal on the remote PC and run the patrol client node.
[Remote PC]
$ ros2 run turtlebot3_example turtlebot3_patrol_client
When the client starts, you will be prompted to select the patrol shape (square or triangle) and enter any required parameters (such as side length or number of iterations). Type in the desired values and press Enter. The client sends the provided information to the server, and the TurtleBot3 begins its patrol accordingly.
4. Visualizing TurtleBot3 in RViz (Optional)
To view the TurtleBot3’s movements and visualize the robot in RViz, you can launch the RViz visualization tool.
[Remote PC]
$ ros2 launch turtlebot3_bringup rviz2.launch.py
Once RViz is open, you will be able to see the TurtleBot3 as it performs the patrol according to the parameters you set.
To further explore the functionality of the system, it is recommended to experiment with different patrol parameters, such as varying the side length of the patrol shape, adjusting the number of repetitions, or modifying the patrol speed. These adjustments will allow you to observe how the patrol behavior changes and deepen your understanding of how the action client-server framework coordinates movement. By systematically testing different configurations, you can refine your usage of the patrol system and optimize it for specific applications.
The TurtleBot3 can also be programmed to move according to custom routes. The example program features three route shapes: rectangle, triangle and circle. Please refer to the tutorial video for a more detailed usage demonstration.
[Remote PC] Launch the patrol server file.
$ rosrun turtlebot3_example turtlebot3_server
[Remote PC] Launch the patrol client file.
$ roslaunch turtlebot3_example turtlebot3_client.launch
TurtleBot Follower Demo
NOTE: This feature is not supported yet. We are currently working on it.
In order to follow along with the examples below, you have to install the turtlebot3_applications and turtlebot3_applications_msgs packages.
[Remote PC] Go to your catkin workspace directory (/home/(user_name)/catkin_ws/src) and clone the turtlebot3_applications and turtlebot3_applications_msgs repositories. Then run catkin_make to build the new packages.
$ sudo apt-get install ros-kinetic-ar-track-alvar
$ sudo apt-get install ros-kinetic-ar-track-alvar-msgs
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_applications.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_applications_msgs.git
$ cd ~/catkin_ws && catkin_make
NOTE:
- The follower demo was implemented using only the integrated LDS-01 360 Laser Distance Sensor. A classification algorithm, fitted beforehand on samples of person and obstacle positions, is used to decide what to follow. The example follows someone in front of the robot within a 50 centimeter range and a 140 degree arc (a simplified sketch of this geometry follows this note).
- Running the follower demo in an area with obstacles may not work well. Therefore, it is recommended to run the demo in an open area.
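As referenced in the note above, the packaged demo relies on a classifier trained with scikit-learn; the function below is only a greatly simplified sketch of the geometric idea, steering toward the nearest LDS return inside the 50 cm / 140 degree window. The gain values are illustrative assumptions.
import math

# Greatly simplified sketch of the follower geometry (NOT the packaged demo,
# which classifies laser segments with scikit-learn): steer toward the nearest
# return inside a 140 degree arc and 0.5 m range in front of the robot.
def follow_command(ranges, angle_min, angle_increment,
                   max_range=0.5, half_arc=math.radians(70)):
    nearest = None  # (distance, bearing) of the closest candidate
    for i, r in enumerate(ranges):
        if not 0.0 < r <= max_range:
            continue
        angle = angle_min + i * angle_increment
        bearing = math.atan2(math.sin(angle), math.cos(angle))  # wrap to [-pi, pi]
        if abs(bearing) > half_arc:
            continue
        if nearest is None or r < nearest[0]:
            nearest = (r, bearing)
    if nearest is None:
        return 0.0, 0.0                             # nothing to follow: stop
    return 0.5 * nearest[0], 1.0 * nearest[1]       # (linear, angular) command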
- [Remote PC] Install the scikit-learn, NumPy and SciPy packages.
$ sudo apt-get install python-pip
$ sudo pip install -U scikit-learn numpy scipy
$ sudo pip install --upgrade pip
- [Remote PC] When the installation is completed, run roscore on the remote PC.
$ roscore
- [TurtleBot] Launch the Bringup package.
$ roslaunch turtlebot3_bringup turtlebot3_robot.launch
- [Remote PC] Set the environment variable for the TurtleBot3 Burger and launch turtlebot3_follow_filter.
$ export TURTLEBOT3_MODEL=burger
$ roslaunch turtlebot3_follow_filter turtlebot3_follow_filter.launch
- [Remote PC] Launch turtlebot3_follower.
$ roslaunch turtlebot3_follower turtlebot3_follower.launch
TurtleBot Panorama Demo
NOTE: This feature is not supported yet. We are currently working on it.
NOTE:
- The turtlebot3_panorama demo uses pano_ros for taking snapshots and stitching them together to create a panoramic image.
- The panorama demo requires the raspicam_node package. Instructions for installing this package can be found here.
- The panorama demo requires the OpenCV and cv_bridge packages. Instructions for installing OpenCV can be found here.
- [TurtleBot] Launch the turtlebot3_rpicamera file.
$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch
- [Remote PC] Launch panorama.
$ roslaunch turtlebot3_panorama panorama.launch
- [Remote PC] To start the panorama demo, enter the following command.
$ rosservice call turtlebot3_panorama/take_pano 0 360.0 30.0 0.3
Parameters that can be sent to the rosservice to get a panoramic image are:
- Mode for taking the pictures.
  - 0 : snap&rotate (i.e. rotate, stop, snapshot, rotate, stop, snapshot, …)
  - 1 : continuous (i.e. keep rotating while taking snapshots)
  - 2 : stop taking pictures and create the panoramic image
- Total angle of the panoramic image, in degrees
- Angle interval (in degrees) when creating the panoramic image in snap&rotate mode, or time interval (in seconds) otherwise
- Rotating velocity (in radians/s)
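The same service can also be called from Python. In this sketch the service type and request field names (TakePanorama, mode, pano_angle, snap_interval, rot_vel) are assumptions inferred from the positional arguments above; check the .srv file installed with turtlebot3_applications_msgs before relying on them.
#!/usr/bin/env python3
# ROS 1 sketch of the same service call from Python. The service type and
# field names are ASSUMPTIONS based on the positional arguments shown above;
# verify them against the installed .srv definition.
import rospy
from turtlebot3_applications_msgs.srv import TakePanorama  # assumed type

rospy.init_node('panorama_trigger')
rospy.wait_for_service('turtlebot3_panorama/take_pano')
take_pano = rospy.ServiceProxy('turtlebot3_panorama/take_pano', TakePanorama)
# snap&rotate mode, 360 degree panorama, snapshot every 30 degrees, 0.3 rad/s
take_pano(mode=0, pano_angle=360.0, snap_interval=30.0, rot_vel=0.3)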
- [Remote PC] To view the result image, enter the following command.
$ rqt_image_view image:=/turtlebot3_panorama/panorama
Automatic Parking
NOTE: This feature is not supported yet. We are currently working on it.
NOTE:
- The turtlebot3_automatic_parking demo uses the LDS-01 360 Laser Distance Sensor and reflective tape. The LaserScan topic provides intensity and distance data from the LDS, which the TurtleBot3 uses to locate the reflective tape (a simplified sketch of this follows the note).
- The turtlebot3_automatic_parking demo requires the NumPy package.
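As a simplified illustration of the intensity-based detection described in the note above (not the packaged node), the function below keeps only the scan returns whose intensity exceeds a threshold and averages their positions to estimate where the reflective tape is. The threshold value is an assumption.
import math

# Illustrative only (not the packaged node): pick the LaserScan returns whose
# intensity exceeds a threshold (the reflective tape) and average their
# positions to get a rough parking target. The threshold is an assumption.
def locate_tape(ranges, intensities, angle_min, angle_increment,
                intensity_threshold=2000.0):
    xs, ys = [], []
    for i, (r, intensity) in enumerate(zip(ranges, intensities)):
        if intensity < intensity_threshold or r <= 0.0:
            continue
        angle = angle_min + i * angle_increment
        xs.append(r * math.cos(angle))
        ys.append(r * math.sin(angle))
    if not xs:
        return None
    # Center of the high-intensity cluster in the laser frame.
    return sum(xs) / len(xs), sum(ys) / len(ys)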
- [Remote PC] Install NumPy. If you have already installed NumPy, you can skip these commands.
$ sudo apt-get install python-pip
$ sudo pip install -U numpy
$ sudo pip install --upgrade pip
- [Remote PC] Run roscore.
$ roscore
- [TurtleBot] Bring up basic packages to start TurtleBot3 applications.
$ roslaunch turtlebot3_bringup turtlebot3_robot.launch
- [Remote PC] Export your TurtleBot3 model according to the version you will be using.
$ export TURTLEBOT3_MODEL=burger
NOTE: Specify ${TB3_MODEL} as burger, waffle, or waffle_pi before executing the command. You can permanently configure your TurtleBot3 model according to the Export TURTLEBOT3_MODEL instructions.
- [Remote PC] Run RViz.
$ roslaunch turtlebot3_bringup turtlebot3_remote.launch
$ rosrun rviz rviz -d `rospack find turtlebot3_automatic_parking`/rviz/turtlebot3_automatic_parking.rviz
- [Remote PC] Launch the automatic parking file.
$ roslaunch turtlebot3_automatic_parking turtlebot3_automatic_parking.launch
- You can select the LaserScan topic in RViz:
  - /scan
  - /scan_spot
Automatic Parking Vision
NOTE: This feature is not supported yet. We are currently working on it.
NOTE:
- The turtlebot3_automatic_parking_vision demo uses the Raspberry Pi camera; the default platform for this demo is the TurtleBot3 Waffle Pi. Since the robot parks according to an AR marker on a visible wall, a printed AR marker should be prepared.
- The turtlebot3_automatic_parking_vision demo uses a rectified image based on the image_proc nodes. To get a rectified image, the robot should use optical calibration data for the Raspberry Pi camera. (Every downloaded turtlebot3 package includes the camera calibration data for the Raspberry Pi camera v2 by default.)
- The turtlebot3_automatic_parking_vision package requires the ar_track_alvar package.
- [Remote PC] Run roscore.
$ roscore
- [TurtleBot] Bring up basic packages to start TurtleBot3 applications.
$ roslaunch turtlebot3_bringup turtlebot3_robot.launch
- [TurtleBot] Start the RaspberryPi camera nodes.
$ roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch
- [Remote PC] The Raspberry Pi camera package publishes compressed images for fast communication. However, image rectification in the image_proc node needs raw image data, so the compressed image should be converted to a raw image.
$ rosrun image_transport republish compressed in:=raspicam_node/image raw out:=raspicam_node/image
- [Remote PC] Then, the image rectification should be carried out.
$ ROS_NAMESPACE=raspicam_node rosrun image_proc image_proc image_raw:=image _approximate_sync:=true _queue_size:=20
- [Remote PC] Now the AR marker detection can start. Before running the related launch file, export the TurtleBot3 model that will be used by this example. After running the launch file, RViz will automatically start with a preset configuration.
$ export TURTLEBOT3_MODEL=waffle_pi
$ roslaunch turtlebot3_automatic_parking_vision turtlebot3_automatic_parking_vision.launch
TurtleBot3 Automatic Parking Vision
Load Multiple TurtleBot3s
NOTE: This feature is available for ROS 2 Humble.
Description
- What is the Load Multiple TurtleBot3s example?
  - This example shows how to operate multiple TurtleBot3s from one Remote PC.
  - If you operate multiple TurtleBot3s as if you were operating just one, you won’t be able to distinguish which topic belongs to which robot.
  - You can assign a unique namespace to each TurtleBot3’s nodes, topics, and frames so that you can identify each TurtleBot3.
- launch files
  - multi_robot.launch.py
    - Launches the sub launch files (gzserver, gzclient, robot_state_publisher, multi_spawn_turtlebot3) with parameters.
    - Temporarily modifies the model SDF to change the odom frame_id and the base scan’s target frame_id.
  - robot_state_publisher.launch.py → robot_state_publisher node
    - Reads the model URDF and publishes /tf based on the robot hardware.
  - multi_spawn_turtlebot3.launch.py → spawn_entity.py
    - Reads model.sdf and spawns the TurtleBot3 model in the Gazebo world.
    - Sensor data is produced by the plugins written in model.sdf.
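The sketch below shows the namespacing idea in a much simplified form (it is not the actual multi_robot.launch.py): each robot’s nodes are wrapped in a GroupAction with PushRosNamespace so that node names and topics stay separated per robot. The model path, poses, and namespaces are illustrative assumptions.
# Simplified sketch (not the actual multi_robot.launch.py): wrap each robot's
# nodes in a GroupAction with PushRosNamespace so node names, topics and
# entities are separated per robot. Paths and poses are illustrative.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import GroupAction
from launch_ros.actions import Node, PushRosNamespace


def generate_launch_description():
    sdf = os.path.join(get_package_share_directory('turtlebot3_gazebo'),
                       'models', 'turtlebot3_burger', 'model.sdf')
    poses = [(-2.0, -0.5), (0.5, -2.0), (2.0, 0.5)]  # x_pose, y_pose per robot

    groups = []
    for i, (x, y) in enumerate(poses):
        ns = f'tb3_{i}'
        groups.append(GroupAction([
            PushRosNamespace(ns),
            # Spawn one namespaced model into the running Gazebo world; the
            # robot_state_publisher for the same robot would join this group.
            Node(package='gazebo_ros', executable='spawn_entity.py',
                 arguments=['-entity', ns, '-file', sdf,
                            '-robot_namespace', ns,
                            '-x', str(x), '-y', str(y)]),
        ]))
    return LaunchDescription(groups)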
Multi Robot launch in Gazebo
In this chapter, we show how to launch multiple robots in Gazebo.
- Launch multi_robot.launch.py from the turtlebot3_gazebo package.
$ ros2 launch turtlebot3_gazebo multi_robot.launch.py
- You can see three TurtleBot3s as in the picture below.
- Nodes and topics are completely separated by namespace.
NOTE
- A namespace is not necessary for /tf and /tf_static.
- The tf2_ros package manages /tf and /tf_static, allowing multiple nodes to publish /tf and /tf_static.
- Instead, the frame_ids in /tf and /tf_static must be unique.
- Frames are completely separated by namespace.
Modifying Multi Robot launch in Gazebo
In this chapter, we will learn how to modify the launch files to fit the Gazebo simulation to your own project.
- Change the robot number.
  - multi_robot.launch.py → Change the value of the number_of_robot variable.
- Change the world model.
  - multi_robot.launch.py → Change the value of the world variable to your world file path.
- Change the robot spawn locations.
  - multi_robot.launch.py → Change the values in the pose list. Each row corresponds to a robot; column [0] is x_pose and column [1] is y_pose.
- Change the robot namespace.
  - multi_robot.launch.py → Change the value of the namespace variable.
TurtleBot3 World Example
- You can see four TurtleBot3s in the picture below.
  - number_of_robot = 4
  - pose = [[-2,-0.5], [0.5,-2], [2,0.5], [-0.5,2]]
  - world = turtlebot3_world.world
Multi Robot launch in reality
In this chapter, we show how to launch multiple robots in reality.
- Modify the frame_id of the topic header. This allows sensor data to be viewed separately in RViz2.
This task should be performed on the files located on the TurtleBot SBC where bringup is run.
$ nano ~/turtlebot3_ws/src/turtlebot3/turtlebot3_node/include/turtlebot3_node/sensors/imu.hpp
$ nano ~/turtlebot3_ws/src/ld08_driver/src/lipkg.cpp
- Launch bringup with the namespace as an argument.
$ ros2 launch turtlebot3_bringup robot.launch.py namespace:=tb3_1 # Insert what you want to use as namespace
- Nodes and topics are completely separated by namespace.
Multi Robot Teleop
In this chapter, we will control multiple robots with teleop in Gazebo simulation.
$ ros2 run turtlebot3_teleop teleop_keyboard --ros-args -r __ns:=/tb3_1 # Change the number to the robot you want to control
Control Your TurtleBot3!
---------------------------
Moving around:
w
a s d
x
w/x : increase/decrease linear velocity (Burger : ~ 0.22, Waffle and Waffle Pi : ~ 0.26)
a/d : increase/decrease angular velocity (Burger : ~ 2.84, Waffle and Waffle Pi : ~ 1.82)
space key, s : force stop
CTRL-C to quit
NOTE: This example can only be run on firmware version 1.2.1 or higher.
- [Remote PC] Run roscore.
$ roscore
- Bring up multiple TurtleBot3s with different namespaces. We recommend using namespaces that are easy to remember and that identify the individual units, such as tb3_0, tb3_1 or my_robot_0, my_robot_1.
- [TurtleBot(tb3_0)] Bring up the basic packages with ROS_NAMESPACE for the nodes, multi_robot_name for the tf prefix, and set_lidar_frame_id for the lidar frame id. These parameters must use the same name.
$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:="tb3_0" set_lidar_frame_id:="tb3_0/base_scan"
- [TurtleBot(tb3_1)] Bring up the basic packages with ROS_NAMESPACE for the nodes, multi_robot_name for the tf prefix, and set_lidar_frame_id for the lidar frame id. These parameters must be the same for each robot, but different from those of the other robots.
$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_bringup turtlebot3_robot.launch multi_robot_name:="tb3_1" set_lidar_frame_id:="tb3_1/base_scan"
- Then the terminal for tb3_0 will output the messages below. You can see that the TF messages use the tb3_0 prefix.
SUMMARY
========

PARAMETERS
 * /rosdistro: kinetic
 * /rosversion: 1.12.13
 * /tb3_0/turtlebot3_core/baud: 115200
 * /tb3_0/turtlebot3_core/port: /dev/ttyACM0
 * /tb3_0/turtlebot3_core/tf_prefix: tb3_0
 * /tb3_0/turtlebot3_lds/frame_id: tb3_0/base_scan
 * /tb3_0/turtlebot3_lds/port: /dev/ttyUSB0

NODES
  /tb3_0/
    turtlebot3_core (rosserial_python/serial_node.py)
    turtlebot3_diagnostics (turtlebot3_bringup/turtlebot3_diagnostics)
    turtlebot3_lds (hls_lfcd_lds_driver/hlds_laser_publisher)

ROS_MASTER_URI=http://192.168.1.2:11311

process[tb3_0/turtlebot3_core-1]: started with pid [1903]
process[tb3_0/turtlebot3_lds-2]: started with pid [1904]
process[tb3_0/turtlebot3_diagnostics-3]: started with pid [1905]
[INFO] [1531356275.722408]: ROS Serial Python Node
[INFO] [1531356275.796070]: Connecting to /dev/ttyACM0 at 115200 baud
[INFO] [1531356278.300310]: Note: publish buffer size is 1024 bytes
[INFO] [1531356278.303516]: Setup publisher on sensor_state [turtlebot3_msgs/SensorState]
[INFO] [1531356278.323360]: Setup publisher on version_info [turtlebot3_msgs/VersionInfo]
[INFO] [1531356278.392212]: Setup publisher on imu [sensor_msgs/Imu]
[INFO] [1531356278.414980]: Setup publisher on cmd_vel_rc100 [geometry_msgs/Twist]
[INFO] [1531356278.449703]: Setup publisher on odom [nav_msgs/Odometry]
[INFO] [1531356278.466352]: Setup publisher on joint_states [sensor_msgs/JointState]
[INFO] [1531356278.485605]: Setup publisher on battery_state [sensor_msgs/BatteryState]
[INFO] [1531356278.500973]: Setup publisher on magnetic_field [sensor_msgs/MagneticField]
[INFO] [1531356280.545840]: Setup publisher on /tf [tf/tfMessage]
[INFO] [1531356280.582609]: Note: subscribe buffer size is 1024 bytes
[INFO] [1531356280.584645]: Setup subscriber on cmd_vel [geometry_msgs/Twist]
[INFO] [1531356280.620330]: Setup subscriber on sound [turtlebot3_msgs/Sound]
[INFO] [1531356280.649508]: Setup subscriber on motor_power [std_msgs/Bool]
[INFO] [1531356280.688276]: Setup subscriber on reset [std_msgs/Empty]
[INFO] [1531356282.022709]: Setup TF on Odometry [tb3_0/odom]
[INFO] [1531356282.026863]: Setup TF on IMU [tb3_0/imu_link]
[INFO] [1531356282.030138]: Setup TF on MagneticField [tb3_0/mag_link]
[INFO] [1531356282.033628]: Setup TF on JointState [tb3_0/base_link]
[INFO] [1531356282.041117]: --------------------------
[INFO] [1531356282.044421]: Connected to OpenCR board!
[INFO] [1531356282.047700]: This core(v1.2.1) is compatible with TB3 Burger
[INFO] [1531356282.051355]: --------------------------
[INFO] [1531356282.054785]: Start Calibration of Gyro
[INFO] [1531356284.585490]: Calibration End
- [Remote PC] Launch the robot state publisher with the same namespace.
- [TurtleBot(tb3_0)]
$ ROS_NAMESPACE=tb3_0 roslaunch turtlebot3_bringup turtlebot3_remote.launch multi_robot_name:=tb3_0
- [TurtleBot(tb3_1)]
$ ROS_NAMESPACE=tb3_1 roslaunch turtlebot3_bringup turtlebot3_remote.launch multi_robot_name:=tb3_1
- Before starting another application, open rqt to check the topics and the TF tree.
$ rqt
To use this setup, each TurtleBot3 makes a map using SLAM and these maps are merged simultaneously by the multi_map_merge package. You can get more information about this by visiting Virtual SLAM by Multiple TurtleBot3s.