
Perceptions

NOTE Please run the instructions below on a PC with ROS packages installed.

Camera

The OpenMANIPULATOR-X can work with several cameras to recognize objects in front of it. Use the following packages with the supported cameras: Astra Pro, Realsense D435, and Raspberry Pi Camera V2.

The camera packages used in this section are available in the open_manipulator_perceptions repository:
https://github.com/ROBOTIS-GIT/open_manipulator_perceptions

The following is an example of connecting the OV7725 camera module with AL422B buffer to the GPIO of OpenCR.
https://community.robotsource.org/t/opencr-with-ov7725-al422b-camera-2-8-tftlcd/1413

The Arduino sketch can be found in the OpenCR examples of the Arduino IDE.

Install Camera Package

Astra Pro

The Astra Series was designed to further improve on the attributes that set Orbbec 3D cameras apart from existing 3D cameras on the market. Astra 3D cameras provide computer vision that enables dozens of functions such as face recognition, gesture recognition, human body tracking, three-dimensional measurement, environment perception, and three-dimensional map reconstruction.

Items Specifications
RGB Image Resolution and Frame Rate 1280 x 720 @ 30 fps
Depth Image Resolution and Frame Rate 640 x 480 @ 30 fps
FOV (Field of View) 60°H x 49.5°V x 73°D
Range 0.6 m - 8 m
USB Port USB 2.0
Dimensions 165 mm x 30 mm x 40 mm
Operating Systems Android / Linux / Windows 7/8/10
SDK Astra SDK or OpenNI
Microphones 2 (Built-in)


Install Astra Camera Library

The following commands install the relevant Astra Pro libraries.

  $ sudo apt-get install ros-kinetic-rgbd-launch ros-kinetic-libuvc-camera
  $ cd ~/catkin_ws/src
  $ git clone https://github.com/orbbec/ros_astra_camera.git
  $ git clone https://github.com/ROBOTIS-GIT/ros_astra_launch.git
  $ cd ~/catkin_ws && catkin_make
  $ roscd astra_camera && ./scripts/create_udev_rules
Run Astra Launch File

Run the following commands.

  $ sudo chmod a+rw /dev/bus/usb/${USB}/${PORT}
  $ roslaunch ros_astra_launch astra_pro.launch

You can use RViz or rqt_image_view to verify the driver. Select an image topic published by the Astra Pro from the drop-down menu at the top of the application.

  $ rqt_image_view
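
If you prefer a scripted check over rqt_image_view, a minimal rospy subscriber such as the sketch below can confirm that image frames are arriving. The /camera/rgb/image_raw topic name is an assumption based on typical astra_camera defaults; confirm the actual name with rostopic list.

  #!/usr/bin/env python
  # Minimal check that the Astra Pro driver is publishing images.
  # The topic name is an assumption (typical astra_camera default); verify with `rostopic list`.
  import rospy
  from sensor_msgs.msg import Image

  def callback(msg):
      rospy.loginfo("Received %dx%d image, encoding=%s", msg.width, msg.height, msg.encoding)

  if __name__ == '__main__':
      rospy.init_node('astra_image_check')
      rospy.Subscriber('/camera/rgb/image_raw', Image, callback)
      rospy.spin()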

Realsense D435

The Intel® RealSense™ Depth Camera D435 is a USB-powered depth camera and consists of a pair of depth sensors, RGB sensor, and infrared projector. It is ideal for makers and developers to add depth perception capability to their prototype development. The D435 is designed to best fit your prototype.

Items Specifications
Use Environment Indoor / Outdoor
RGB Sensor Resolution and Frame Rate 1920 x 1080 @ 30 fps
RGB Sensor FOV 69.4°(H) x 42.5°(V) x 77°(D) (+/- 3°)
Depth Stream Output Resolution Up to 1280 x 720
Depth Stream Output Frame Rate Up to 90 fps
Depth Field of View (FOV) 85.2°(H) x 58°(V) x 94°(D) (+/- 3°)
Minimum Depth Distance (Min-Z) 0.2 m
Maximum Range Approx. 10 m
Dimensions 90 mm x 25 mm x 25 mm
Connectors USB 3.0 Type-C


Install Realsense D435 Library

The following commands install the relevant Intel® RealSense™ Depth Camera D435 libraries.

  $ sudo apt-key adv --keyserver keys.gnupg.net --recv-key C8B3A55A6F3EFCDE || sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key C8B3A55A6F3EFCDE
  $ sudo add-apt-repository "deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo xenial main" -u
  $ sudo apt-get install librealsense2-dev librealsense2-utils ros-kinetic-rgbd-launch
  $ cd ~/catkin_ws/src
  $ git clone https://github.com/intel-ros/realsense.git
  $ cd ~/catkin_ws && catkin_make
Run Realsense D435 Launch File

Run the following command.

  $ roslaunch realsense2_camera rs_camera.launch

You can use RViz or rqt_image_view to verify the driver. Select an image topic published by the Intel® RealSense™ Depth Camera D435 from the drop-down menu at the top of the application.

  $ rqt_image_view
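
To check the depth stream from a script rather than rqt_image_view, you can read the distance at the image center as in the sketch below. The /camera/depth/image_rect_raw topic and the 16-bit little-endian millimeter encoding are assumptions based on common realsense2_camera defaults; verify them with rostopic list and rostopic echo before relying on the values.

  #!/usr/bin/env python
  # Print the depth measured at the image center of the D435.
  # Topic name and 16UC1 millimeter encoding are assumptions; verify with `rostopic list`.
  import struct
  import rospy
  from sensor_msgs.msg import Image

  def callback(msg):
      # Byte offset of the center pixel (2 bytes per pixel for 16UC1, little-endian assumed).
      center = (msg.height // 2) * msg.step + (msg.width // 2) * 2
      depth_mm = struct.unpack_from('<H', msg.data, center)[0]
      rospy.loginfo("Center depth: %.3f m", depth_mm / 1000.0)

  if __name__ == '__main__':
      rospy.init_node('d435_depth_check')
      rospy.Subscriber('/camera/depth/image_rect_raw', Image, callback)
      rospy.spin()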

Raspberry Pi Camera V2

Setup Raspberry Pi Camera

Warning!: The instructions below use the raspi-config tool to set up the Raspberry Pi Camera V2. See the Raspberry Pi Camera documentation for more details.

  1. [Raspberry Pi] Set up the camera hardware.

     $ sudo raspi-config
    

  2. Select 3 Interfacing Options.

  3. Select P1 Camera.

  4. Enable camera interface.

  5. After rebooting the Raspberry Pi, test that the camera is installed and working with the following command:

     $ raspistill -v -o test.jpg
    

    The display should show a five-second preview from the camera and then take a picture, saved to the file test.jpg.

    The following command installs the relevant Raspberry Pi Camera packages on your ROS system.

     $ sudo apt-get install ros-kinetic-compressed-image-transport ros-kinetic-camera-info-manager ros-kinetic-ar-track-alvar ros-kinetic-ar-track-alvar-msgs ros-kinetic-image-proc
    
  6. Clone the relevant packages from Git, and build.

     $ cd ~/catkin_ws/src
     $ git clone https://github.com/UbiquityRobotics/raspicam_node.git
     $ git clone https://github.com/ROBOTIS-GIT/open_manipulator_perceptions.git
     $ cd ~/catkin_ws && catkin_make
    
Run Raspberry Pi Camera Launch File
  1. [Remote PC] Run the following commands. A minimal scripted check of the camera stream is also sketched after this list.

     $ roscore
     $ rqt_image_view
    

    Warning!: Before running rqt_image_view on the Remote PC, make sure the Raspberry Pi and the Remote PC are connected to the same network and that their ROS network settings (e.g., ROS_MASTER_URI) point to the same ROS master. If not, refer to Raspberry Pi 3 Setup and Remote PC Setup to configure them. Otherwise, the rqt_image_view screen will not show any images from the camera.

  2. [Raspberry Pi] Run the following command.

     $ roslaunch open_manipulator_camera raspicam.launch
    
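
The scripted check mentioned in step 1 is sketched below: it logs each compressed frame arriving from the Raspberry Pi camera node on the Remote PC. The /raspicam_node/image/compressed topic name is an assumption based on the usual raspicam_node defaults; confirm it with rostopic list.

  #!/usr/bin/env python
  # Confirm that compressed frames arrive from the Raspberry Pi camera node.
  # The topic name is an assumption (typical raspicam_node default); verify with `rostopic list`.
  import rospy
  from sensor_msgs.msg import CompressedImage

  def callback(msg):
      rospy.loginfo("Received %s frame, %d bytes", msg.format, len(msg.data))

  if __name__ == '__main__':
      rospy.init_node('raspicam_check')
      rospy.Subscriber('/raspicam_node/image/compressed', CompressedImage, callback)
      rospy.spin()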

These camera examples are not supported in the Arduino environment.

Install AR Marker Package

NOTE:

  1. If you use the ar_track_alvar package to recognize the ar marker, print out the ar marker here.

  2. Install AR Marker Packages on Remote PC.
    $ sudo apt-get install ros-kinetic-ar-track-alvar ros-kinetic-ar-track-alvar-msgs ros-kinetic-image-proc
    
  3. Clone the relevant package from Git, and build.
    $ cd ~/catkin_ws/src
    $ git clone https://github.com/ROBOTIS-GIT/open_manipulator_perceptions.git
    $ cd ~/catkin_ws && catkin_make
    

Run AR Marker with a Camera in Use

In order to use the AR marker properly with your camera, be sure to add the camera model to the launch command.

See the following sections and use the provided commands to enable the AR marker feature with your camera.

AR Marker With Astra Pro
  1. Install Astra Pro ROS package.

  2. Run AR Marker with the camera model in use, astra_pro.

    $ roslaunch open_manipulator_ar_markers ar_pose.launch camera_model:=astra_pro
    
AR Marker With Realsense D435
  1. Install Realsense D435 ROS package.

  2. Run AR Marker with the camera model in use, realsense_d435.

    $ roslaunch open_manipulator_ar_markers ar_pose.launch camera_model:=realsense_d435
    
AR Marker With Raspberry Pi Camera V2
  1. Install Raspberry Pi Camera V2 ROS package.

  2. [Raspberry Pi] Run a Raspberry Pi camera.
    $ roslaunch open_manipulator_camera raspicam.launch
    
  3. [Remote PC] Run AR Marker with the camera model in use, raspicam.
    $ roslaunch open_manipulator_ar_markers ar_pose.launch camera_model:=raspicam
    

AR Marker displayed on RViz

When the camera recognizes an object with an AR marker, the pose of the AR marker will be displayed on the RViz screen.
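
Besides RViz, you can read the detected marker poses directly from the topic published by ar_track_alvar, as in the sketch below. The /ar_pose_marker topic name is the package's usual default but is stated here as an assumption; confirm it with rostopic list.

  #!/usr/bin/env python
  # Print the ID and position of every AR marker currently detected by ar_track_alvar.
  # The topic name is an assumption (common default); verify with `rostopic list`.
  import rospy
  from ar_track_alvar_msgs.msg import AlvarMarkers

  def callback(msg):
      for marker in msg.markers:
          p = marker.pose.pose.position
          rospy.loginfo("Marker %d at x=%.3f y=%.3f z=%.3f", marker.id, p.x, p.y, p.z)

  if __name__ == '__main__':
      rospy.init_node('ar_marker_listener')
      rospy.Subscriber('/ar_pose_marker', AlvarMarkers, callback)
      rospy.spin()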

The AR marker example is not supported in the Arduino environment.

Pick and Place Example

In this example, the OpenMANIPULATOR-X uses a Raspberry Pi Camera V2 to pick and place blocks. 3D-print the camera frame for the OpenMANIPULATOR-X to mount the camera. Attach a 3 cm x 3 cm AR marker to each cube block. When you run the example package, the manipulator picks the blocks with AR marker IDs 1, 2, and 3 in order and stacks them on one side.

Camera Frame

Raspberry Pi Camera V2

Download the .stl file from the path below and print it with a 3D printer.


Realsense D435

Download the .stl file from the path below and print it with a 3D printer.


Install ROS package

NOTE: To use the Raspberry Pi Camera V2, install the package below on the Remote PC.
  $ cd ~/catkin_ws/src
  $ git clone https://github.com/ROBOTIS-GIT/open_manipulator_applications.git
  $ cd ~/catkin_ws && catkin_make

If the catkin_make command has been completed without any errors, all the preparations are done.

Execution Example

Open a terminal window and run roscore with the following command.

$ roscore

After running roscore, launch the OpenMANIPULATOR-X controller. Open another terminal window and enter the following command.

$ roslaunch open_manipulator_controller open_manipulator_controller.launch

WARNING
Check each joint position before running the OpenMANIPULATOR-X. Operation may stop if a joint position is out of range.
The picture below shows the ideal pose of the OpenMANIPULATOR-X. Adjust each joint to match the picture while DYNAMIXEL torque is not enabled.

If the OpenMANIPULATOR controller has been launched successfully, the terminal will show the following message.

SUMMARY
========

PARAMETERS
 * /open_manipulator/control_period: 0.01
 * /open_manipulator/moveit_sample_duration: 0.05
 * /open_manipulator/planning_group_name: arm
 * /open_manipulator/using_moveit: False
 * /open_manipulator/using_platform: True
 * /rosdistro: kinetic
 * /rosversion: 1.12.14

NODES
  /
    open_manipulator (open_manipulator_controller/open_manipulator_controller)

ROS_MASTER_URI=http://localhost:11311

process[open_manipulator-1]: started with pid [23452]
Joint Dynamixel ID : 11, Model Name : XM430-W350
Joint Dynamixel ID : 12, Model Name : XM430-W350
Joint Dynamixel ID : 13, Model Name : XM430-W350
Joint Dynamixel ID : 14, Model Name : XM430-W350
Gripper Dynamixel ID : 15, Model Name :XM430-W350
[ INFO] [1544509070.096942788]: Succeeded to init /open_manipulator
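
Once the controller is up, a quick way to confirm that the joints are reporting sensible positions (see the warning above) is to read a single JointState message, as in the sketch below. The joint_states topic name is an assumption and may be namespaced differently on your setup; check the actual name with rostopic list.

  #!/usr/bin/env python
  # Print the joint angles reported by the controller once.
  # The topic name is an assumption; check the actual name with `rostopic list`.
  import rospy
  from sensor_msgs.msg import JointState

  if __name__ == '__main__':
      rospy.init_node('joint_state_check')
      msg = rospy.wait_for_message('joint_states', JointState, timeout=5.0)
      for name, position in zip(msg.name, msg.position):
          rospy.loginfo("%s: %.3f rad", name, position)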

Open another terminal window and enter the following command. This command runs the package that recognizes the AR markers. Pass the camera model you are using and the size of the AR marker as arguments. In this example, a Raspberry Pi Camera V2 and a 3 cm AR marker are used.

$ roslaunch open_manipulator_ar_markers ar_pose.launch camera_model:=raspicam user_marker_size:=3.0

NOTE: To use the Raspberry Pi Camera V2, run the camera node on the Raspberry Pi.

Open another terminal window and enter the following command.

$ roslaunch open_manipulator_pick_and_place open_manipulator_pick_and_place.launch

The following message will appear in the terminal window, where you can check the robot status.

-----------------------------
Pick and Place demonstration!
-----------------------------
1 : Home pose
2 : Pick and Place demo. start
3 : Pick and Place demo. Stop
-----------------------------
-----------------------------
Present Joint Angle J1: 0.000 J2: 0.000 J3: 0.000 J4: 0.000
Present Tool Position: 0.000
Present Kinematics Position X: 0.000 Y: 0.000 Z: 0.000
-----------------------------

There are three commands. Enter the corresponding number in the terminal. A scripted alternative using the controller's services is sketched after the list below.

  1. Home pose: Move to the home pose.
  2. Pick and Place demo. Start: Start the Pick and Place demonstration.
  3. Pick and Place demo. Stop: Stop the Pick and Place demonstration.
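
If you would rather command motions from your own script than through the keyboard menu, you can call the controller's ROS services directly. The sketch below moves the arm toward a home-like pose through a joint-space service; the goal_joint_space_path service name, the open_manipulator_msgs/SetJointPosition type, and the joint angles used are assumptions based on a standard open_manipulator_controller setup, so confirm them with rosservice list before use.

  #!/usr/bin/env python
  # Move the arm to a home-like pose by calling the controller's joint-space service.
  # Service name, message type, and joint angles are assumptions; verify with `rosservice list`.
  import rospy
  from open_manipulator_msgs.srv import SetJointPosition, SetJointPositionRequest

  if __name__ == '__main__':
      rospy.init_node('go_home_pose')
      rospy.wait_for_service('goal_joint_space_path')
      move = rospy.ServiceProxy('goal_joint_space_path', SetJointPosition)

      req = SetJointPositionRequest()
      req.joint_position.joint_name = ['joint1', 'joint2', 'joint3', 'joint4']
      req.joint_position.position = [0.0, -1.05, 0.35, 0.70]  # example home-like angles (rad)
      req.path_time = 2.0                                     # seconds to reach the pose

      resp = move(req)
      rospy.loginfo("Planned: %s", resp.is_planned)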

The pick and place example is not supported in the Arduino environment.