
reBot Arm B601-DM Visual Grasping Demo


Depth Perception · Object Detection · Hand-Eye Calibration · Autonomous Grasping · Fully Open Source

YOLO is a widely used family of real-time object detection models that can localize and classify targets in a single forward pass. This tutorial uses YOLO together with the Orbbec Gemini 2 depth camera to build a working desktop visual grasping demo for the reBot Arm B601-DM, covering environment setup, camera integration, hand-eye calibration, and grasping validation.

Project Features

  1. Direct grasp pose estimation from YOLO + OBB
    The pipeline uses detection boxes or OBB minimum-area rectangles directly and takes the short axis as the gripper opening direction, avoiding complex 3D point-cloud processing.

  2. Lightweight robotic arm and gripper integration
    The main grasping script reuses the RebotArm interface and integrates IK, trajectory control, and the gripper state machine.

  3. Open Source and Extensible
    All source code is open, and users can adapt the control algorithms and behavior to their own needs.
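The short-axis idea from feature 1 can be sketched in a few lines. The function below is illustrative, not the project's actual API; it assumes a rectangle in OpenCV `minAreaRect` format `((cx, cy), (w, h), angle_deg)` and returns the grasp center plus a unit vector along the box's short side, which the pipeline uses as the gripper opening direction.

```python
import numpy as np

def gripper_opening_direction(rect):
    """Given an OpenCV minAreaRect-style tuple ((cx, cy), (w, h), angle_deg),
    return the grasp center and a unit vector along the box's short axis.
    Hypothetical helper for illustration; the real pipeline may differ."""
    (cx, cy), (w, h), angle_deg = rect
    theta = np.deg2rad(angle_deg)
    # minAreaRect's angle is measured along the "width" edge; when the
    # width is the shorter side, the short axis lies along that edge,
    # otherwise it is perpendicular to it.
    if w <= h:
        axis = np.array([np.cos(theta), np.sin(theta)])
    else:
        axis = np.array([-np.sin(theta), np.cos(theta)])
    return np.array([cx, cy]), axis

# A tall, axis-aligned box: the short side points along x.
center, axis = gripper_opening_direction(((320.0, 240.0), (40.0, 120.0), 0.0))
print(center, axis)
```

Because only a 2D axis and a depth value are needed, this sidesteps full 3D point-cloud processing.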

Specifications

The hardware for this tutorial is provided by Seeed Studio.

| Parameter | Specification |
| --- | --- |
| Robot Arm Model | reBot Arm B601-DM |
| Degrees of Freedom | 6-DOF + Gripper |
| Camera Model | Orbbec Gemini 2 |
| Detection Method | YOLO + OBB Minimum-Area Rectangle |
| Communication Method | CAN Bus via USB2CAN adapter; USB 3.0 camera connection |
| Operating Voltage | 24V DC |
| Host Platform | Ubuntu 22.04+ PC |
| Recommended Python Version | Python 3.10 |

Bill of Materials (BOM)

| Component | Quantity |
| --- | --- |
| reBot Arm B601-DM Robotic Arm | 1 |
| Gripper | 1 |
| USB2CAN Serial Bridge | 1 |
| Power Adapter (24V) | 1 |
| USB-C / Communication Cable | 1 |
| Orbbec Gemini 2 Depth Camera | 1 |
| Gemini 2 Camera Connector / Mounting Bracket | 1 |

Wiring

  1. Connect the Gemini 2 to the host via USB 3.0.
  2. Connect the USB2CAN adapter to the arm CAN bus.
  3. Make sure the 24V power supply, camera, and robotic arm are all connected securely.
  4. Set permissions:
sudo chmod a+rw /dev/bus/usb/*/*
sudo chmod 666 /dev/ttyUSB0

Environment Requirements

| Item | Requirement |
| --- | --- |
| Operating System | Ubuntu 22.04+ |
| Python | 3.10 |
| Recommended Environment | conda |
| Recommended Workspace Folder | rebot_grasp |
| Recommended conda Environment | rebotarm |

Installation Steps

Step 0. Complete the basic robotic arm preparation first

Before starting this tutorial, please finish the content in reBot Arm B601-DM Quick Start, including robotic arm assembly, zero-point initialization, motor ID configuration, and basic connectivity checks.

Step 1. Clone the repository

git clone https://github.com/Seeed-Projects/reBot-DevArm-Grasp.git rebot_grasp
cd rebot_grasp

Step 2. Create the Python environment

conda create -n rebotarm python=3.10 -y
conda activate rebotarm

Step 3. Install project dependencies

pip install -r requirements.txt

Step 4. Install the robotic arm SDK

git clone https://github.com/vectorBH6/reBotArm_control_py.git sdk/reBotArm_control_py
cd sdk/reBotArm_control_py
pip install -e .
cd ../..

Step 5. Install the Orbbec Gemini 2 SDK

This project depends on pyorbbecsdk. The repository does not bundle sdk/pyorbbecsdk by default, so you need to clone the official repository under sdk/ yourself or install it in another way.

sudo apt-get update
sudo apt-get install -y cmake build-essential libusb-1.0-0-dev

cd sdk
git clone https://github.com/orbbec/pyorbbecsdk.git
cd pyorbbecsdk
pip install -e .

You can also use the Gitee mirror:

cd sdk
git clone https://gitee.com/orbbecdeveloper/pyorbbecsdk.git
cd pyorbbecsdk
pip install -e .

For first-time use, it is recommended to install the udev rules:

sudo bash scripts/install_udev_rules.sh
sudo udevadm control --reload-rules
sudo udevadm trigger

Step 6. Verify the dependencies

python -c "import pyorbbecsdk; print('pyorbbecsdk OK')"
python -c "import motorbridge; print('motorbridge OK')"

For first-time Orbbec camera use, it is recommended to run scripts/install_udev_rules.sh inside your installed pyorbbecsdk directory, otherwise the camera may fail to open correctly.
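The two import checks above can also be wrapped in a small stdlib-only probe script, which is handy when several dependencies need verifying at once. The module names come from the verification commands in this step; everything else is illustrative.

```python
import importlib

def probe(*modules):
    """Report which of the given modules import cleanly in the
    current Python environment."""
    status = {}
    for name in modules:
        try:
            importlib.import_module(name)
            status[name] = "OK"
        except ImportError:
            status[name] = "missing"
    return status

# Module names taken from the Step 6 verification commands.
for name, state in probe("pyorbbecsdk", "motorbridge").items():
    print(f"{name}: {state}")
```

Run it inside the `rebotarm` conda environment; a "missing" entry points at the install step to repeat.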

Hand-Eye Calibration

Before running the full grasping pipeline, complete the Eye-in-Hand hand-eye calibration first.

python scripts/collect_handeye_eih.py

Before running it, make sure the following ArUco size parameter in config/default.yaml matches the actual printed marker:

calibration:
  aruco:
    marker_length_m: 0.1

In automatic mode, the arm traverses 50 preset poses and records a sample whenever the ArUco marker is detected stably. Even if you interrupt the process with c or q, the script still tries to compute the calibration result from the collected samples.

If you want to move the robotic arm manually during collection, use manual mode:

python scripts/collect_handeye_eih.py --manual

In manual mode, the arm enters gravity-compensation mode. Move the end effector to a proper viewing angle, press Enter to capture, and press c or q to finish and compute the result.

The calibration result is saved to:

config/calibration/orbbec_gemini2/hand_eye.npz
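A quick way to inspect the result is to load the archive with NumPy. The array names `R` and `t` below are an assumption for illustration; check `np.load(path).files` for the keys the project actually stores. The demo writes a synthetic identity calibration to a temporary file so the snippet is self-contained.

```python
import os
import tempfile
import numpy as np

def load_hand_eye(path):
    """Load a saved hand-eye result from an .npz archive. The key names
    ("R", "t") are assumptions; inspect np.load(path).files if they differ."""
    data = np.load(path)
    return data["R"], data["t"]

# Demo: write a synthetic identity calibration, then read it back.
path = os.path.join(tempfile.mkdtemp(), "hand_eye.npz")
np.savez(path, R=np.eye(3), t=np.zeros((3, 1)))
R, t = load_hand_eye(path)
print("rotation:\n", R, "\ntranslation:", t.ravel())
```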

Recommended sample count:

  • Minimum: 5 samples
  • Recommended: at least 15 samples
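One way to see why more samples help: with a correct end-effector-to-camera transform, every sample should map the marker to (nearly) the same base-frame position, so the spread of those positions is a cheap quality check on a sample set. The NumPy sketch below uses synthetic eye-in-hand data; all names, transforms, and numbers are illustrative, not taken from the project.

```python
import numpy as np

def rot_z(a):
    """4x4 homogeneous rotation about z by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    """4x4 homogeneous translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def max_target_spread(base_T_ee_list, cam_T_marker_list, ee_T_cam):
    """Map the marker into the base frame through every sample and return
    the largest positional spread; near zero means consistent samples."""
    targets = [A @ ee_T_cam @ B
               for A, B in zip(base_T_ee_list, cam_T_marker_list)]
    ref = targets[0][:3, 3]
    return max(np.linalg.norm(T[:3, 3] - ref) for T in targets)

# Synthetic setup: fixed camera mount, fixed marker, three arm poses.
ee_T_cam = trans(0.0, 0.05, 0.02) @ rot_z(0.1)
base_T_marker = trans(0.4, 0.0, 0.0)
base_T_ee_list = [trans(0.3, 0.0, 0.3) @ rot_z(a) for a in (0.0, 0.3, -0.3)]
cam_T_marker_list = [np.linalg.inv(ee_T_cam) @ np.linalg.inv(A) @ base_T_marker
                     for A in base_T_ee_list]
err = max_target_spread(base_T_ee_list, cam_T_marker_list, ee_T_cam)
print(f"max spread: {err:.2e} m")
```

On real data the spread is never zero; more samples with varied viewing angles average out marker-detection noise, which is why 15 or more samples beat the 5-sample minimum.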

Running and Debugging

1. Verify object detection only

python scripts/object_detection.py

If you need to change the detection model or classes, modify config/default.yaml:

yolo:
  model_name: "yoloe-26l-seg.pt"
  device: "cpu"
  use_world: true
  custom_classes:
    - "yellow banana"
    - "water bottle"
    - "cup"

This step is useful to confirm:

  • The camera opens correctly
  • The YOLO model loads correctly
  • YOLO object detection works as expected

2. Verify grasp estimation only

python scripts/ordinary_grasp_pipeline.py

If you need to adjust the grasp inference frequency or the pre-grasp retreat distance, modify:

grasp_pipeline:
  infer_every_live: 3
  grasp:
    depth_quantile: 0.6
    pregrasp_offset_m: 0.080

This script does not connect to the robotic arm. It is only used to verify:

  • Whether the OBB or minimum-area rectangle is reasonable
  • Whether the grasp point lies near the target center area
  • Whether the short-axis direction matches the expected gripper opening direction
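The role of `depth_quantile` can be sketched in a few lines of NumPy. The function name and exact handling are illustrative, not the pipeline's real code: the idea is to discard invalid (zero) depth pixels inside the detection box and take a quantile rather than a mean, so stray background or reflection pixels pull the grasp depth less.

```python
import numpy as np

def grasp_depth(depth_roi_m, quantile=0.6):
    """Pick a robust grasp depth (meters) from the depth pixels inside a
    detection box, mirroring grasp_pipeline.grasp.depth_quantile.
    Illustrative helper; zeros mark invalid depth and are dropped."""
    valid = depth_roi_m[depth_roi_m > 0]
    if valid.size == 0:
        return None
    return float(np.quantile(valid, quantile))

# One invalid pixel (0.0) and one reflective outlier (1.20 m).
roi = np.array([0.0, 0.51, 0.52, 0.50, 0.53, 1.20])
print(grasp_depth(roi))
```

A mean over the same pixels would be dragged toward the 1.20 m outlier; the quantile stays close to the object's true surface.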

Key controls:

  • Left mouse button: inspect depth at the selected pixel
  • G: print the current best grasp pose
  • Q / Esc: exit

3. Run the main grasping program

python scripts/main.py

If you only want to validate the target pose without moving the robotic arm:

python scripts/main.py --dry-run

It is recommended to verify the pose and reachable workspace with --dry-run first before executing a real grasp.

If reBotArm_control_py is not in the default location, specify it in config/default.yaml:

robot:
  repo_root: null

Keeping it as null is usually enough because the program will try to auto-detect sdk/reBotArm_control_py first.
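The lookup order might look like the sketch below (illustrative only; the project's actual detection logic may differ): an explicit `robot.repo_root` from the config wins, otherwise the default `sdk/reBotArm_control_py` checkout is tried.

```python
from pathlib import Path

def resolve_repo_root(configured=None):
    """Sketch of the assumed SDK lookup order: an explicit robot.repo_root
    wins; otherwise fall back to the default sdk/reBotArm_control_py
    checkout if it exists. Hypothetical helper, not the project's code."""
    if configured:
        return Path(configured)
    candidate = Path("sdk") / "reBotArm_control_py"
    return candidate if candidate.exists() else None

print(resolve_repo_root("/opt/sdk"))
print(resolve_repo_root(None))
```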

Main program flow:

  1. Initialize the robotic arm and gripper
  2. Move to the ready pose. If you want to change the startup ready pose, modify config/default.yaml:
robot:
  ready_pose:
    x: 0.3
    y: 0.0
    z: 0.3
    roll: 0.0
    pitch: 1.0
    duration: 3.0
  3. Detect tabletop targets in real time
  4. Estimate the grasp pose from the short axis
  5. Press G to capture the current frame and execute grasping

Runtime keys:

  • G: grasp the current best target
  • R: resume live preview
  • Q / Esc: exit

FAQ

1. ModuleNotFoundError: No module named 'motorbridge'

This usually means the robotic arm SDK dependencies are not installed in the current Python environment. Please check:

conda activate rebotarm
pip install -r requirements.txt
cd sdk/reBotArm_control_py && pip install -e .

2. Pressing G does not execute grasping

Common causes:

  • hand_eye.npz does not exist
  • The hand-eye calibration mode is not eye_in_hand
  • The target pose is not reachable by IK

It is recommended to run:

python scripts/main.py --dry-run

3. The grasp depth is unstable

You can try adjusting:

  • grasp_pipeline.grasp.depth_quantile
  • The camera's mounting height relative to the workspace
  • The target surface itself (highly reflective or transparent objects yield unstable depth)
