Getting Started with the AmazingHand Robotic Hand
The AmazingHand is an open-source robotic hand project designed to make research and experimentation with humanoid manipulation accessible and affordable. Traditional robotic hands are often prohibitively expensive and rely on complex, bulky forearm actuators. The AmazingHand solves this by integrating all its motors directly into a compact, 3D-printable design.
Its design is inspired by the "ILDA hand" research project but simplified to lower the barrier to entry for students, hobbyists, and researchers. The wrist interface is designed for the Reachy2 robot but can be easily adapted for any platform.

In this guide, you will learn:

- The key features and architecture of the AmazingHand.
- Where to find all the resources to build your own hand (BOM, CAD, Guides).
- How to set up control using either Python or Arduino.
- How to run basic demos and where to find more advanced examples.
Key Features & Specifications
The AmazingHand packs impressive capabilities into a lightweight and accessible package.
| Feature | Specification |
| --- | --- |
| Degrees of Freedom | 8-DOF (4 fingers, 2-DOF per finger) |
| Actuation | Parallel mechanism with 2x Feetech SCS0009 servos per finger |
| Motion | Flexion/extension & abduction/adduction via differential motor movement |
| Construction | Fully 3D printable, with rigid "bones" and flexible TPU shells |
| Weight | ~400 g |
| Control Interface | Serial bus |
| License | Code: Apache 2.0; mechanical design: CC BY 4.0 |
Each finger is actuated by two motors in parallel. This clever design allows for both curling (flexion/extension) and side-to-side (abduction/adduction) movements by controlling the differential motion of the servos. The palm is also a flexible part, allowing for safer and more compliant grasping of objects.
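As a rough illustration of the differential principle, here is a minimal sketch (the sign conventions and ranges are illustrative assumptions, not the project's calibration):

```python
def finger_motor_angles(flexion_deg, abduction_deg):
    """Toy model of one finger's two-servo differential drive.

    Commanding the two servos with mirrored signs produces curling
    (flexion/extension); commanding them with the same sign produces
    side-to-side motion (abduction/adduction).
    """
    m1 = flexion_deg + abduction_deg
    m2 = -flexion_deg + abduction_deg
    return m1, m2

print(finger_motor_angles(45, 0))   # pure curl  -> (45, -45)
print(finger_motor_angles(0, 10))   # pure swing -> (10, 10)
```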


Build Resources 🛠️
Everything you need to build your own AmazingHand is available on the project's GitHub repository.
- Bill of Materials (BOM): A complete list of all required electronic and hardware components can be found in the project repository.
- CAD Files and 3D Printing: All STL and STEP files are provided. Note that while the finger design is universal, some palm components are specific to the right or left hand.
  - CAD Files: Link to CAD Folder
  - 3D Printing Guide: Instructions for Printing Parts
- Assembly Guide: A detailed, step-by-step guide to assembling the hand.
Control Methods
You have two primary options for controlling the hand's servos over the serial bus.
- MPU: Use a Python script on a host computer (such as a Raspberry Pi or PC) connected via a serial bus driver (e.g., a Bus Servo Driver Board). This is great for integration with larger robotics frameworks like ROS.
- MCU: Use a microcontroller such as an Arduino with a Bus Servo Driver Board. This is ideal for standalone projects, or when a dedicated microcontroller is preferred for handling real-time control.
Calibration scripts are provided for both methods to help you set up the fingers correctly during assembly.
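As a taste of the Python route, here is a minimal sketch using the `rustypot` `Scs0009PyController` interface that the demo scripts below are built on (the port name and servo ID are assumptions for your setup):

```python
import time
import numpy as np
from rustypot import Scs0009PyController

# Port name and servo ID are assumptions; adjust them to your wiring.
c = Scs0009PyController(serial_port="/dev/ttyACM0", baudrate=1000000, timeout=0.5)

c.sync_write_torque_enable([1], [1])                 # enable torque on servo ID 1
c.sync_write_goal_position([1], [np.deg2rad(30.0)])  # goal positions are in radians
time.sleep(1.0)
c.sync_write_torque_enable([1], [0])                 # release the servo when done
```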
Running the Demos
Once assembled, you can test your AmazingHand with the provided demo programs.
The eight servos in the hand require a stable power source. A simple 5V / 2A DC adapter with a jack connector is sufficient. Do not try to power the servos directly from your computer's USB port.
Basic Demos
First, clone the project repository to your computer to get all the necessary code:
```bash
git clone https://github.com/pollen-robotics/AmazingHand
cd AmazingHand
```
Python Examples
The `PythonExample` directory contains several useful scripts for testing and control. Navigate into this directory (`cd PythonExample`) to run them.

- `AmazingHand_Demo.py`: The main demo. It cycles the hand through several pre-programmed gestures and is the perfect first test to ensure everything is working correctly.
- `AmazingHand_Demo_Both.py`: A demo for controlling both a right and a left hand connected to the same serial bus.
- `AmazingHand_FingerTest.py`: Tests the movement of a single finger, which is very useful for debugging during assembly.
- `AmazingHand_Hand_FingerMiddlePos.py`: A calibration utility that sets the fingers to their neutral, middle position.
To run the main demo, execute the following command from within the `PythonExample` folder:

```bash
python3 AmazingHand_Demo.py
```
Arduino Examples
For standalone control, the `ArduinoExample` directory contains sketches you can upload directly to your microcontroller.

- `Amazing_Hand_Demo.ino`: The main demo sketch, which cycles the hand through the same gestures as the Python version.
- `Amazing_Hand-Finger_Test.ino`: A simple sketch to test a single finger, useful for calibration and debugging hardware connections.

To use these, open the `.ino` file in the Arduino IDE, make sure the required `SCServo` library is installed, then compile and upload the sketch to your Arduino board.
Strain Gauge Remote Control Demo
This advanced demo allows for intuitive, force-based control of the hand's grip and gestures.
Principle of Operation
The core of this demo is to create a data glove that translates your finger movements into commands for the AmazingHand. We achieve this by leveraging the electrical property of strain gauges, which change their resistance when bent.
The workflow is as follows:

1. Sensing Finger Bending: Strain gauges are attached to a glove or directly to the fingers. When you bend your fingers, the strain gauges bend with them, causing a measurable change in their electrical resistance.
2. Data Acquisition: A Seeed Studio XIAO ESP32-S3 microcontroller reads these resistance changes. Each strain gauge circuit is connected to one of the ESP32's Analog-to-Digital Converter (ADC) pins, which converts the analog signal into a digital number (typically from 0 to 4095).
3. Serial Communication: The ESP32 continuously sends these digital ADC readings over a USB serial port to a host computer.
4. Processing and Mapping: A Python script running on the host computer listens to the serial port and receives the raw ADC values. It then maps these values from the ADC's range (0-4095) to the AmazingHand's desired servo angle range.
5. Hand Control: Finally, the script sends the calculated target angles to the AmazingHand's controller, causing the robotic fingers to mirror the movements of your own fingers in real time.
This creates a complete closed-loop system where your physical hand gestures directly command the robotic hand.
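To make the numbers concrete, the two conversions at the heart of steps 2 and 4 look like this (a sketch only; the 3.3 V ADC reference is an assumption about the ESP32 configuration):

```python
ADC_MAX = 4095   # full-scale value of a 12-bit ESP32 ADC reading
V_REF = 3.3      # assumed ADC reference voltage

def adc_to_volts(count: int) -> float:
    """Approximate pin voltage for a raw 12-bit ADC reading."""
    return count / ADC_MAX * V_REF

def adc_to_angle(count: int, open_deg: float = -30, close_deg: float = 90) -> float:
    """Linearly map a raw ADC count onto a servo angle offset range."""
    return count / ADC_MAX * (close_deg - open_deg) + open_deg

print(adc_to_volts(2048))   # ~1.65 V at mid-scale
print(adc_to_angle(2048))   # ~30 degrees, i.e. a half-closed finger
```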
To run this demo, you need to flash the data acquisition firmware onto the ESP32 and run the Python control script on your host computer.
XIAO ESP32-S3 Firmware
This firmware reads the strain gauge values on the ADC pins and sends them over the serial port in the format "value1,value2".
```cpp
/**
* @file XIAO_ESP32_Universal_Firmware_EN.ino
* @brief Firmware for XIAO ESP32-S3 and C3 to read 4 ADC channels and output 2 processed values for a Python script.
* @author TienjuiWong
* @version 1.2
* @date 2025-09-10
*
* @details
* - Platform Compatibility: Seeed Studio XIAO ESP32-S3 & XIAO ESP32-C3.
* - Functionality:
* 1. Reads analog signals from pins D0, D1, and D2.
* 2. Averages these three readings to create a single control value for the main fingers.
* 3. Reads the analog signal from pin D3 for independent thumb control.
* 4. Transmits data over USB Serial in the format "fingers_avg,thumb_val\n".
* 5. The communication baud rate is set to 115200 to match the host script.
*/
// --- Pin Definitions ---
// These pin definitions are valid ADC pins for both the XIAO ESP32-S3 and C3.
// Pins for the three main fingers (e.g., Index, Middle, Ring).
const int FINGER_SENSOR_1_PIN = D0;
const int FINGER_SENSOR_2_PIN = D1;
const int FINGER_SENSOR_3_PIN = D2;
// Dedicated pin for the thumb.
const int THUMB_SENSOR_PIN = D3;
void setup() {
// Initialize serial communication at 115200 baud to match the Python script.
Serial.begin(115200);
// Wait for the serial port to initialize. This is good practice.
while (!Serial) {
delay(10);
}
// Set the ADC resolution to 12-bit for a 0-4095 reading range.
// This setting is supported by both the ESP32-S3 and ESP32-C3.
analogReadResolution(12);
}
void loop() {
// 1. Read the values from the three main finger sensors.
int fingerVal1 = analogRead(FINGER_SENSOR_1_PIN);
int fingerVal2 = analogRead(FINGER_SENSOR_2_PIN);
int fingerVal3 = analogRead(FINGER_SENSOR_3_PIN);
// 2. Calculate the average of the main finger values.
// This single value will control the group of fingers in the Python script.
int avgFingersValue = (fingerVal1 + fingerVal2 + fingerVal3) / 3;
// 3. Read the independent value for the thumb sensor.
int thumbValue = analogRead(THUMB_SENSOR_PIN);
// 4. Send the processed data in the required "value1,value2" format.
Serial.print(avgFingersValue);
Serial.print(",");
Serial.println(thumbValue); // println automatically adds the required newline character.
// 5. Add a short delay to maintain a stable data stream and allow time for processing on the receiver.
delay(50);
}
```
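Before connecting the hand, you can sanity-check the firmware's output stream from the host with a few lines of pyserial (the port name is an assumption for your machine):

```python
import serial

# Port name is an assumption; on Windows it may be "COM3", etc.
with serial.Serial("/dev/ttyACM1", 115200, timeout=1) as port:
    for _ in range(10):
        line = port.readline().decode("utf-8").strip()
        print(line)  # expect lines like "2048,1023" (fingers_avg,thumb_val)
```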
Python Control Server
The host-side script below receives the "value1,value2" stream from the ESP32, maps both values to servo angles, and drives the hand in real time.
```python
import time
import numpy as np
import serial  # pyserial, used to read the ADC stream from the ESP32
from rustypot import Scs0009PyController
# --- 1. Define Finger Configuration and Servo Parameters ---
# The FINGERS list defines the mapping between fingers and their corresponding servos.
FINGERS = [
{'name': 'Index', 'm1_id': 1, 'm2_id': 2},
{'name': 'Middle', 'm1_id': 3, 'm2_id': 4},
{'name': 'Ring', 'm1_id': 5, 'm2_id': 6},
{'name': 'Thumb', 'm1_id': 7, 'm2_id': 8}, # Thumb is controlled by an independent ADC channel
]
# Defines the motion range: an offset of -30 degrees when open, and +90 degrees when closed.
# The servos will move in real-time within this [-30, 90] degree range based on the ADC value.
CLOSE_ANGLE_OFFSET = 90
OPEN_ANGLE_OFFSET = -30 # Using a negative value more intuitively represents the offset in the opening direction
# --- 2. Initialize Controllers ---
# !!! NOTE: Please ensure the serial ports for the hand controller and the ESP32 are correct !!!
SERVO_CONTROLLER_PORT = "/dev/ttyACM0" # Serial port for the robotic hand controller
ESP32_ADC_PORT = "/dev/ttyACM1" # Serial port for the ESP32 development board
try:
# Initialize the robotic hand controller
c = Scs0009PyController(
serial_port=SERVO_CONTROLLER_PORT,
baudrate=1000000,
timeout=0.5,
)
# Initialize the serial port for receiving data from ESP32
adc_port = serial.Serial(ESP32_ADC_PORT, 115200, timeout=1)
# Flush the input buffer to prevent reading old data
adc_port.flushInput()
except serial.SerialException as e:
print(f"Serial Error: {e}")
print("Please confirm your serial port settings are correct and the devices are connected.")
exit()
def map_value(x, in_min, in_max, out_min, out_max):
"""
Core function: Linearly maps a value from one range to another.
Used to map the ADC's 0-4095 range to the servo angle offset range
from OPEN_ANGLE_OFFSET to CLOSE_ANGLE_OFFSET.
"""
# Avoid division by zero
if in_max == in_min:
return out_min
return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min
def main():
"""Main function: Starts the motors and enters the real-time remote control loop."""
# Get all servo IDs to be controlled from the configuration
all_servo_ids = [id for finger in FINGERS for id in (finger['m1_id'], finger['m2_id'])]
print(f"Servo IDs to be controlled: {all_servo_ids}")
print(f"Connected to Hand Controller: {SERVO_CONTROLLER_PORT}")
print(f"Connected to ADC Data Source (ESP32): {ESP32_ADC_PORT}")
try:
# -- Start and initialize all motors --
enable_values = [1] * len(all_servo_ids)
c.sync_write_torque_enable(all_servo_ids, enable_values)
print("Torque enabled for all motors.")
speeds = [6] * len(all_servo_ids) # 6 is generally a fast speed
c.sync_write_goal_speed(all_servo_ids, speeds)
print("All motor speeds have been set.")
time.sleep(0.5)
# -- Enter real-time remote control main loop --
print("\nEntering real-time control mode... Press Ctrl+C to exit.")
while True:
# Read only when data is available in the serial buffer to avoid blocking
if adc_port.in_waiting > 0:
# 1. Read and parse the serial data from the ESP32
line = adc_port.readline().decode('utf-8').strip()
# Must ensure the data format is "value1,value2"
if ',' not in line:
continue # If the format is incorrect, skip this iteration
try:
val1_str, val2_str = line.split(',')
adc_val_fingers = int(val1_str) # The first value controls the main three fingers
adc_val_thumb = int(val2_str) # The second value controls the thumb
except ValueError:
print(f"Warning: Could not parse data '{line}', skipping.")
continue
# 2. Map the ADC values to angle offsets
# Map ADC range 0-4095 to angle range -30 (Open) to +90 (Close)
fingers_offset = map_value(adc_val_fingers, 0, 4095, OPEN_ANGLE_OFFSET, CLOSE_ANGLE_OFFSET)
thumb_offset = map_value(adc_val_thumb, 0, 4095, OPEN_ANGLE_OFFSET, CLOSE_ANGLE_OFFSET)
# (Optional) Print the current status for debugging
# print(f"ADC: {adc_val_fingers},{adc_val_thumb} -> Angle Offset: Fingers={fingers_offset:.1f}, Thumb={thumb_offset:.1f}")
# 3. Prepare the synchronous write command
all_ids = []
positions_deg = []
for finger in FINGERS:
all_ids.extend([finger['m1_id'], finger['m2_id']])
# Apply the corresponding angle offset based on the finger's name
if finger['name'] == 'Thumb':
current_offset = thumb_offset
else: # Index, Middle, Ring
current_offset = fingers_offset
# M1 and M2 move in opposite directions from the 0-degree center
positions_deg.append(0 + current_offset)
positions_deg.append(0 - current_offset)
# 4. Convert units and send the command to the hand
positions_rad = np.deg2rad(positions_deg).tolist()
c.sync_write_goal_position(all_ids, positions_rad)
except KeyboardInterrupt:
print("\nUser interrupt detected (Ctrl+C)...")
finally:
# Before the program ends, safely disable the torque on all motors
print("Disabling torque on all motors...")
if 'all_servo_ids' in locals() and all_servo_ids:
disable_values = [0] * len(all_servo_ids)
c.sync_write_torque_enable(all_servo_ids, disable_values)
print("Torque disabled. Program terminated.")
if __name__ == '__main__':
    main()
```
MediaPipe Hand Tracking Demo
Control the robotic hand in real time by mirroring the movements of your own hand, tracked using a simple webcam and Google's MediaPipe framework. In this demo, a lightweight HTML frontend captures your camera feed and runs MediaPipe's hand-landmark model directly in the browser, while a Python backend receives the computed finger angles and drives the servos. The two communicate over a WebSocket, allowing for smooth, low-latency control of the AmazingHand just by showing your hand to the camera.
To get this demo running, you will need two files: `index.html` for the browser interface and `backend.py` for the server-side processing.
Step 1: Save the Code Files
First, create the two required files in the same directory. Copy the HTML code into a file named `index.html`, and copy the Python code into a file named `backend.py`.

- `index.html`: Creates the web page that requests access to your webcam, runs the MediaPipe hand-landmark model on the live video in the browser, computes a bend angle for each finger, and streams those angles to the backend.
- `backend.py`: Launches a local WebSocket server. It receives the per-finger angles from the browser and translates them into motor commands for the AmazingHand.
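Based on the keys the backend reads, each WebSocket message from the browser is a small JSON object of per-finger bend angles in degrees; the values below are purely illustrative:

```json
{ "thumb": 42, "index": 15, "middle": 130, "ring": 128 }
```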
Step 2: Run the Backend Server
Open a terminal or command prompt, navigate to the directory where you saved the files, and run the following command to start the Python server:

```bash
python backend.py
```
You should see a message in the terminal indicating that the server has started and is waiting for a connection, for example: `WebSocket server started at ws://localhost:8765`.
Step 3: Launch the Frontend
Finally, navigate to the same directory in your file explorer and double-click the `index.html` file. It will open in your default web browser. The browser will likely ask for permission to use your webcam; please allow it.
Once the page loads, you will see your camera feed. The Python script will begin processing the video, and you can control the AmazingHand by moving your hand in front of the camera.
`backend.py`:
```python
import asyncio
import websockets
import serial
import json
import time
import numpy as np
from rustypot import Scs0009PyController
# --- 1. Configuration ---
# Dexterous hand finger configuration (Pinky finger is not controlled)
FINGERS = [
{'name': 'Index', 'm1_id': 1, 'm2_id': 2},
{'name': 'Middle', 'm1_id': 3, 'm2_id': 4},
{'name': 'Ring', 'm1_id': 5, 'm2_id': 6},
{'name': 'Thumb', 'm1_id': 7, 'm2_id': 8},
]
# Servo motion range definition
CLOSE_ANGLE_OFFSET = 90
OPEN_ANGLE_OFFSET = -30
# Gesture recognition angle input range (set based on actual values observed from the front-end)
# Approximately 10-20 degrees when extended, 140-160 degrees when flexed
# We are setting a relatively tolerant range
GESTURE_ANGLE_MIN = 20 # Corresponds to OPEN_ANGLE_OFFSET
GESTURE_ANGLE_MAX = 160 # Corresponds to CLOSE_ANGLE_OFFSET
# !!! NOTE: Please ensure the serial port for the hand controller is correct
# On Windows, it might be "COM3", "COM4", etc
# On Linux/Mac, it might be "/dev/ttyACM0", etc
SERVO_CONTROLLER_PORT = "/dev/ttyACM0"
# --- 2. Initialize Controller ---
try:
c = Scs0009PyController(
serial_port=SERVO_CONTROLLER_PORT,
baudrate=1000000,
timeout=0.5,
)
print(f"Successfully connected to the hand controller: {SERVO_CONTROLLER_PORT}")
except serial.SerialException as e:
print(f"Serial Error: {e}")
print("Please confirm your serial port settings are correct and the device is connected.")
exit()
def map_value(x, in_min, in_max, out_min, out_max):
"""
Core function: Linearly maps a value from one range to another.
It also clamps the input value to the source range to prevent exceeding the target range.
"""
# Clamp the input value to the source range
x = max(in_min, min(x, in_max))
# Avoid division by zero
if in_max == in_min:
return out_min
return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min
def setup_servos(controller, finger_config):
"""Starts and initializes all motors."""
all_servo_ids = [id for finger in finger_config for id in (finger['m1_id'], finger['m2_id'])]
print(f"Servo IDs to be controlled: {all_servo_ids}")
enable_values = [1] * len(all_servo_ids)
controller.sync_write_torque_enable(all_servo_ids, enable_values)
print("Torque enabled for all motors.")
speeds = [6] * len(all_servo_ids) # 6 is generally a fast speed
controller.sync_write_goal_speed(all_servo_ids, speeds)
print("All motor speeds have been set.")
time.sleep(0.5)
return all_servo_ids
async def handler(websocket, controller):
"""WebSocket server logic: receives data and controls the servos."""
print("Web front-end connected.")
try:
async for message in websocket:
try:
# 1. Parse the JSON data received from the front-end
data = json.loads(message)
# 2. Calculate the target angle offset for each finger
thumb_offset = map_value(data.get('thumb', 0), GESTURE_ANGLE_MIN, GESTURE_ANGLE_MAX, OPEN_ANGLE_OFFSET, CLOSE_ANGLE_OFFSET)
index_offset = map_value(data.get('index', 0), GESTURE_ANGLE_MIN, GESTURE_ANGLE_MAX, OPEN_ANGLE_OFFSET, CLOSE_ANGLE_OFFSET)
middle_offset = map_value(data.get('middle', 0), GESTURE_ANGLE_MIN, GESTURE_ANGLE_MAX, OPEN_ANGLE_OFFSET, CLOSE_ANGLE_OFFSET)
ring_offset = map_value(data.get('ring', 0), GESTURE_ANGLE_MIN, GESTURE_ANGLE_MAX, OPEN_ANGLE_OFFSET, CLOSE_ANGLE_OFFSET)
# 3. Prepare the synchronous write command
all_ids = []
positions_deg = []
offsets = {
'Thumb': thumb_offset,
'Index': index_offset,
'Middle': middle_offset,
'Ring': ring_offset,
}
for finger in FINGERS:
finger_name = finger['name']
current_offset = offsets.get(finger_name, 0)
all_ids.extend([finger['m1_id'], finger['m2_id']])
positions_deg.append(0 + current_offset)
positions_deg.append(0 - current_offset)
# 4. Convert units and send the command using the passed controller object
positions_rad = np.deg2rad(positions_deg).tolist()
controller.sync_write_goal_position(all_ids, positions_rad)
except json.JSONDecodeError:
print(f"Warning: Received non-JSON data: {message}")
except Exception as e:
print(f"Error processing message: {e}")
except websockets.exceptions.ConnectionClosed:
print("Web front-end disconnected.")
async def main():
"""Main function: Initializes servos and starts the WebSocket server."""
# 'c' is the controller instance initialized in the global scope
all_servo_ids = setup_servos(c, FINGERS)
# Use a lambda function to pass the controller instance 'c' to the handler
handler_with_controller = lambda ws, path: handler(ws, c)
try:
# Use the new handler_with_controller
async with websockets.serve(handler_with_controller, "0.0.0.0", 8765):
print("\nWebSocket server started at ws://localhost:8765")
print("Waiting for the web front-end to connect...")
await asyncio.Future() # Run forever
except KeyboardInterrupt:
print("\nUser interrupt detected (Ctrl+C)...")
finally:
# Before the program exits, safely disable the torque on all motors
print("Disabling torque on all motors...")
if all_servo_ids:
disable_values = [0] * len(all_servo_ids)
c.sync_write_torque_enable(all_servo_ids, disable_values)
print("Torque disabled. Program terminated.")
if __name__ == '__main__':
    asyncio.run(main())
```
`index.html`:
```html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Real-time Hand Tracking & Finger Angle Detection</title>
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Roboto:wght@300;400;700&display=swap" rel="stylesheet">
<style>
:root {
--bg-color: #1a1a2e;
--primary-color: #0f3460;
--accent-color: #3f72af;
--highlight-color: #e94560;
--text-color: #dbe2ef;
--success-color: #28a745;
--error-color: #dc3545;
}
body {
font-family: 'Roboto', sans-serif;
margin: 0;
padding: 2rem;
background-color: var(--bg-color);
color: var(--text-color);
display: flex;
flex-direction: column;
align-items: center;
min-height: 100vh;
box-sizing: border-box;
}
h1 {
font-size: 2.5rem;
font-weight: 700;
color: var(--text-color);
margin-bottom: 0.5rem;
text-shadow: 0 0 10px rgba(63, 114, 175, 0.5);
}
p {
color: #a9b3c9;
margin-bottom: 2rem;
font-weight: 300;
}
.main-container {
display: flex;
flex-wrap: wrap;
gap: 2rem;
justify-content: center;
align-items: flex-start;
width: 100%;
max-width: 1600px;
}
#video-container {
position: relative;
flex-grow: 1;
max-width: 1280px;
aspect-ratio: 16 / 9;
border-radius: 16px;
overflow: hidden;
box-shadow: 0 0 25px rgba(15, 52, 96, 0.8);
border: 2px solid var(--accent-color);
background-color: #000;
}
#webcam, #outputCanvas {
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
transform: scaleX(-1);
}
#angle-output {
background: rgba(15, 52, 96, 0.5);
backdrop-filter: blur(10px);
border: 1px solid var(--accent-color);
border-radius: 16px;
width: 320px;
padding: 1.5rem;
box-shadow: 0 8px 32px 0 rgba(0, 0, 0, 0.37);
flex-shrink: 0;
}
#angle-output h3 {
margin-top: 0;
color: var(--text-color);
border-bottom: 1px solid var(--accent-color);
padding-bottom: 1rem;
margin-bottom: 1.5rem;
text-align: center;
font-size: 1.25rem;
font-weight: 400;
}
.finger-angle {
margin-bottom: 1.25rem;
}
.finger-label {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 0.5rem;
font-size: 1rem;
}
.progress-bar-container {
width: 100%;
height: 12px;
background-color: rgba(0, 0, 0, 0.3);
border-radius: 6px;
overflow: hidden;
}
.progress-bar {
width: 0%;
height: 100%;
background: linear-gradient(90deg, var(--accent-color), var(--highlight-color));
border-radius: 6px;
transition: width 0.1s linear;
}
.loading {
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
color: white;
font-size: 1.5em;
text-align: center;
padding: 20px;
background-color: rgba(0,0,0,0.7);
border-radius: 10px;
}
#socket-status {
display: flex;
align-items: center;
gap: 8px;
position: fixed;
top: 20px;
right: 20px;
padding: 8px 16px;
border-radius: 20px;
color: white;
font-size: 0.9em;
font-weight: 500;
transition: all 0.3s ease;
backdrop-filter: blur(5px);
}
#socket-status::before {
content: '';
width: 10px;
height: 10px;
border-radius: 50%;
display: inline-block;
transition: background-color 0.3s ease;
}
#socket-status.connected {
background-color: rgba(40, 167, 69, 0.3);
border: 1px solid var(--success-color);
}
#socket-status.connected::before { background-color: var(--success-color); }
#socket-status.disconnected {
background-color: rgba(220, 53, 69, 0.3);
border: 1px solid var(--error-color);
}
#socket-status.disconnected::before { background-color: var(--error-color); }
</style>
</head>
<body>
<div id="socket-status" class="disconnected"><span>Disconnected</span></div>
<h1>Robotic Hand Visual Teleoperation</h1>
<p>Present your palm to the camera to begin real-time control.</p>
<div class="main-container">
<div id="video-container">
<video id="webcam" autoplay playsinline></video>
<canvas id="outputCanvas"></canvas>
<div id="loading-message" class="loading">Initializing...</div>
</div>
<div id="angle-output">
<h3>Finger Bending Status</h3>
<div class="finger-angle">
<div class="finger-label"><span>👍 Thumb</span><strong id="thumb-angle-value">--°</strong></div>
<div class="progress-bar-container"><div id="thumb-progress" class="progress-bar"></div></div>
</div>
<div class="finger-angle">
<div class="finger-label"><span>☝️ Index Finger</span><strong id="index-angle-value">--°</strong></div>
<div class="progress-bar-container"><div id="index-progress" class="progress-bar"></div></div>
</div>
<div class="finger-angle">
<div class="finger-label"><span>🖕 Middle Finger</span><strong id="middle-angle-value">--°</strong></div>
<div class="progress-bar-container"><div id="middle-progress" class="progress-bar"></div></div>
</div>
<div class="finger-angle">
<div class="finger-label"><span>💍 Ring Finger</span><strong id="ring-angle-value">--°</strong></div>
<div class="progress-bar-container"><div id="ring-progress" class="progress-bar"></div></div>
</div>
<div class="finger-angle">
<div class="finger-label"><span>- Pinky (Not Controlled)</span><strong id="pinky-angle-value">--°</strong></div>
<div class="progress-bar-container"><div id="pinky-progress" class="progress-bar"></div></div>
</div>
</div>
</div>
<script type="module">
import { HandLandmarker, FilesetResolver, DrawingUtils } from "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest";
const video = document.getElementById("webcam");
const canvasElement = document.getElementById("outputCanvas");
const canvasCtx = canvasElement.getContext("2d");
const loadingMessage = document.getElementById("loading-message");
const socketStatus = document.querySelector("#socket-status span");
let handLandmarker;
let lastVideoTime = -1;
let socket;
function setupWebSocket() {
socket = new WebSocket('ws://127.0.0.1:8765');
const statusIndicator = document.getElementById("socket-status");
socket.onopen = () => {
console.log("Successfully connected to the local WebSocket server.");
socketStatus.textContent = "Connected";
statusIndicator.className = "connected";
};
socket.onclose = () => {
console.log("Connection to the WebSocket server has been closed.");
socketStatus.textContent = "Disconnected";
statusIndicator.className = "disconnected";
};
socket.onerror = () => {
console.error("WebSocket connection error.");
socketStatus.textContent = "Connection Error";
statusIndicator.className = "disconnected";
};
}
async function createHandLandmarker() {
try {
const vision = await FilesetResolver.forVisionTasks("https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm");
handLandmarker = await HandLandmarker.createFromOptions(vision, {
baseOptions: {
modelAssetPath: `https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task`,
delegate: "GPU"
},
runningMode: "VIDEO",
numHands: 1
});
} catch (error) {
throw new Error("Failed to load hand gesture model. Please check your network connection.");
}
}
async function setupWebcam() {
if (!navigator.mediaDevices?.getUserMedia) throw new Error("Browser does not support camera access.");
try {
const stream = await navigator.mediaDevices.getUserMedia({ video: { width: 1280, height: 720 } });
video.srcObject = stream;
video.addEventListener("loadeddata", () => {
loadingMessage.style.display = 'none';
predictWebcam();
});
} catch (err) {
throw new Error("Failed to access camera, please check permissions.");
}
}
function predictWebcam() {
if (!handLandmarker) {
requestAnimationFrame(predictWebcam);
return;
}
const nowInMs = Date.now();
if (video.currentTime !== lastVideoTime) {
lastVideoTime = video.currentTime;
const results = handLandmarker.detectForVideo(video, nowInMs);
canvasElement.width = video.videoWidth;
canvasElement.height = video.videoHeight;
canvasCtx.clearRect(0, 0, canvasElement.width, canvasElement.height);
if (results.landmarks?.length) {
const drawingUtils = new DrawingUtils(canvasCtx);
for (const landmarks of results.landmarks) {
drawingUtils.drawConnectors(landmarks, HandLandmarker.HAND_CONNECTIONS, { color: "#00FF00", lineWidth: 5 });
drawingUtils.drawLandmarks(landmarks, { color: "#FF0000", radius: 5 });
calculateAndSendAngles(landmarks);
}
}
}
requestAnimationFrame(predictWebcam);
}
function calculateAndSendAngles(landmarks) {
const getAngle = (p1, p2, p3) => {
const vec1 = { x: p1.x - p2.x, y: p1.y - p2.y, z: p1.z - p2.z };
const vec2 = { x: p3.x - p2.x, y: p3.y - p2.y, z: p3.z - p2.z };
const dot = vec1.x * vec2.x + vec1.y * vec2.y + vec1.z * vec2.z;
const mag1 = Math.sqrt(vec1.x**2 + vec1.y**2 + vec1.z**2);
const mag2 = Math.sqrt(vec2.x**2 + vec2.y**2 + vec2.z**2);
const cosTheta = dot / (mag1 * mag2);
return Math.round(Math.acos(Math.min(1, Math.max(-1, cosTheta))) * 180 / Math.PI);
};
const angles = {
index: 180 - getAngle(landmarks[0], landmarks[5], landmarks[8]),
middle: 180 - getAngle(landmarks[0], landmarks[9], landmarks[12]),
ring: 180 - getAngle(landmarks[0], landmarks[13], landmarks[16]),
thumb: 180 - getAngle(landmarks[2], landmarks[3], landmarks[4]),
pinky: 180 - getAngle(landmarks[0], landmarks[17], landmarks[20])
};
const updateFingerUI = (fingerName, angle) => {
const percentage = Math.max(0, Math.min(100, (angle / 180) * 100));
document.getElementById(`${fingerName}-angle-value`).textContent = `${angle}°`;
document.getElementById(`${fingerName}-progress`).style.width = `${percentage}%`;
};
Object.entries(angles).forEach(([key, value]) => updateFingerUI(key, value));
if (socket?.readyState === WebSocket.OPEN) {
const { pinky, ...dataToSend } = angles; // Exclude pinky finger
socket.send(JSON.stringify(dataToSend));
}
}
async function main() {
try {
loadingMessage.innerText = "Loading model...";
await createHandLandmarker();
loadingMessage.innerText = "Starting camera...";
await setupWebcam();
loadingMessage.innerText = "Connecting to server...";
setupWebSocket();
} catch (error) {
console.error("Initialization failed:", error);
loadingMessage.classList.add("error");
loadingMessage.innerText = `Initialization failed:\n${error.message}`;
}
}
main();
</script>
</body>
</html>
```
Tech Support & Product Discussion
Thank you for choosing our products! We are here to provide support and ensure that your experience with our products is as smooth as possible. We offer several communication channels to cater to different preferences and needs.