Setup Guide
From CAN bus bring-up to first teleoperated episode. Covers piper_sdk, ROS launch, and Meta Quest 3 VR teleoperation.
CAN Bus & Host Setup
~15 min
The AgileX Piper communicates exclusively over CAN bus at 1 Mbps. You need a USB-to-CAN adapter (e.g., CANable, GS_USB) that exposes a SocketCAN interface on your Linux host.
Bring up the CAN interface
Connect the USB-to-CAN adapter, then run:
# Set bitrate and bring up the CAN interface
sudo ip link set can0 type can bitrate 1000000
sudo ip link set can0 up
# Verify the interface is active
ip link show can0
The piper_sdk repository includes a can_activate.sh helper. Run it as: bash can_activate.sh can0 1000000. This is the same script used by piper_ros.
If your adapter enumerates as can1, can2, etc., use ip link show to list all CAN interfaces and pass the correct name to C_PiperInterface.
OS support
Ubuntu 18.04, 20.04, and 22.04 are the officially tested platforms. Python 3.6+ is required.
Install piper_sdk
~20 min
The piper_sdk Python library handles CAN framing, joint state feedback, and gripper control. Install from PyPI (recommended) or from source.
# Option A: Install from PyPI (recommended)
pip3 install piper_sdk
# Option B: Install from source
git clone https://github.com/agilexrobotics/piper_sdk.git
cd piper_sdk
pip install -e .
# Verify installation
python3 -c "import piper_sdk; print('piper_sdk OK')"
The SDK automatically installs python-can as a dependency for CAN bus communication.
Connect and enable
The C_PiperInterface class is the main entry point. After connecting, the arm must be enabled before it accepts motion commands. EnableArm(7) enables all six joints plus the gripper.
from piper_sdk import C_PiperInterface
# Initialize with the CAN interface name (default: "can0")
piper = C_PiperInterface("can0")
# Connect to the arm
piper.ConnectPort()
# Enable all joints (required before motion commands)
piper.EnableArm(7)
print("Piper connected and enabled.")
Example scripts live in piper_sdk/demo/V2/. Start with demo_joint_ctrl.py to verify basic motion before building your own control loop.
First Motion
~20 min
With the arm connected and enabled, read joint state and send your first position command.
Read joint state
import time
# Read joint angles in a polling loop
for _ in range(10):
joint_state = piper.GetArmJointMsgs()
print(joint_state)
time.sleep(0.1)
# Read end-effector pose
end_pose = piper.GetArmEndPoseMsgs()
print(end_pose)
Send a joint position command
# Select the control mode before sending joint targets:
# ctrl_mode 0x01 = CAN command mode, move_mode 0x01 = joint move,
# third argument = speed percentage, fourth = MIT mode flag
piper.MotionCtrl_2(0x01, 0x01, 50, 0x00)
# JointCtrl takes six integers (joint1..joint6) in 0.001-degree units;
# move joint 3 to 90 degrees, all other joints to 0
factor = 1000  # 0.001-degree units per degree
piper.JointCtrl(0, 0, 90 * factor, 0, 0, 0)
time.sleep(2)  # wait for motion to complete
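Joint units are the easiest thing to get wrong here. Assuming the 0.001-degree integer convention used in the piper_sdk demos, a small helper keeps joint commands readable:

```python
def deg_to_units(deg: float) -> int:
    """Convert degrees to the 0.001-degree integer units used by joint commands."""
    return round(deg * 1000)

print(deg_to_units(90))     # 90000
print(deg_to_units(-45.5))  # -45500
```

Verify the conversion factor against your SDK version before relying on it.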
Gripper control
# GripperCtrl args: opening (0.001 mm units), effort, gripper_code (0x01 = enable), set_zero
# Open gripper (~70 mm stroke; check your gripper's max travel)
piper.GripperCtrl(70 * 1000, 1000, 0x01, 0)
# Close gripper
piper.GripperCtrl(0, 1000, 0x01, 0)
# Read gripper state
gripper_state = piper.GetArmGripperMsgs()
print(gripper_state)
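It is worth clamping gripper commands so a buggy value can never exceed the stroke. A sketch, assuming a 70 mm stroke and the 0.001 mm unit convention (check both against your gripper):

```python
GRIPPER_MAX = 70 * 1000  # assumed 70 mm stroke, in 0.001 mm units

def gripper_units(opening_mm: float) -> int:
    """Clamp a requested opening (in mm) and convert to gripper command units."""
    units = round(opening_mm * 1000)
    return max(0, min(GRIPPER_MAX, units))

print(gripper_units(35))   # 35000
print(gripper_units(120))  # clamped to 70000
```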
Call piper.DisableArm(7) when finished. An enabled arm responds immediately to any command, including erroneous ones from bugs or dropped packets.
Dual-arm (master-slave) setup
For bimanual configurations, connect two Pipers on separate CAN interfaces:
piper_left = C_PiperInterface("can0")
piper_right = C_PiperInterface("can1")
piper_left.ConnectPort()
piper_right.ConnectPort()
piper_left.EnableArm(7)
piper_right.EnableArm(7)
print("Both arms connected.")
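For bimanual work it helps to fan identical lifecycle calls out to both arms. A minimal hypothetical wrapper (not part of piper_sdk):

```python
class DualPiper:
    """Hypothetical convenience wrapper: fans lifecycle calls out to every arm."""

    def __init__(self, *arms):
        self.arms = arms

    def connect_all(self):
        for arm in self.arms:
            arm.ConnectPort()

    def enable_all(self):
        for arm in self.arms:
            arm.EnableArm(7)

    def disable_all(self):
        for arm in self.arms:
            arm.DisableArm(7)
```

Usage: pair = DualPiper(piper_left, piper_right); pair.connect_all(); pair.enable_all().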
ROS / MoveIt Integration
~60 min
The piper_ros package provides a full ROS Noetic driver with MoveIt motion planning and Gazebo simulation. It wraps piper_sdk internally and exposes standard ROS interfaces.
Install dependencies
# Install required ROS packages
sudo apt-get install -y \
ros-noetic-moveit \
ros-noetic-ruckig \
ros-noetic-ompl
# Install Python CAN dependency
pip3 install python-can piper_sdk
Launch
# Step 1: Activate CAN interface
bash can_activate.sh can0 1000000
# Step 2: Launch the Piper control node
roslaunch piper start_single_piper.launch
# For dual-arm:
roslaunch piper start_double_piper.launch
MoveIt planning
# Launch MoveIt with RViz for interactive planning
roslaunch piper_moveit_config demo.launch
# Gazebo simulation (no physical arm required)
roslaunch piper piper_gazebo.launch
Arms on firmware older than S-V1.6-3 require the legacy piper_description_old.urdf file. Newer firmware uses the standard piper_description.urdf. Check the firmware version label on the base of the arm before loading ROS models.
See the Specs page for the full ROS topics and services table.
Meta Quest 3 VR Teleoperation
~90 min
The Piper can be controlled in real time using a Meta Quest 3 headset. The architecture uses UDP over your local network: the Quest runs a Unity app that streams hand pose data, and a Python server on the robot PC translates that into Piper SDK commands.
Architecture
The Unity side (VRHandPoseSender.cs, VRGripperController.cs, VRTeleoperationManager.cs) and the UDP layer are fully reusable from xArm setups — only the robot controller module needs to be swapped out.
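The controller swap can be sketched as a thin class delegating to piper_sdk. The method names follow the setup steps; the unit conventions and the GripperCtrl/EndPoseCtrl argument layouts are assumptions based on the SDK demos, and the interface is injected so the class can be exercised without hardware:

```python
class PiperController:
    """Sketch of a drop-in replacement for XArmController.

    `iface` should behave like piper_sdk.C_PiperInterface.
    """

    def __init__(self, iface):
        self.iface = iface
        self.connected = False

    def connect(self):
        self.iface.ConnectPort()
        self.iface.EnableArm(7)
        self.connected = True

    def set_pose(self, x, y, z, roll, pitch, yaw):
        # EndPoseCtrl expects 0.001 mm / 0.001 degree integer units (assumed)
        self.iface.EndPoseCtrl(
            round(x * 1000), round(y * 1000), round(z * 1000),
            round(roll * 1000), round(pitch * 1000), round(yaw * 1000),
        )

    def set_gripper(self, value):
        # value in 0.001 mm units; 0x01 = enable (assumed convention)
        self.iface.GripperCtrl(value, 1000, 0x01, 0)

    def emergency_stop(self):
        self.iface.DisableArm(7)
        self.connected = False
```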
Setup steps
1. Start the CAN interface and enable the arm.
   sudo ip link set can0 type can bitrate 1000000
   sudo ip link set can0 up
2. Create a PiperController wrapping C_PiperInterface. Replace the XArmController class in your existing teleoperation stack with a new piper_controller.py. Implement connect(), set_pose(x, y, z, roll, pitch, yaw), set_gripper(value), and emergency_stop() using piper_sdk calls.
3. Launch the Python UDP server on the robot PC. The server listens on UDP ports 8888/8889 and forwards received hand pose packets to the Piper.
   python3 teleoperation_main.py --robot-type piper
4. Launch the Unity app on the Quest 3 and connect to the PC's IP address. Adjust positionOffset, rotationOffset, and scaleFactor in Unity to match the Piper's workspace. These parameters differ from xArm due to Piper's smaller reach envelope.
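On the receive side, the server's job is just to decode each UDP datagram into a target pose. The actual wire format is defined by the Unity sender; the sketch below assumes a hypothetical layout of seven little-endian floats (x, y, z, roll, pitch, yaw, gripper):

```python
import socket
import struct

POSE_FMT = "<7f"  # hypothetical layout: x, y, z, roll, pitch, yaw, gripper

def parse_pose_packet(data: bytes) -> dict:
    """Decode one datagram into a pose tuple and a gripper value."""
    x, y, z, roll, pitch, yaw, grip = struct.unpack(POSE_FMT, data)
    return {"pose": (x, y, z, roll, pitch, yaw), "gripper": grip}

def serve(port: int = 8888) -> None:
    """Listen for hand pose packets and hand them to the robot controller."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        target = parse_pose_packet(data)
        # forward `target` to the Piper controller here
        print(target)
```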
Reduce scaleFactor in Unity to prevent the arm from hitting joint limits during teleoperation. Start with a conservative scale and increase gradually while monitoring joint angles.
Data Collection
Ongoing
Once teleoperation is working, use the SVRC platform to record, label, and export manipulation demonstrations.
- Record teleoperated episodes via the Python UDP server or directly through rosbag recording via piper_ros
- Export in RLDS or LeRobot format for downstream policy training
- Use the SVRC Platform to manage datasets, run quality checks, and train ACT or Diffusion Policy models
Poll piper.GetArmJointMsgs() and piper.GetArmEndPoseMsgs() at ~50 Hz in a background thread to capture synchronized joint and end-effector state during teleoperation.
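Such a polling loop can run on a daemon thread so it never blocks teleoperation. A sketch with the read function injected (pass e.g. piper.GetArmJointMsgs):

```python
import threading
import time

def start_state_logger(read_fn, log, hz=50.0):
    """Poll read_fn at ~hz, appending (timestamp, state) tuples to log.

    Returns an Event; set it to stop the background thread.
    """
    stop = threading.Event()
    period = 1.0 / hz

    def loop():
        while not stop.is_set():
            log.append((time.time(), read_fn()))
            stop.wait(period)  # sleep, but wake immediately on stop

    threading.Thread(target=loop, daemon=True).start()
    return stop
```

Usage: samples = []; stop = start_state_logger(piper.GetArmJointMsgs, samples); ...; stop.set().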