AnyTeleop: A General Vision-Based Dexterous Robot Arm-Hand Teleoperation System

RSS 2023

UC San Diego;   NVIDIA
Work done during the first author's internship at NVIDIA

Abstract

Vision-based teleoperation offers the possibility to endow robots with human-level intelligence to physically interact with the environment, while requiring only low-cost camera sensors. However, current vision-based teleoperation systems are designed and engineered towards a particular robot model and deployment environment, which scales poorly as the pool of robot models expands and the variety of operating environments increases. In this paper, we propose AnyTeleop, a unified and general teleoperation system that supports multiple different arms, hands, realities, and camera configurations within a single system. Although designed to provide great flexibility in the choice of simulators and real hardware, our system still achieves strong performance. In real-world experiments, AnyTeleop achieves a higher success rate than a previous system that was designed for that specific robot hardware, using the same robot. For teleoperation in simulation, AnyTeleop leads to better imitation learning performance compared with a previous system designed specifically for that simulator.

Four Features of AnyTeleop



Remote Teleop



Collaborative Teleop



Labmates Test (User Study)



More Teleop Tasks



System Design

Hand Controls Hand, Arm Controls Arm

The human finger pose controls the dexterous hand while the human wrist pose controls the robot arm.
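To make this decomposition concrete, below is a minimal, self-contained Python sketch of the idea. The function names, the toy linear finger retargeting, and the damped least-squares IK step are illustrative assumptions for exposition, not the actual AnyTeleop implementation (real systems typically solve an optimization over hand joint angles for retargeting and use a dedicated motion controller for the arm).

import numpy as np

def retarget_fingers(human_fingertips, scale=1.1):
    # Toy retargeting: scale human fingertip positions (expressed in the wrist
    # frame) into fingertip targets for the robot hand. A real retargeter would
    # instead optimize the hand joint angles to match these targets.
    return scale * np.asarray(human_fingertips)

def arm_ik_step(jacobian, wrist_error, damping=1e-2):
    # One damped least-squares IK step that moves the arm end effector toward
    # the detected human wrist pose. `jacobian` is the 6 x N arm Jacobian and
    # `wrist_error` the 6-D pose error (position + orientation).
    J = np.asarray(jacobian)
    e = np.asarray(wrist_error)
    return J.T @ np.linalg.solve(J @ J.T + damping * np.eye(6), e)

# Toy usage with random numbers standing in for the hand-pose detector output.
fingertip_targets = retarget_fingers(np.random.rand(5, 3))    # -> hand command
dq_arm = arm_ik_step(np.random.rand(6, 7), np.random.rand(6)) # -> arm command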

General Robot and Reality Support

AnyTeleop can support teleoperation with diverse robot arms and hands, either in virtual environments, e.g. the SAPIEN and IsaacGym simulators, or in the real world.
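One way to read this design is that the vision, retargeting, and control modules are decoupled from the specific robot and reality behind them. The configuration-style sketch below illustrates that idea; the class, field names, and backend strings are hypothetical assumptions for exposition, not AnyTeleop's actual interface.

from dataclasses import dataclass

@dataclass
class TeleopConfig:
    arm: str      # e.g. "xarm6" or "franka"
    hand: str     # e.g. "allegro" or "shadow"
    backend: str  # "sapien", "isaacgym", or "real"

def make_robot(cfg: TeleopConfig) -> str:
    # Dispatch to a simulator- or hardware-specific wrapper so that the same
    # retargeting and control code runs unchanged across realities.
    if cfg.backend == "sapien":
        return f"SAPIEN-simulated {cfg.arm} + {cfg.hand}"
    if cfg.backend == "isaacgym":
        return f"IsaacGym-simulated {cfg.arm} + {cfg.hand}"
    return f"real {cfg.arm} + {cfg.hand} hardware driver"

print(make_robot(TeleopConfig(arm="xarm6", hand="allegro", backend="sapien")))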

Application: Imitation Learning with Teleoperation Data

(i) BC Policy with 3K Transitions

(ii) BC Policy with 5K Transitions

(iii) BC Policy with 10K Transitions
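The panels above compare behavior cloning (BC) policies trained on increasing amounts of teleoperation data (3K, 5K, and 10K transitions). The snippet below is a minimal BC training sketch on such transition data; the observation/action dimensions, network size, and hyperparameters are placeholder assumptions, not the paper's exact setup.

import torch
import torch.nn as nn

# Stand-in for teleoperated demonstration data: (observation, action) pairs,
# e.g. 3K transitions collected with the teleoperation system.
num_transitions, obs_dim, act_dim = 3000, 64, 22
observations = torch.randn(num_transitions, obs_dim)
actions = torch.randn(num_transitions, act_dim)

policy = nn.Sequential(
    nn.Linear(obs_dim, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, act_dim),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(observations, actions),
    batch_size=256, shuffle=True)

# Behavior cloning: regress the demonstrated action from the observation.
for epoch in range(10):
    for obs, act in loader:
        loss = nn.functional.mse_loss(policy(obs), act)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()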

Video

BibTeX

@inproceedings{qin2023anyteleop,
  title     = {AnyTeleop: A General Vision-Based Dexterous Robot Arm-Hand Teleoperation System},
  author    = {Qin, Yuzhe and Yang, Wei and Huang, Binghao and Van Wyk, Karl and Su, Hao and Wang, Xiaolong and Chao, Yu-Wei and Fox, Dieter},
  booktitle = {Robotics: Science and Systems},
  year      = {2023}
}