VR-based Controller for Humanoid Robot

• Telepresence system that allows a user to be embodied in the robot and, in future work, to teach the robot non-verbal social behaviors

• The system has its own rendering engine, developed in C++ with OpenGL and OpenVR, that processes camera images for display on the VR headset

• The system computes the robot’s kinematics from VR controller/tracker poses

• The system runs on Linux and is implemented on ROS

The VR teleoperation system consisted of three main modules:

  1. a VR rendering application;
  2. a humanoid robot; and
  3. a kinematics solver.

The VR-based teleoperation interface consisted of a head-mounted display (HMD), hand-held controllers, and a microphone through which a user teleoperated the robot. The VR rendering module generated the graphical output displayed on the HMD, providing visual feedback from the robot’s perspective and creating an immersive teleoperation experience. The kinematics solver used the sensed HMD and hand-held controller poses to estimate the user’s joint angles so that they could be mapped onto the humanoid’s joints. The system also simultaneously recorded the robot’s sensory data, primarily the camera feed, and the joint angles generated by the teleoperator, producing a dataset for use in future work.
